WorldWideScience

Sample records for rigorous scientific testing

  1. Scientific rigor through videogames.

    Science.gov (United States)

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Standards for Radiation Effects Testing: Ensuring Scientific Rigor in the Face of Budget Realities and Modern Device Challenges

    Science.gov (United States)

    Lauenstein, J. M.

    2015-01-01

    An overview is presented of the space radiation environment and its effects on electrical, electronic, and electromechanical parts. Relevant test standards and guidelines are listed. Test standards and guidelines are necessary to ensure best practices, minimize and bound systematic and random errors, and to ensure comparable results from different testers and vendors. Test standards are by their nature static but exist in a dynamic environment of advancing technology and radiation effects research. New technologies, failure mechanisms, and advancement in our understanding of known failure mechanisms drive the revision or development of test standards. Changes to standards must be weighed against their impact on cost and existing part qualifications. There must be consensus on new best practices. The complexity of some new technologies exceeds the scope of existing test standards and may require development of a guideline specific to the technology. Examples are given to illuminate the value and limitations of key radiation test standards as well as the challenges in keeping these standards up to date.

  3. Increased scientific rigor will improve reliability of research and effectiveness of management

    Science.gov (United States)

    Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.

    2018-01-01

    Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and

  4. Imagination and rigor essays on Eduardo R Caianiello's scientific heritage

    CERN Document Server

    Termini, Settimo

    2006-01-01

    The aim of this Volume of scientific essays is twofold. On one side, by recalling the scientific figure of Eduardo R. Caianiello, it aims at focusing on his outstanding contributions - from theoretical physics to cybernetics - which after so many years still open innovative paths to be fruitfully followed. It must be stressed that his interdisciplinary methodology can still be of great help in tackling and solving present-day complex problems. On the other side, it aims at pinpointing - with the help of the scientists contributing to the Volume - some crucial problems in present-day research in the fields of interest of Eduardo Caianiello, problems that are still among the main lines of investigation of some of the Institutes he founded (Istituto di Cibernetica del CNR, IIAS, etc.).

  5. Karl Pearson and eugenics: personal opinions and scientific rigor.

    Science.gov (United States)

    Delzell, Darcie A P; Poliak, Cathy D

    2013-09-01

    The influence of personal opinions and biases on scientific conclusions is a threat to the advancement of knowledge. Expertise and experience do not render one immune to this temptation. In this work, one of the founding fathers of statistics, Karl Pearson, is used as an illustration of how even the most talented among us can produce misleading results when inferences are made without caution or reference to potential bias and other analysis limitations. A study performed by Pearson on British Jewish schoolchildren is examined in light of ethical and professional statistical practice. The methodology used and inferences made by Pearson and his coauthor are sometimes questionable and offer insight into how Pearson's support of eugenics and his own British nationalism could have potentially influenced his often careless and far-fetched inferences. A short background into Pearson's work and beliefs is provided, along with an in-depth examination of the authors' overall experimental design and statistical practices. In addition, portions of the study regarding intelligence and tuberculosis are discussed in more detail, along with historical reactions to their work.

  6. Revisiting the scientific method to improve rigor and reproducibility of immunohistochemistry in reproductive science.

    Science.gov (United States)

    Manuel, Sharrón L; Johnson, Brian W; Frevert, Charles W; Duncan, Francesca E

    2018-04-21

    Immunohistochemistry (IHC) is a robust scientific tool whereby cellular components are visualized within a tissue, and this method has been and continues to be a mainstay for many reproductive biologists. IHC is highly informative if performed and interpreted correctly, but studies have shown that the general use and reporting of appropriate controls in IHC experiments is low. This omission of the scientific method can result in data that lacks rigor and reproducibility. In this editorial, we highlight key concepts in IHC controls and describe an opportunity for our field to partner with the Histochemical Society to adopt their IHC guidelines broadly as researchers, authors, ad hoc reviewers, editorial board members, and editors-in-chief. Such cross-professional society interactions will ensure that we produce the highest quality data as new technologies emerge that still rely upon the foundations of classic histological and immunohistochemical principles.

  7. Hypothesis testing of scientific Monte Carlo calculations

    Science.gov (United States)

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
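
    The paper advocates statistical hypothesis tests as automated checks on stochastic codes; its concrete recipes are not reproduced in this record, so the sketch below is an illustrative (assumed) Python example of the idea rather than the authors' procedure: a two-sided z-test that compares a Monte Carlo estimate of pi against the known value and fails only at a very small significance level, so it rarely fails by chance yet still catches genuine bugs.

        import math
        import random

        def mc_estimate_pi(n, rng):
            """Return (estimate, standard error) for pi via the unit-circle hit-or-miss method."""
            hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))
            p_hat = hits / n
            return 4.0 * p_hat, 4.0 * math.sqrt(p_hat * (1.0 - p_hat) / n)

        def test_mc_pi(n=200_000, alpha=1e-3, seed=42):
            est, se = mc_estimate_pi(n, random.Random(seed))
            z = (est - math.pi) / se                      # standardized deviation from the exact value
            p_value = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided p-value under the null hypothesis
            # Reject the null ("the estimator is correct") only for very small p-values.
            assert p_value > alpha, f"estimate {est:.5f} deviates from pi (z = {z:.2f})"

        if __name__ == "__main__":
            test_mc_pi()
            print("Monte Carlo hypothesis test passed.")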

  8. Bringing scientific rigor to community-developed programs in Hong Kong

    Directory of Open Access Journals (Sweden)

    Fabrizio Cecilia S

    2012-12-01

    Background: This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). Methods: The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Results: Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long term process. Conclusions: The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.

  9. Bringing scientific rigor to community-developed programs in Hong Kong.

    Science.gov (United States)

    Fabrizio, Cecilia S; Hirschmann, Malia R; Lam, Tai Hing; Cheung, Teresa; Pang, Irene; Chan, Sophia; Stewart, Sunita M

    2012-12-31

    This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long term process. The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.

  10. A rigorous test for a new conceptual model for collisions

    International Nuclear Information System (INIS)

    Peixoto, E.M.A.; Mu-Tao, L.

    1979-01-01

    A rigorous theoretical foundation for the previously proposed model is formulated and applied to electron scattering by H2 in the gas phase. A rigorous treatment of the interaction potential between the incident electron and the hydrogen molecule is carried out to calculate differential cross sections for 1 keV electrons, using Glauber's approximation and Wang's molecular wave function for the ground electronic state of H2. Moreover, it is shown for the first time that, when adequately done, the omission of two-center terms does not adversely influence the results of molecular calculations. It is shown that the new model is far superior to the Independent Atom Model (or Independent Particle Model). The accuracy and simplicity of the new model suggest that it may be fruitfully applied to the description of other collision phenomena (e.g., in molecular beam experiments and nuclear physics). A new technique is presented for calculations involving two-center integrals within the framework of the Glauber approximation for scattering. (Author) [pt]

  11. Authorization of Animal Experiments Is Based on Confidence Rather than Evidence of Scientific Rigor

    Science.gov (United States)

    Nathues, Christina; Würbel, Hanno

    2016-01-01

    animal experiments are lacking important information about experimental conduct that determines the scientific validity of the findings, which may be critical for the weight attributed to the benefit of the research in the harm–benefit analysis. Similar to manuscripts getting accepted for publication despite poor reporting of measures against bias, applications for animal experiments may often be approved based on implicit confidence rather than explicit evidence of scientific rigor. Our findings shed serious doubt on the current authorization procedure for animal experiments, as well as the peer-review process for scientific publications, which in the long run may undermine the credibility of research. Developing existing authorization procedures that are already in place in many countries towards a preregistration system for animal research is one promising way to reform the system. This would not only benefit the scientific validity of findings from animal experiments but also help to avoid unnecessary harm to animals for inconclusive research. PMID:27911892

  12. Rigorously testing multialternative decision field theory against random utility models.

    Science.gov (United States)

    Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg

    2014-06-01

    Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly choose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions. PsycINFO Database Record (c) 2014 APA, all rights reserved.
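
    For context on the random utility benchmarks named above, the multinomial logit model assigns choice probabilities of the standard softmax form shown below (a textbook expression; the exact utility specification used in the study is not reproduced here):

        % probability of choosing option i from choice set C, given deterministic utilities v_j
        P(i \mid C) = \frac{\exp(v_i)}{\sum_{j \in C} \exp(v_j)}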

  13. Rigor, Reliability, and Scientific Relevance: Citizen Science Lessons from COASST (Invited)

    Science.gov (United States)

    Parrish, J. K.

    2013-12-01

    Citizen science promises fine grain, broad extent data collected over decadal time scales, with co-benefits including increased scientific literacy and civic engagement. But does it only deliver non-standardized, unverifiable data collected episodically by individuals with little-to-no training? How do you know which projects to trust? What are the attributes of a scientifically sound citizen science project? The Coastal Observation and Seabird Survey Team (COASST) is a 15 year old citizen science project currently involving ~800 participants from northern California north to Kotzebue, Alaska and west to the Commander Islands, Russia. After a single 5-hour training delivered in-community by an expert, volunteers have the knowledge and skill sets to accurately survey a coastal site for beached bird carcasses, which they will be able to identify to species correctly ~85% of the time. Data are collected monthly, and some volunteers remain with the program for years, contributing hundreds, even thousands, of survey hours. COASST trainings, data collection materials, and data entry web portal all reinforce 'evidence first, deduction second,' a maxim that allows volunteers to learn, and gives on-staff experts the ability to independently verify all birds found. COASST data go directly into science, as part of studies as diverse as fishery entanglement, historic native uses of seabirds as food sources, and the impacts of sudden shifts in upwelling; as well as into resource management, as part of decisions on fishing regulations, waterfowl hunting limits, and ESA-listed species management. Like professional science, COASST features a specific sampling design linked to questions of interest, verifiable data, statistical analysis, and peer-reviewed publication. In addition, COASST features before-and-after testing of volunteer knowledge, independent verification of all deductive data, and recruitment and retention strategies linked to geographic community norms. As a result

  14. EPIC: A Testbed for Scientifically Rigorous Cyber-Physical Security Experimentation

    OpenAIRE

    Siaterlis Christos; Genge Bela; Hohenadel Marc

    2013-01-01

    Recent malware, like Stuxnet and Flame, constitute a major threat to Networked Critical Infrastructures (NCIs), e.g., power plants. They revealed several vulnerabilities in today's NCIs, but most importantly they highlighted the lack of an efficient scientific approach to conduct experiments that measure the impact of cyber threats on both the physical and the cyber parts of NCIs. In this paper we present EPIC, a novel cyber-physical testbed and a modern scientific instrument that can pr...

  15. The Researchers' View of Scientific Rigor-Survey on the Conduct and Reporting of In Vivo Research.

    Science.gov (United States)

    Reichlin, Thomas S; Vogt, Lucile; Würbel, Hanno

    2016-01-01

    Reproducibility in animal research is alarmingly low, and a lack of scientific rigor has been proposed as a major cause. Systematic reviews found low reporting rates of measures against risks of bias (e.g., randomization, blinding), and a correlation between low reporting rates and overstated treatment effects. Reporting rates of measures against bias are thus used as a proxy measure for scientific rigor, and reporting guidelines (e.g., ARRIVE) have become a major weapon in the fight against risks of bias in animal research. Surprisingly, animal scientists have never been asked about their use of measures against risks of bias and how they report these in publications. Whether poor reporting reflects poor use of such measures, and whether reporting guidelines may effectively reduce risks of bias has therefore remained elusive. To address these questions, we asked in vivo researchers about their use and reporting of measures against risks of bias and examined how self-reports relate to reporting rates obtained through systematic reviews. An online survey was sent out to all registered in vivo researchers in Switzerland (N = 1891) and was complemented by personal interviews with five representative in vivo researchers to facilitate interpretation of the survey results. Return rate was 28% (N = 530), of which 302 participants (16%) returned fully completed questionnaires that were used for further analysis. According to the researchers' self-report, they use measures against risks of bias to a much greater extent than suggested by reporting rates obtained through systematic reviews. However, the researchers' self-reports are likely biased to some extent. Thus, although they claimed to be reporting measures against risks of bias at much lower rates than they claimed to be using these measures, the self-reported reporting rates were considerably higher than reporting rates found by systematic reviews. Furthermore, participants performed rather poorly when asked to

  16. Integrating entertainment and scientific rigor to facilitate a co-creation of knowledge

    Science.gov (United States)

    Hezel, Bernd; Broschkowski, Ephraim; Kropp, Jürgen

    2013-04-01

    The advancing research on the changing climate system and on its impacts has uncovered the magnitude of the expectable societal implications. It therefore created substantial awareness of the problem with stakeholders and the general public. But despite this awareness, unsustainable trends have continued untamed. For a transition towards a sustainable world it is, apparently, not enough to disseminate the "scientific truth" and wait for the people to "understand". In order to remedy this problem it is rather necessary to develop new entertaining formats to communicate the complex topic in an integrated and comprehensive way. Beyond that, it could be helpful to acknowledge that science can only generate part of the knowledge that is necessary for the transformation. The nature of the problem and its deep societal implications call for a co-creation of knowledge by science and society in order to enable change. In this spirit the RAMSES project (Reconciling Adaptation, Mitigation and Sustainable Development for Cities) follows a dialogic communication approach allowing for a co-formulation of research questions by stakeholders. A web-based audio-visual guidance application presents embedded scientific information in an entertaining and intuitive way on the basis of a "complexity on demand" approach. It aims at enabling decision making despite uncertainty and it entails a reframing of the project's research according to applied and local knowledge.

  17. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors

    Directory of Open Access Journals (Sweden)

    Spiros Pagiatakis

    2009-10-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models, are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at −40 °C, −20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.

  18. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors.

    Science.gov (United States)

    El-Diasty, Mohammed; Pagiatakis, Spiros

    2009-01-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at -40 °C, -20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.
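
    As background on the modelling approach named in the abstract, the sketch below (an assumed illustration, not the authors' code) fits a first-order autoregressive model to a zero-mean static sensor record and converts the AR coefficient to a Gauss-Markov correlation time via phi = exp(-dt/tau); the sampling rate and synthetic data are made up for the example.

        import numpy as np

        def gm_correlation_time(x, dt):
            """Estimate the correlation time tau of a 1st-order Gauss-Markov (AR(1)) process.

            Fits x[k] = phi * x[k-1] + w[k] by least squares and uses tau = -dt / ln(phi).
            """
            x = x - x.mean()                  # remove the deterministic bias first
            phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
            return -dt / np.log(phi)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            dt, tau_true = 0.01, 50.0         # 100 Hz sampling, 50 s correlation time (assumed)
            phi = np.exp(-dt / tau_true)
            w = rng.normal(0.0, 1e-3, size=200_000)
            x = np.zeros_like(w)
            for k in range(1, w.size):        # simulate a synthetic Gauss-Markov error sequence
                x[k] = phi * x[k - 1] + w[k]
            print(f"estimated correlation time: {gm_correlation_time(x, dt):.1f} s")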

  19. Testing Scientific Software: A Systematic Literature Review

    Science.gov (United States)

    Kanewala, Upulee; Bieman, James M.

    2014-01-01

    Context: Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. Objective: This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. Method: We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. Results: We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Conclusions: Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques. PMID:25125798

  20. Testing Scientific Software: A Systematic Literature Review.

    Science.gov (United States)

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques.
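
    One widely cited answer to the oracle problem mentioned above is metamorphic testing: instead of comparing output against a known answer, the test checks a relation that must hold between two runs. The sketch below is an illustrative (assumed) Python example for a quadrature routine, not code from the review.

        import math

        def trapezoid(f, a, b, n=10_000):
            """Composite trapezoidal rule on [a, b] with n subintervals."""
            h = (b - a) / n
            return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

        def test_interval_additivity():
            f = lambda x: math.exp(-x * x)    # integrand with no elementary antiderivative (no exact oracle)
            whole = trapezoid(f, 0.0, 2.0)
            split = trapezoid(f, 0.0, 0.7) + trapezoid(f, 0.7, 2.0)
            # Metamorphic relation: splitting the interval must not change the result
            # beyond discretization error.
            assert math.isclose(whole, split, rel_tol=1e-6), (whole, split)

        if __name__ == "__main__":
            test_interval_additivity()
            print("metamorphic relation holds")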

  1. What do we mean? On the importance of not abandoning scientific rigor when talking about science education.

    Science.gov (United States)

    Klahr, David

    2013-08-20

    Although the "science of science communication" usually refers to the flow of scientific knowledge from scientists to the public, scientists direct most of their communications not to the public, but instead to other scientists in their field. This paper presents a case study on this understudied type of communication: within a discipline, among its practitioners. I argue that many of the contentious disagreements that exist today in the field in which I conduct my research--early science education--derive from a lack of operational definitions, such that when competing claims are made for the efficacy of one type of science instruction vs. another, the arguments are hopelessly disjointed. The aim of the paper is not to resolve the current claims and counterclaims about the most effective pedagogies in science education, but rather to note that the assessment of one approach vs. the other is all too often defended on the basis of strongly held beliefs, rather than on the results of replicable experiments, designed around operational definitions of the teaching methods being investigated. A detailed example of operational definitions from my own research on elementary school science instruction is provided. In addition, the paper addresses the issue of how casual use of labels-both within the discipline and when communicating with the public-may inadvertently "undo" the benefits of operational definitions.

  2. Future Low Temperature Plasma Science and Technology: Attacking Major Societal Problems by Building on a Tradition of Scientific Rigor

    Science.gov (United States)

    Graves, David

    2014-10-01

    Low temperature plasma (LTP) science is unequivocally one of the most prolific areas for varied applications in modern technology. For example, plasma etching technology is essential for reliably and rapidly patterning nanometer scale features over areas approaching one square meter with relatively inexpensive equipment. This technology enabled the telecommunication and information processing revolution that has transformed human society. I explore two concepts in this talk. The first is that the firm scientific understanding of LTP is and has been the enabling feature of these established technological applications. And the second is that LTP technology is poised to contribute to several emerging societal challenges. Beyond the important, ongoing applications of LTP science to problems of materials processing related to energy generation (e.g. thin film solar cell manufacture), there are novel and less well known potential applications in food and agriculture, infection control and medicine. In some cases, the potentially low-cost nature of the applications is so compelling that they can be thought of as examples of frugal innovation. Supported in part by NSF and DoE.

  3. Test Driven Development of Scientific Models

    Science.gov (United States)

    Clune, Thomas L.

    2012-01-01

    Test-Driven Development (TDD) is a software development process that promises many advantages for developer productivity and has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addicting after only a few days of exposure, and find it unthinkable to return to previous practices. Of course, scientific/technical software differs from other software categories in a number of important respects, but I nonetheless believe that TDD is quite applicable to the development of such software and has the potential to significantly improve programmer productivity and code quality within the scientific community. After a detailed introduction to TDD, I will present the experience within the Software Systems Support Office (SSSO) in applying the technique to various scientific applications. This discussion will emphasize the various direct and indirect benefits as well as some of the difficulties and limitations of the methodology. I will conclude with a brief description of pFUnit, a unit testing framework I co-developed to support test-driven development of parallel Fortran applications.
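
    The abstract mentions pFUnit, a unit-testing framework for parallel Fortran; the sketch below illustrates the same test-first rhythm in Python with pytest (the routine, reference values, and tolerances are illustrative assumptions, not content from the talk). In TDD the test is written, and seen to fail, before the implementation exists; running pytest on this file then exercises it.

        import math
        import pytest

        # Step 1: write the test first; it fails until the routine below exists and is correct.
        def test_saturation_vapor_pressure_at_reference_points():
            assert saturation_vapor_pressure(273.15) == pytest.approx(611.2, rel=1e-3)
            assert saturation_vapor_pressure(293.15) == pytest.approx(2339.0, rel=2e-2)

        # Step 2: write the simplest code that makes the test pass, then refactor.
        def saturation_vapor_pressure(temperature_k):
            """Saturation vapor pressure over water in Pa (Tetens-style approximation)."""
            t_c = temperature_k - 273.15
            return 611.2 * math.exp(17.62 * t_c / (t_c + 243.12))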

  4. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems.

    Science.gov (United States)

    Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E

    2011-12-22

    Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end-user interfaces.

  5. Test Driven Development of Scientific Models

    Science.gov (United States)

    Clune, Thomas L.

    2014-01-01

    Test-Driven Development (TDD), a software development process that promises many advantages for developer productivity and software reliability, has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addicting after only a few days of exposure, and find it unthinkable to return to previous practices. After a brief overview of the TDD process and my experience in applying the methodology for development activities at Goddard, I will delve more deeply into some of the challenges that are posed by numerical and scientific software as well as tools and implementation approaches that should address those challenges.

  6. Advanced Test Reactor National Scientific User Facility

    International Nuclear Information System (INIS)

    Marshall, Frances M.; Benson, Jeff; Thelen, Mary Catherine

    2011-01-01

    The Advanced Test Reactor (ATR), at the Idaho National Laboratory (INL), is a large test reactor that provides the capability to study the effects of intense neutron and gamma radiation on reactor materials and fuels. The ATR is a pressurized, light-water, high flux test reactor with a maximum operating power of 250 MWth. The INL also has several hot cells and other laboratories in which irradiated material can be examined to study material irradiation effects. In 2007 the US Department of Energy (DOE) designated the ATR as a National Scientific User Facility (NSUF) to facilitate greater access to the ATR and the associated INL laboratories for material testing research by a broader user community. This paper highlights the ATR NSUF research program and the associated educational initiatives.

  7. Advanced Test Reactor National Scientific User Facility

    Energy Technology Data Exchange (ETDEWEB)

    Frances M. Marshall; Jeff Benson; Mary Catherine Thelen

    2011-08-01

    The Advanced Test Reactor (ATR), at the Idaho National Laboratory (INL), is a large test reactor that provides the capability to study the effects of intense neutron and gamma radiation on reactor materials and fuels. The ATR is a pressurized, light-water, high flux test reactor with a maximum operating power of 250 MWth. The INL also has several hot cells and other laboratories in which irradiated material can be examined to study material irradiation effects. In 2007 the US Department of Energy (DOE) designated the ATR as a National Scientific User Facility (NSUF) to facilitate greater access to the ATR and the associated INL laboratories for material testing research by a broader user community. This paper highlights the ATR NSUF research program and the associated educational initiatives.

  8. Scientific issues in drug testing: council on scientific affairs

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Testing for drugs in biologic fluids, especially urine, is a practice that has become widespread. The technology of testing for drugs in urine has greatly improved in recent years. Inexpensive screening techniques are not sufficiently accurate for forensic testing standards, which must be met when a person's employment or reputation may be affected by results. This is particularly a concern when screening a population in which the prevalence of drug use is very low, since the predictive value of a positive result would then be quite low. Physicians should be aware that results from drug testing can yield accurate evidence of prior exposure to drugs, but they do not provide information about patterns of drug use, about abuse of or dependence on drugs, or about mental or physical impairments that may result from drug use.

  9. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    Science.gov (United States)

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  10. Rigorous Science: a How-To Guide

    Directory of Open Access Journals (Sweden)

    Arturo Casadevall

    2016-11-01

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education.

  11. Rigorous noise test and calibration check of strong-motion instrumentation at the Conrad Observatory in Austria.

    Science.gov (United States)

    Steiner, R.; Costa, G.; Lenhardt, W.; Horn, N.; Suhadolc, P.

    2012-04-01

    In the framework of the European InterregIV Italy/Austria project "HAREIA - Historical and Recent Earthquakes in Italy and Austria", the Central Institute for Meteorology and Geodynamics (ZAMG) and the Mathematics and Geosciences Department of the University of Trieste (DMG) are upgrading the transfrontier seismic network of the South-Eastern Alps with 12 new accelerometric stations to enhance the strong-motion instrument density near the Austria/Italy border. Various public institutions of the provinces Alto Adige (Bolzano Province), Veneto (ARPAV) and Friuli Venezia Giulia (Regional Civil Defense) in Italy and in the Austrian province of Tyrol are involved in the project. The site selection was carried out to improve the present local network geometry, thus meeting the needs of public institutions in the involved regions. In Tyrol and Alto Adige some strategic buildings (hospitals and public buildings) have been selected, whereas in Veneto and Friuli Venezia Giulia the sites are in the free field, mainly located near villages. The instruments will be installed in an innovative box, designed by ZAMG, that provides electrical and water isolation. The common choice regarding the instrument selection has been the new Kinemetrics Basalt® accelerograph, to guarantee homogeneity with the already installed instrumentation and compatibility with the software already in use at the different seismic institutions in the area. Prior to deployment, the equipment was tested at the Conrad Observatory and a common set-up was devised. The Conrad Observatory, being seismically particularly quiet, permits the analysis of both the sensor noise and the acquisition-system noise. The instruments were connected to the network and the data sent in real time to the ZAMG data center in Vienna and the DMG data center in Trieste. The data have been collected in the database and analyzed using the signal-processing tools PQLX and Matlab. The data analysis of the recordings at the ultra-quiet Conrad Observatory pointed out

  12. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  13. Scientific investigation plan for initial engineered barrier system field tests

    International Nuclear Information System (INIS)

    Wunan Lin.

    1993-02-01

    The purpose of this Scientific Investigation Plan (SIP) is to describe tests known as Initial Engineered Barrier System Field Tests (IEBSFT) and identified by Work Breakdown Structure as WBS 1.2.2.2.4. The IEBSFT are precursors to the Engineered Barrier System Field Test (EBSFT), WBS 1.2.2.2.4, to be conducted in the Exploratory Study Facility (ESF) at Yucca Mountain. The EBSFT and IEBSFT are designed to provide information on the interaction between waste packages (simulated by heated containers) and the surrounding rock mass, its vadose water, and infiltrated water. Heater assemblies will be installed in drift or borehole openings and heated to measure moisture movement during heat-up and subsequent cool-down of the rock mass. In some of the tests, infiltration of water into the heated rock mass will be studied. Throughout the heating and cooling cycle, instruments installed in the rock will monitor such parameters as temperature, moisture content, concentration of some chemical species, and stress and strain. Rock permeability measurements, rock and fluid (water and gas) sampling, and fracture pattern measurements will also be made before and after the test.

  14. HOW TO SELECT APPROPRIATE STATISTICAL TEST IN SCIENTIFIC ARTICLES

    Directory of Open Access Journals (Sweden)

    Vladimir Trajkovski

    2016-09-01

    Statistics is the mathematical science dealing with the collection, analysis, interpretation, and presentation of masses of numerical data in order to draw relevant conclusions. Statistics is a form of mathematical analysis that uses quantified models, representations and synopses for a given set of experimental data or real-life studies. Students and young researchers in biomedical sciences and in special education and rehabilitation often declare that they have chosen to enroll in that study program because they lack knowledge of or interest in mathematics. This is a sad statement, but there is much truth in it. The aim of this editorial is to help young researchers select the statistical techniques and statistical software appropriate for the purposes and conditions of a particular analysis. The most important statistical tests are reviewed in the article. Knowing how to choose the right statistical test is an important asset and decision in research data processing and in the writing of scientific papers. Young researchers and authors should know how to choose and how to use statistical methods. The competent researcher will need knowledge of statistical procedures. That might include an introductory statistics course, and it most certainly includes using a good statistics textbook. For this purpose, Statistics needs to be returned as a mandatory subject to the curriculum of the Institute of Special Education and Rehabilitation at the Faculty of Philosophy in Skopje. Young researchers need additional courses in statistics, and they need to train themselves to use statistical software in an appropriate way.

  15. Advanced Test Reactor National Scientific User Facility Partnerships

    International Nuclear Information System (INIS)

    Marshall, Frances M.; Allen, Todd R.; Benson, Jeff B.; Cole, James I.; Thelen, Mary Catherine

    2012-01-01

    In 2007, the United States Department of Energy designated the Advanced Test Reactor (ATR), located at Idaho National Laboratory, as a National Scientific User Facility (NSUF). This designation made test space within the ATR and post-irradiation examination (PIE) equipment at INL available for use by researchers via a proposal and peer review process. The goal of the ATR NSUF is to give researchers with the best ideas access to the most advanced test capability, regardless of the proposer's physical location. Since 2007, the ATR NSUF has expanded its available reactor test space, and obtained access to additional PIE equipment. Recognizing that INL may not have all the desired PIE equipment, or that some equipment may become oversubscribed, the ATR NSUF established a Partnership Program. This program enables and facilitates user access to several university and national laboratories. So far, seven universities and one national laboratory have been added to the ATR NSUF with capability that includes reactor-testing space, PIE equipment, and ion beam irradiation facilities. With the addition of these universities, irradiation can occur in multiple reactors and post-irradiation exams can be performed at multiple universities. In each case, the choice of facilities is based on the user's technical needs. Universities and laboratories included in the ATR NSUF partnership program are as follows: (1) Nuclear Services Laboratories at North Carolina State University; (2) PULSTAR Reactor Facility at North Carolina State University; (3) Michigan Ion Beam Laboratory (1.7 MV Tandetron accelerator) at the University of Michigan; (4) Irradiated Materials at the University of Michigan; (5) Harry Reid Center Radiochemistry Laboratories at University of Nevada, Las Vegas; (6) Characterization Laboratory for Irradiated Materials at the University of Wisconsin-Madison; (7) Tandem Accelerator Ion Beam (1.7 MV terminal voltage tandem ion accelerator) at the University of Wisconsin

  16. Advanced Test Reactor National Scientific User Facility 2010 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Mary Catherine Thelen; Todd R. Allen

    2011-05-01

    This is the 2010 ATR National Scientific User Facility Annual Report. This report provides an overview of the program for 2010, along with individual project reports from each of the university principal investigators. The report also describes the capabilities offered to university researchers here at INL and at the ATR NSUF partner facilities.

  17. Scientific issues related to the cytology proficiency testing regulations

    Directory of Open Access Journals (Sweden)

    Prey Marianne

    2006-01-01

    The member organizations of the Cytology Education and Technology Consortium believe there are significant flaws in the current cytology proficiency testing regulations. The most immediately needed modifications include lengthening the required testing interval, utilizing stringently validated and continuously monitored slides, changing the grading scheme, and changing the focus of the test from individual-level to laboratory-level testing. Integration of new computer-assisted and location-guided screening technologies into the testing protocols is necessary for the testing protocol to be compliant with the law.

  18. Generating pseudo test collections for learning to rank scientific articles

    NARCIS (Netherlands)

    Berendsen, R.; Tsagkias, M.; de Rijke, M.; Meij, E.

    2012-01-01

    Pseudo test collections are automatically generated to provide training material for learning to rank methods. We propose a method for generating pseudo test collections in the domain of digital libraries, where data is relatively sparse, but comes with rich annotations. Our intuition is that

  19. Putrefactive rigor: apparent rigor mortis due to gas distension.

    Science.gov (United States)

    Gill, James R; Landi, Kristen

    2011-09-01

    Artifacts due to decomposition may cause confusion for the initial death investigator, leading to an incorrect suspicion of foul play. Putrefaction is a microorganism-driven process that results in foul odor, skin discoloration, purge, and bloating. Various decompositional gases including methane, hydrogen sulfide, carbon dioxide, and hydrogen will cause the body to bloat. We describe 3 instances of putrefactive gas distension (bloating) that produced the appearance of inappropriate rigor, so-called putrefactive rigor. These gases may distend the body to an extent that the extremities extend and lose contact with their underlying support surface. The medicolegal investigator must recognize that this is not true rigor mortis and the body was not necessarily moved after death for this gravity-defying position to occur.

  20. WASP (Write a Scientific Paper) using Excel - 11: Test characteristics.

    Science.gov (United States)

    Grech, Victor

    2018-07-01

    The calculation of various test characteristics may be required as part of a data analysis exercise. This paper explains how to set up these calculations in Microsoft Excel in order to obtain sensitivity, specificity, positive and negative predictive values, diagnostic accuracy and prevalence. Copyright © 2018 Elsevier B.V. All rights reserved.
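
    The article sets these calculations up in Excel; the snippet below computes the same standard test characteristics from a 2x2 table in Python (the counts are made-up illustrative numbers, not data from the article).

        def test_characteristics(tp, fp, fn, tn):
            """Standard diagnostic-test characteristics from a 2x2 confusion table."""
            total = tp + fp + fn + tn
            return {
                "sensitivity": tp / (tp + fn),    # true positive rate
                "specificity": tn / (tn + fp),    # true negative rate
                "ppv": tp / (tp + fp),            # positive predictive value
                "npv": tn / (tn + fn),            # negative predictive value
                "accuracy": (tp + tn) / total,    # diagnostic accuracy
                "prevalence": (tp + fn) / total,  # proportion with the condition
            }

        if __name__ == "__main__":
            for name, value in test_characteristics(tp=90, fp=25, fn=10, tn=875).items():
                print(f"{name}: {value:.3f}")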

  1. Mathematical Rigor in Introductory Physics

    Science.gov (United States)

    Vandyke, Michael; Bassichis, William

    2011-10-01

    Calculus-based introductory physics courses intended for future engineers and physicists are often designed and taught in the same fashion as those intended for students of other disciplines. A more mathematically rigorous curriculum should be more appropriate and, ultimately, more beneficial for the student in his or her future coursework. This work investigates the effects of mathematical rigor on student understanding of introductory mechanics. Using a series of diagnostic tools in conjunction with individual student course performance, a statistical analysis will be performed to examine student learning of introductory mechanics and its relation to student understanding of the underlying calculus.

  2. How Individual Scholars Can Reduce the Rigor-Relevance Gap in Management Research

    OpenAIRE

    Wolf, Joachim; Rosenberg, Timo

    2012-01-01

    This paper discusses a number of avenues management scholars could follow to reduce the existing gap between scientific rigor and practical relevance without relativizing the importance of the first goal dimension. Such changes are necessary because many management studies do not fully exploit the possibilities to increase their practical relevance while maintaining scientific rigor. We argue that this rigor-relevance gap is not only the consequence of the currently prevailing institutional c...

  3. A case of instantaneous rigor?

    Science.gov (United States)

    Pirch, J; Schulz, Y; Klintschar, M

    2013-09-01

    The question of whether instantaneous rigor mortis (IR), the hypothetical sudden stiffening of the muscles upon death, actually exists has been controversially debated over the last 150 years. While modern German forensic literature rejects this concept, the contemporary British literature is more willing to embrace it. We present the case of a young woman who suffered from diabetes and who was found dead in an upright standing position with back and shoulders leaned against a punchbag and a cupboard. Rigor mortis was fully established, and livor mortis was strong and consistent with the position in which the body was found. After autopsy and toxicological analysis, it was concluded that death most probably occurred due to a ketoacidotic coma, with markedly increased values of glucose and lactate in the cerebrospinal fluid as well as acetone in blood and urine. Although the position of the body is most unusual, a detailed analysis revealed that it is a stable position even without rigor mortis. Therefore, this case does not further support the controversial concept of IR.

  4. The Analysis of Students' Scientific Reasoning Ability in Solving the Modified Lawson Classroom Test of Scientific Reasoning (MLCTSR) Problems by Applying the Levels of Inquiry

    Directory of Open Access Journals (Sweden)

    N. Novia

    2017-04-01

    Full Text Available This study aims to determine students' achievement in answering Modified Lawson Classroom Test of Scientific Reasoning (MLCTSR) questions, both overall and for each aspect of scientific reasoning ability. Six aspects of scientific reasoning were measured: conservation reasoning, proportional reasoning, control of variables, combinatorial reasoning, probabilistic reasoning, and correlational reasoning. The research was also conducted to examine the development of scientific reasoning under the levels of inquiry models. Students' reasoning ability was measured using the MLCTSR, a test developed from Lawson's Classroom Test of Scientific Reasoning (LCTSR, 2000) and consisting of 12 multiple-choice questions. The research method chosen in this study is a descriptive quantitative method, using a One Group Pretest-Posttest Design. The population comprised all grade VII junior high school students in the 2014/2015 academic year at one junior high school in Bandung; the sample, selected by purposive sampling, was one class, class VII C. The results showed an increase in scientific reasoning, although the gain was small.

  5. Long persistence of rigor mortis at constant low temperature.

    Science.gov (United States)

    Varetto, Lorenzo; Curto, Ombretta

    2005-01-06

    We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found complete rigor persisting for 10 days in all the cadavers kept under observation, and in one case rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete to partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers kept under observation "à outrance" the absolute resolution of rigor mortis occurred on the 28th day. Our results prove that the persistence of rigor mortis can be much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. This finding must therefore be considered when a corpse is found in such conditions, so that the estimation of the time of death is not misled by the long persistence of rigor mortis.

  6. Relevant climate response tests for stratospheric aerosol injection: A combined ethical and scientific analysis

    Science.gov (United States)

    Lenferna, Georges Alexandre; Russotto, Rick D.; Tan, Amanda; Gardiner, Stephen M.; Ackerman, Thomas P.

    2017-06-01

    In this paper, we focus on stratospheric sulfate injection as a geoengineering scheme, and provide a combined scientific and ethical analysis of climate response tests, which are a subset of outdoor tests that would seek to impose detectable and attributable changes to climate variables on global or regional scales. We assess the current state of scientific understanding on the plausibility and scalability of climate response tests. Then, we delineate a minimal baseline against which to consider whether certain climate response tests would be relevant for a deployment scenario. Our analysis shows that some climate response tests, such as those attempting to detect changes in regional climate impacts, may not be deployable in time periods relevant to realistic geoengineering scenarios. This might pose significant challenges for justifying stratospheric sulfate aerosol injection deployment overall. We then survey some of the major ethical challenges that proposed climate response tests face. We consider what levels of confidence would be required to ethically justify approving a proposed test; whether the consequences of tests are subject to similar questions of justice, compensation, and informed consent as full-scale deployment; and whether questions of intent and hubris are morally relevant for climate response tests. We suggest further research into laboratory-based work and modeling may help to narrow the scientific uncertainties related to climate response tests, and help inform future ethical debate. However, even if such work is pursued, the ethical issues raised by proposed climate response tests are significant and manifold.

  7. Group of scientific experts third technical test (GSETT-III) experiences

    International Nuclear Information System (INIS)

    Dahlman, O.

    1999-01-01

    The purpose of the established verification system is to provide confidence through adequate monitoring, deter clandestine activities and counteract 'false alarms'. The task of the Group of Scientific Experts was to design and test the seismic verification system, including designing the international system, sharing knowledge from national programs, encouraging the establishment of new monitoring facilities, developing data analysis procedures, conducting large-scale testing and training experts.

  8. A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering

    Science.gov (United States)

    Ackerman, T. P.

    2017-12-01

    Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending this same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB has substantial differences to SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime of weeks as opposed to years for SAI. We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.

  9. Results of data base management system parameterized performance testing related to GSFC scientific applications

    Science.gov (United States)

    Carchedi, C. H.; Gough, T. L.; Huston, H. A.

    1983-01-01

    The results of a variety of tests designed to demonstrate and evaluate the performance of several commercially available data base management system (DBMS) products compatible with the Digital Equipment Corporation VAX 11/780 computer system are summarized. The tests were performed on the INGRES, ORACLE, and SEED DBMS products employing applications that were similar to scientific applications under development by NASA. The objectives of this testing included determining the strengths and weaknesses of the candidate systems, the performance trade-offs of various design alternatives, and the impact of some installation and environmental (computer related) influences.

  10. Realizing rigor in the mathematics classroom

    CERN Document Server

    Hull, Ted H (Henry); Balka, Don S

    2014-01-01

    Rigor put within reach! Rigor: The Common Core has made it policy, and this first-of-its-kind guide takes math teachers and leaders through the process of making it reality. Using the Proficiency Matrix as a framework, the authors offer proven strategies and practical tools for successful implementation of the CCSS mathematical practices, with rigor as a central objective. You'll learn how to: define rigor in the context of each mathematical practice; identify and overcome potential issues, including differentiating instruction and using data

  11. Testing foreign language impact on engineering students' scientific problem-solving performance

    Science.gov (United States)

    Tatzl, Dietmar; Messnarz, Bernd

    2013-12-01

    This article investigates the influence of English as the examination language on the solution of physics and science problems by non-native speakers in tertiary engineering education. For that purpose, a total of 96 students in four year groups from freshman to senior level participated in a testing experiment in the Degree Programme of Aviation at the FH JOANNEUM University of Applied Sciences, Graz, Austria. Half of each test group were given a set of 12 physics problems described in German; the other half received the same set of problems described in English. The goal was to test the linguistic reading comprehension necessary for scientific problem solving rather than physics knowledge as such. The results imply that written undergraduate English-medium engineering tests and examinations may not require additional examination time or language-specific aids for students who have reached university-entrance proficiency in English as a foreign language.

  12. Performance Validity Testing in Neuropsychology: Scientific Basis and Clinical Application-A Brief Review.

    Science.gov (United States)

    Greher, Michael R; Wodushek, Thomas R

    2017-03-01

    Performance validity testing refers to neuropsychologists' methodology for determining whether neuropsychological test performances completed in the course of an evaluation are valid (ie, the results of true neurocognitive function) or invalid (ie, overly impacted by the patient's effort/engagement in testing). This determination relies upon the use of either standalone tests designed for this sole purpose, or specific scores/indicators embedded within traditional neuropsychological measures that have demonstrated this utility. In response to a greater appreciation for the critical role that performance validity issues play in neuropsychological testing and the need to measure this variable to the best of our ability, the scientific base for performance validity testing has expanded greatly over the last 20 to 30 years. As such, the majority of current day neuropsychologists in the United States use a variety of measures for the purpose of performance validity testing as part of everyday forensic and clinical practice and address this issue directly in their evaluations. The following is the first article of a 2-part series that will address the evolution of performance validity testing in the field of neuropsychology, both in terms of the science as well as the clinical application of this measurement technique. The second article of this series will review performance validity tests in terms of methods for development of these measures, and maximizing of diagnostic accuracy.

  13. Classroom Talk for Rigorous Reading Comprehension Instruction

    Science.gov (United States)

    Wolf, Mikyung Kim; Crosson, Amy C.; Resnick, Lauren B.

    2004-01-01

    This study examined the quality of classroom talk and its relation to academic rigor in reading-comprehension lessons. Additionally, the study aimed to characterize effective questions to support rigorous reading comprehension lessons. The data for this study included 21 reading-comprehension lessons in several elementary and middle schools from…

  14. Evaluation of the Thermo Scientific SureTect Listeria species assay. AOAC Performance Tested Method 071304.

    Science.gov (United States)

    Cloke, Jonathan; Evans, Katharine; Crabtree, David; Hughes, Annette; Simpson, Helen; Holopainen, Jani; Wickstrand, Nina; Kauppinen, Mikko; Leon-Velarde, Carlos; Larson, Nathan; Dave, Keron

    2014-01-01

    The Thermo Scientific SureTect Listeria species Assay is a new real-time PCR assay for the detection of all species of Listeria in food and environmental samples. This validation study was conducted using the AOAC Research Institute (RI) Performance Tested Methods program to validate the SureTect Listeria species Assay in comparison to the reference method detailed in International Organization for Standardization 11290-1:1996, including amendment 1:2004, in a variety of foods plus plastic and stainless steel. The food matrixes validated were smoked salmon, processed cheese, fresh bagged spinach, cantaloupe, cooked prawns, cooked sliced turkey meat, cooked sliced ham, salami, pork frankfurters, and raw ground beef. All matrixes were tested by Thermo Fisher Scientific, Microbiology Division, Basingstoke, UK. In addition, three matrixes (pork frankfurters, fresh bagged spinach, and stainless steel surface samples) were analyzed independently as part of the AOAC-RI-controlled independent laboratory study by the University of Guelph, Canada. Using probability of detection statistical analysis, a significant difference in favour of the SureTect assay was demonstrated between the SureTect and reference methods for high-level spiked samples of pork frankfurters, smoked salmon, cooked prawns, and stainless steel, and for low-level spiked samples of salami. For all other matrixes, no significant difference was seen between the two methods during the study. Inclusivity testing was conducted with 68 different isolates of Listeria species, all of which were detected by the SureTect Listeria species Assay. None of the 33 exclusivity isolates were detected by the SureTect Listeria species Assay. Ruggedness testing was conducted to evaluate the performance of the assay with specific method deviations outside of the recommended parameters open to variation, which demonstrated that the assay gave reliable performance. Accelerated stability testing was additionally conducted, validating the assay
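
    As a rough illustration of the kind of comparison reported above, the sketch below computes a probability of detection (POD) for each method from spiked-replicate counts and the difference between them (dPOD). The counts, the Wilson-interval shortcut, and the decision rule are simplifications chosen for illustration; this is not the AOAC LPOD procedure used in the actual study.

```python
# Simplified illustration of a probability of detection (POD) comparison; the counts
# are hypothetical and the interval is a plain Wilson score interval, not AOAC LPOD.
import math


def pod_with_ci(x: int, n: int, z: float = 1.96):
    """POD = x/n with a Wilson score ~95% confidence interval."""
    p = x / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, centre - half, centre + half


# Hypothetical counts: positive results out of 20 spiked replicates per method.
cand = pod_with_ci(x=18, n=20)
ref = pod_with_ci(x=12, n=20)
dpod = cand[0] - ref[0]
print(f"candidate POD {cand[0]:.2f} ({cand[1]:.2f}-{cand[2]:.2f})")
print(f"reference POD {ref[0]:.2f} ({ref[1]:.2f}-{ref[2]:.2f})")
print(f"dPOD {dpod:.2f}  # a difference is flagged when the dPOD interval excludes 0")
```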

  15. Operational Review of the First Wireline In Situ Stress Test in Scientific Ocean Drilling

    Directory of Open Access Journals (Sweden)

    Casey Moore

    2012-04-01

    Full Text Available Scientific ocean drilling's first in situ stress measurement was made at Site C0009A during Integrated Ocean Drilling Program (IODP) Expedition 319 as part of Nankai Trough Seismogenic Zone Experiment (NanTroSEIZE) Stage 2. The Modular Formation Dynamics Tester (MDT, Schlumberger) wireline logging tool was deployed in riser Hole C0009A to measure in situ formation pore pressure, formation permeability (often reported as mobility = permeability/viscosity), and the least principal stress (S3) at several isolated depths (Saffer et al., 2009; Expedition 319 Scientists, 2010). The importance of in situ stress measurements lies not only in their scientific interest for active tectonic drilling, but also in geomechanical and well bore stability analyses. Certain in situ tools were not previously available for scientific ocean drilling due to the borehole diameter and open hole limits of riserless drilling. The riser-capable drillship, D/V Chikyu, now in service for IODP expeditions, allows all of the techniques available to estimate the magnitudes and orientations of 3-D stresses to be used. These techniques include downhole density logging for vertical stress, breakout and caliper log analyses for maximum horizontal stress, core-based anelastic strain recovery (ASR, used in the NanTroSEIZE expeditions in 2007–2008), leak-off tests (Lin et al., 2008), and minifrac/hydraulic fracturing (NanTroSEIZE Expedition 319 in 2009). In this report, the whole operational planning process related to in situ measurements is reviewed, and lessons learned from Expedition 319 are summarized for efficient planning and testing in the future.

  16. Rigor mortis in an unusual position: Forensic considerations.

    Science.gov (United States)

    D'Souza, Deepak H; Harish, S; Rajesh, M; Kiran, J

    2011-07-01

    We report a case in which the dead body was found with rigor mortis in an unusual position. The body was lying on its back with the limbs raised, defying gravity, and the direction of the salivary stains on the face also defied gravity. We opined that the scene of occurrence of the crime was unlikely to be the final place where the dead body was found. The clues pointed to a homicidal offence and an attempt to destroy the evidence. The forensic use of 'rigor mortis in an unusual position' lies in furthering the investigation and in the scientific confirmation of two facts: that the scene of death (occurrence) is different from the scene of disposal of the dead body, and that a time gap exists between the two.

  17. Analysis of student’s scientific literacy skills through socioscientific issue’s test on biodiversity topics

    Science.gov (United States)

    Purwani, L. D.; Sudargo, F.; Surakusumah, W.

    2018-05-01

    The aim of this study was to describe students' scientific literacy skills on biodiversity topics in grade X of senior high school. The dimensions of scientific literacy assessed were science competencies and attitudes towards science. A science competency test and an attitude rating scale based on a biodiversity socio-scientific issue were used to measure scientific literacy skills. The results showed that students' scientific literacy skills are low for the science competence dimension (15.84% for class A and 19.50% for class B) and also for the attitude towards science dimension (31.15% for class A and 37.05% for class B). We concluded that students' scientific literacy skills are low overall (23.49% and 28.55%).

  18. Evaluation of the Thermo Scientific SureTect Salmonella species assay. AOAC Performance Tested Method 051303.

    Science.gov (United States)

    Cloke, Jonathan; Clark, Dorn; Radcliff, Roy; Leon-Velarde, Carlos; Larson, Nathan; Dave, Keron; Evans, Katharine; Crabtree, David; Hughes, Annette; Simpson, Helen; Holopainen, Jani; Wickstrand, Nina; Kauppinen, Mikko

    2014-01-01

    The Thermo Scientific SureTect Salmonella species Assay is a new real-time PCR assay for the detection of Salmonellae in food and environmental samples. This validation study was conducted using the AOAC Research Institute (RI) Performance Tested Methods program to validate the SureTect Salmonella species Assay in comparison to the reference method detailed in International Organization for Standardization 6579:2002 in a variety of food matrixes, namely, raw ground beef, raw chicken breast, raw ground pork, fresh bagged lettuce, pork frankfurters, nonfat dried milk powder, cooked peeled shrimp, pasteurized liquid whole egg, ready-to-eat meal containing beef, and stainless steel surface samples. With the exception of liquid whole egg and fresh bagged lettuce, which were tested in-house, all matrixes were tested by Marshfield Food Safety, Marshfield, WI, on behalf of Thermo Fisher Scientific. In addition, three matrixes (pork frankfurters, lettuce, and stainless steel surface samples) were analyzed independently as part of the AOAC-RI-controlled laboratory study by the University of Guelph, Canada. No significant difference by probability of detection or McNemar's Chi-squared statistical analysis was found between the candidate and reference methods for any of the food matrixes or environmental surface samples tested during the validation study. Inclusivity and exclusivity testing was conducted with 117 and 36 isolates, respectively, which demonstrated that the SureTect Salmonella species Assay was able to detect all the major groups of Salmonella enterica subspecies enterica (e.g., Typhimurium), the less common subspecies of S. enterica (e.g., arizonae), and the rarely encountered S. bongori. None of the exclusivity isolates analyzed were detected by the SureTect Salmonella species Assay. Ruggedness testing was conducted to evaluate the performance of the assay with specific method deviations outside of the recommended parameters open to variation (enrichment time

  19. A Polar Rover for Large-Scale Scientific Surveys: Design, Implementation and Field Test Results

    Directory of Open Access Journals (Sweden)

    Yuqing He

    2015-10-01

    Full Text Available Exploration of polar regions is of great importance to scientific research. Unfortunately, due to the harsh environment, most regions of the Antarctic continent are still unreachable for humankind. Therefore, in 2011, the Chinese National Antarctic Research Expedition (CHINARE) launched a project to design a rover to conduct large-scale scientific surveys on the Antarctic. The main challenges for the rover are twofold: one is mobility, i.e., how to make a rover that can survive the harsh environment and safely move on the uneven, icy and snowy terrain; the other is autonomy, in that the robot should be able to move at a relatively high speed with little or no human intervention so that it can explore a large region in a limited time interval under the communication constraints. In this paper, the corresponding techniques, especially the polar rover's design and autonomous navigation algorithms, are introduced in detail. Subsequently, an experimental report of the field tests on the Antarctic is given to provide a preliminary evaluation of the rover. Finally, experiences and remaining challenging problems are summarized.

  20. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
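
    The abstract does not specify which disproportionality statistic the prototype uses, so the sketch below illustrates one common choice, the proportional reporting ratio (PRR) with an approximate 95% confidence interval, computed from a 2x2 table of hypothetical MeSH-indexed citation counts. It is an assumed illustration of the general idea, not the FDA tool's scoring method.

```python
# Hedged sketch of one common disproportionality signal score, the proportional
# reporting ratio (PRR); counts are hypothetical and the FDA prototype's exact
# scoring method is not reproduced here.
import math


def prr(a: int, b: int, c: int, d: int):
    """a: target drug with target event, b: target drug with other events,
    c: other drugs with target event, d: other drugs with other events.
    Returns PRR with an approximate 95% confidence interval."""
    value = (a / (a + b)) / (c / (c + d))
    se_log = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lower = math.exp(math.log(value) - 1.96 * se_log)
    upper = math.exp(math.log(value) + 1.96 * se_log)
    return value, lower, upper

# Hypothetical counts of indexed citations.
print(prr(a=25, b=475, c=100, d=19400))
```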

  1. Advancing nuclear technology and research. The advanced test reactor national scientific user facility

    Energy Technology Data Exchange (ETDEWEB)

    Benson, Jeff B; Marshall, Frances M [Idaho National Laboratory, Idaho Falls, ID (United States); Allen, Todd R [Univ. of Wisconsin, Madison, WI (United States)

    2012-03-15

    The Advanced Test Reactor (ATR), at the Idaho National Laboratory (INL), is one of the world's premier test reactors, providing the capability for studying the effects of intense neutron and gamma radiation on reactor materials and fuels. The INL also has several hot cells and other laboratories in which irradiated material can be examined to study material radiation effects. In 2007 the US Department of Energy (DOE) designated the ATR as a National Scientific User Facility (NSUF) to facilitate greater access to the ATR and the associated INL laboratories for material testing research. The mission of the ATR NSUF is to provide access to world-class facilities, thereby facilitating the advancement of nuclear science and technology. Cost-free access to the ATR, INL post-irradiation examination facilities, and partner facilities is granted based on technical merit to U.S. university-led experiment teams conducting non-proprietary research. Proposals are selected via independent technical peer review and relevance to the United States Department of Energy mission. To increase overall research capability, ATR NSUF seeks to form strategic partnerships with university facilities that add significant nuclear research capability to the ATR NSUF and are accessible to all ATR NSUF users. (author)

  2. Experimental evaluation of rigor mortis. V. Effect of various temperatures on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T

    1981-01-01

    Objective measurements were carried out to study the evolution of rigor mortis in rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increasing temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.

  3. Mount Elbert Gas Hydrate Stratigraphic Test Well, Alaska North Slope: Overview of scientific and technical program

    Science.gov (United States)

    Hunter, R.B.; Collett, T.S.; Boswell, R.; Anderson, B.J.; Digert, S.A.; Pospisil, G.; Baker, R.; Weeks, M.

    2011-01-01

    scientific research programs can be safely, effectively, and efficiently conducted within ANS infrastructure. The program success resulted in a technical team recommendation to project management to drill and complete a long-term production test within the area of existing ANS infrastructure. If approved by stakeholders, this long-term test would build on prior arctic research efforts to better constrain the potential gas rates and volumes that could be produced from gas hydrate-bearing sand reservoirs. ?? 2010 Elsevier Ltd.

  4. "Rigor mortis" in a live patient.

    Science.gov (United States)

    Chakravarthy, Murali

    2010-03-01

    Rigor mortis is conventionally a postmortem change. Its occurrence suggests that death occurred at least a few hours earlier. The authors report a case of "rigor mortis" in a live patient after cardiac surgery. The factors likely to have predisposed to such premortem muscle stiffening in the reported patient are an intense low cardiac output state, the use of unusually high doses of inotropic and vasopressor agents, and likely sepsis. Such an event may be of importance when determining the time of death in individuals such as the one described in this report. It may also suggest the need for careful examination of patients with muscle stiffening prior to declaration of death. This report is being published to point out the controversies that might arise from muscle stiffening, which should not always be termed rigor mortis and/or assumed to be postmortem.

  5. [Rigor mortis -- a definite sign of death?].

    Science.gov (United States)

    Heller, A R; Müller, M P; Frank, M D; Dressler, J

    2005-04-01

    In recent years there has been an ongoing controversial debate in Germany regarding the quality of the coroner's inquest and the declaration of death by physicians. We report the case of a 90-year-old female who was found an unknown time after a suicide attempt with benzodiazepines. Examination of the patient showed livores (mortis?) on the left forearm and left lower leg. Moreover, rigor (mortis?) of the left arm was apparent, preventing arm flexion and extension. The hypothermic patient, with insufficient respiration, was intubated and mechanically ventilated. Chest compressions were not performed because central pulses were (barely) palpable and a sinus bradycardia of 45/min (second-degree AV block and isolated premature ventricular complexes) was present. After placement of an intravenous line (17 G, external jugular vein), the hemodynamic situation was stabilized with intermittent boli of epinephrine and with sodium bicarbonate. With improved circulation, the livores and rigor disappeared. In the present case a minimal central circulation was noted, which could be stabilized despite the presence of supposedly certain signs of death (livores and rigor mortis). Considering the finding of abrogated peripheral perfusion (livores), we postulate a centripetal collapse of the glycogen and ATP supply in the patient's left arm (rigor), which was restored after resuscitation and reperfusion. Thus, it appears that livores and rigor are not sufficient to exclude a vita minima, in particular in hypothermic patients with intoxications. Consequently, a careful ABC check should be performed even in the presence of apparently certain signs of death, to avoid underdiagnosing a vita minima. Additional ECG monitoring is required to reduce the rate of false positive declarations of death. To what extent basic life support by paramedics should commence when rigor and livores are present, pending a physician's do-not-resuscitate order, deserves further discussion.

  6. An ultramicroscopic study on rigor mortis.

    Science.gov (United States)

    Suzuki, T

    1976-01-01

    Gastrocnemius muscles taken from decapitated mice at various intervals after death and from mice killed by 2,4-dinitrophenol or mono-iodoacetic acid injection to induce rigor mortis soon after death, were observed by electron microscopy. The prominent appearance of many fine cross striations in the myofibrils (occurring about every 400 A) was considered to be characteristic of rigor mortis. These striations were caused by minute granules studded along the surfaces of both thick and thin filaments and appeared to be the bridges connecting the 2 kinds of filaments and accounted for the hardness and rigidity of the muscle.

  7. The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System

    Science.gov (United States)

    Mixon, Jason; Stuart, Jerry

    2009-01-01

    In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominant vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…

  8. What Constitutes Science and Scientific Evidence: Roles of Null Hypothesis Testing

    Science.gov (United States)

    Chang, Mark

    2017-01-01

    We briefly discuss the philosophical basis of science, causality, and scientific evidence, by introducing the hidden but most fundamental principle of science: the similarity principle. The principle's use in scientific discovery is illustrated with Simpson's paradox and other examples. In discussing the value of null hypothesis statistical…

  9. Advanced Test Reactor National Scientific User Facility (ATR NSUF) Monthly Report October 2014

    Energy Technology Data Exchange (ETDEWEB)

    Ogden, Dan [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-10-01

    Advanced Test Reactor National Scientific User Facility (ATR NSUF) Monthly Report October 2014 Highlights • Rory Kennedy, Dan Ogden and Brenden Heidrich traveled to Germantown October 6-7, for a review of the Infrastructure Management mission with Shane Johnson, Mike Worley, Bradley Williams and Alison Hahn from NE-4 and Mary McCune from NE-3. Heidrich briefed the group on the project progress from July to October 2014 as well as the planned path forward for FY15. • Jim Cole gave two invited university seminars at Ohio State University and University of Florida, providing an overview of NSUF including available capabilities and the process for accessing facilities through the peer reviewed proposal process. • Jim Cole and Rory Kennedy co-chaired the NuMat meeting with Todd Allen. The meeting, sponsored by Elsevier publishing, was held in Clearwater, Florida, and is considered one of the premier nuclear fuels and materials conferences. Over 340 delegates attended with 160 oral and over 200 posters presented over 4 days. • Thirty-one pre-applications were submitted for NSUF access through the NE-4 Combined Innovative Nuclear Research Funding Opportunity Announcement. • Fourteen proposals were received for the NSUF Rapid Turnaround Experiment Summer 2014 call. Proposal evaluations are underway. • John Jackson and Rory Kennedy attended the Nuclear Fuels Industry Research meeting. Jackson presented an overview of ongoing NSUF industry research.

  10. Trends: Rigor Mortis in the Arts.

    Science.gov (United States)

    Blodget, Alden S.

    1991-01-01

    Outlines how past art education provided a refuge for students from the rigors of other academic subjects. Observes that in recent years art education has become "discipline based." Argues that art educators need to reaffirm their commitment to a humanistic way of knowing. (KM)

  11. Photoconductivity of amorphous silicon-rigorous modelling

    International Nuclear Information System (INIS)

    Brada, P.; Schauer, F.

    1991-01-01

    It is our great pleasure to express our gratitude to Prof. Grigorovici, the pioneer of the exciting field of the amorphous state, with this modest contribution to the area. In this paper we present an outline of a rigorous modelling program for the steady-state photoconductivity in amorphous silicon and related materials. (Author)

  12. Rover-Based Instrumentation and Scientific Investigations During the 2012 Analog Field Test on Mauna Kea Volcano, Hawaii

    Science.gov (United States)

    Graham, L. D.; Graff, T. G.

    2013-01-01

    Rover-based 2012 Moon and Mars Analog Mission Activities (MMAMA) were recently completed on Mauna Kea Volcano, Hawaii. Scientific investigations, scientific input, and operational constraints were tested in the context of the existing project and protocols for the field activities designed to help NASA achieve the Vision for Space Exploration [1]. Several investigations were conducted with the rover-mounted instruments to determine key geophysical and geochemical properties of the site, as well as to capture the geological context of the area and the samples investigated. The rover traverse and associated science investigations were conducted over a three-day period on the southeast flank of Mauna Kea Volcano, Hawaii. The test area was at an elevation of 11,500 feet and is known as "Apollo Valley" (Fig. 1). Here we report the integration and operation of the rover-mounted instruments, as well as the scientific investigations that were conducted.

  13. Scientific design of the test facility for the KNGR DVI line small break LOCA

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Byong Jo; Park, Choon Kyung; Jun, Hyung Gil; Cho, Seok; Kwon, Tae Soon; Song, Chul Hwa; Kim, Jung Taek

    1999-03-01

    Scientific design of the experimental facility (OASIS) for the KNGR (Korea Next Generation Reactor) DVI line SB-LOCA simulation is carried out. The main purpose of the OASIS is to produce a thermal-hydraulic data base for determining the best location of the DVI (Direct Vessel Injection) nozzle of the KNGR as well as verifying its design performance in view of the ECCS (Emergency Core Cooling System) effectiveness. The experimental facility is designed based on Ishii's three-level scaling law. The facility has a 1/4 height and 1/341 area scaling ratio, corresponding to a volume scale of 1/1364. The power scaling is 1/682 and the system pressure is prototypic. The OASIS consists of a core, a downcomer, two steam generators, two pump simulators, a break simulator, a collection tank, primary piping, and a circulation pump for establishing initial test conditions. Each component is designed based on Ishii's global scaling and boundary flow scaling of mass, energy and momentum. In addition, local phenomena scaling is carried out for the design of major components to preserve key local phenomena in each component. Most of the key phenomena are well preserved in the OASIS. However, the local scaling analysis shows that distortions of the void fraction and mixture level cannot be avoided in the core. This stems from the basic features of Ishii's scaling law in the case of reduced-height simulation. However, it is expected that these distortions can be analyzed properly by a best-estimate system analysis code. (Author). 22 refs., 20 tabs., 25 figs.
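
    The quoted scale ratios are mutually consistent if, as assumed here for a reduced-height Ishii-type scaling, the velocity ratio goes as the square root of the height ratio so that core power scales with flow area times the square root of height. The short check below only verifies that arithmetic; it is not a substitute for the report's scaling analysis.

```python
# Consistency check of the quoted OASIS scale ratios under the stated assumption
# (velocity ratio ~ sqrt(height ratio), so power ratio ~ area ratio * sqrt(height ratio)).
height_ratio = 1 / 4
area_ratio = 1 / 341

volume_ratio = height_ratio * area_ratio        # expected 1/1364, as quoted
power_ratio = area_ratio * height_ratio ** 0.5  # expected 1/682, as quoted

print(f"volume ratio 1/{1 / volume_ratio:.0f}")
print(f"power ratio  1/{1 / power_ratio:.0f}")
```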

  14. The advanced test reactor national scientific user facility advancing nuclear technology

    International Nuclear Information System (INIS)

    Allen, T.R.; Thelen, M.C.; Meyer, M.K.; Marshall, F.M.; Foster, J.; Benson, J.B.

    2009-01-01

    To help ensure the long-term viability of nuclear energy through a robust and sustained research and development effort, the U.S. Department of Energy (DOE) designated the Advanced Test Reactor and associated post-irradiation examination facilities a National Scientific User Facility (ATR NSUF), allowing broader access to nuclear energy researchers. The mission of the ATR NSUF is to provide access to world-class nuclear research facilities, thereby facilitating the advancement of nuclear science and technology. The ATR NSUF seeks to create an engaged academic and industrial user community that routinely conducts reactor-based research. Cost-free access to the ATR and PIE facilities is granted based on technical merit to U.S. university-led experiment teams conducting non-proprietary research. Proposals are selected via independent technical peer review and relevance to the DOE mission. Extensive publication of research results is expected as a condition for access. During FY 2008, the first full year of ATR NSUF operation, five university-led experiments were awarded access to the ATR and associated post-irradiation examination facilities. The ATR NSUF awarded four new experiments in early FY 2009, and anticipates awarding additional experiments in the fall of 2009 as a result of the second 2009 proposal call. As the ATR NSUF program matures over the next two years, the capability to perform irradiation research of increasing complexity will become available. These capabilities include instrumented irradiation experiments and post-irradiation examinations on materials previously irradiated in U.S. reactor material test programs. The ATR critical facility will also be made available to researchers. An important component of the ATR NSUF is an education program focused on the reactor-based tools available for resolving nuclear science and technology issues. The ATR NSUF provides education programs including a summer short course, internships, faculty-student team

  15. The Advanced Test Reactor National Scientific User Facility Advancing Nuclear Technology

    International Nuclear Information System (INIS)

    Allen, T.R.; Benson, J.B.; Foster, J.A.; Marshall, F.M.; Meyer, M.K.; Thelen, M.C.

    2009-01-01

    To help ensure the long-term viability of nuclear energy through a robust and sustained research and development effort, the U.S. Department of Energy (DOE) designated the Advanced Test Reactor and associated post-irradiation examination facilities a National Scientific User Facility (ATR NSUF), allowing broader access to nuclear energy researchers. The mission of the ATR NSUF is to provide access to world-class nuclear research facilities, thereby facilitating the advancement of nuclear science and technology. The ATR NSUF seeks to create an engaged academic and industrial user community that routinely conducts reactor-based research. Cost-free access to the ATR and PIE facilities is granted based on technical merit to U.S. university-led experiment teams conducting non-proprietary research. Proposals are selected via independent technical peer review and relevance to the DOE mission. Extensive publication of research results is expected as a condition for access. During FY 2008, the first full year of ATR NSUF operation, five university-led experiments were awarded access to the ATR and associated post-irradiation examination facilities. The ATR NSUF awarded four new experiments in early FY 2009, and anticipates awarding additional experiments in the fall of 2009 as a result of the second 2009 proposal call. As the ATR NSUF program matures over the next two years, the capability to perform irradiation research of increasing complexity will become available. These capabilities include instrumented irradiation experiments and post-irradiation examinations on materials previously irradiated in U.S. reactor material test programs. The ATR critical facility will also be made available to researchers. An important component of the ATR NSUF is an education program focused on the reactor-based tools available for resolving nuclear science and technology issues. The ATR NSUF provides education programs including a summer short course, internships, faculty-student team

  16. The re-emergence of hyphenated history-and-philosophy-of-science and the testing of theories of scientific change.

    Science.gov (United States)

    Laudan, Larry; Laudan, Rachel

    2016-10-01

    A basic premise of hyphenated history-and-philosophy-of-science is that theories of scientific change have to be based on empirical evidence derived from carefully constructed historical case studies. This paper analyses one such systematic attempt to test philosophical claims, describing its historical context, rationale, execution, and limited impact. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. A Test of the Circumvention-of-Limits Hypothesis in Scientific Problem Solving: The Case of Geological Bedrock Mapping

    Science.gov (United States)

    Hambrick, David Z.; Libarkin, Julie C.; Petcovic, Heather L.; Baker, Kathleen M.; Elkins, Joe; Callahan, Caitlin N.; Turner, Sheldon P.; Rench, Tara A.; LaDue, Nicole D.

    2012-01-01

    Sources of individual differences in scientific problem solving were investigated. Participants representing a wide range of experience in geology completed tests of visuospatial ability and geological knowledge, and performed a geological bedrock mapping task, in which they attempted to infer the geological structure of an area in the Tobacco…

  18. Accelerating Biomedical Discoveries through Rigor and Transparency.

    Science.gov (United States)

    Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D

    2017-07-01

    Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  19. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Science.gov (United States)

    Gallego, Sergi; Neipp, Cristian; Estepa, Luis A.; Ortuño, Manuel; Márquez, Andrés; Francés, Jorge; Pascual, Inmaculada; Beléndez, Augusto

    2012-01-01

    There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik's Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik's theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik's theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik's and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement between the predictions of CW and RCW and the validity of Kogelnik's theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.

  20. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Directory of Open Access Journals (Sweden)

    Augusto Beléndez

    2012-08-01

    Full Text Available There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik's Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik's theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik's theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik's and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement between the predictions of CW and RCW and the validity of Kogelnik's theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.
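
    For reference, the closed-form Kogelnik result that the two records above compare against the CW and RCW theories can be evaluated directly for a lossless, unslanted volume phase transmission grating replayed at the Bragg angle. The parameter values in the sketch below are hypothetical, photopolymer-like numbers, not values taken from the paper.

```python
# Illustrative only: Kogelnik's diffraction efficiency for a lossless volume phase
# transmission grating at Bragg incidence, eta = sin^2(pi * dn * d / (lambda * cos(theta))).
# The CW and RCW theories discussed in the paper are not sketched here.
import math


def kogelnik_efficiency(dn, thickness_um, wavelength_um, bragg_angle_rad):
    nu = math.pi * dn * thickness_um / (wavelength_um * math.cos(bragg_angle_rad))
    return math.sin(nu) ** 2


# Hypothetical values: index modulation 2e-3, 50 um layer, 633 nm replay wavelength,
# 15 degree internal Bragg angle.
print(kogelnik_efficiency(dn=2e-3, thickness_um=50.0, wavelength_um=0.633,
                          bragg_angle_rad=math.radians(15)))
```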

  1. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, which is a mathematical issue targeted at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
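
    As a minimal sketch of the solution-verification step mentioned above, Richardson extrapolation can be applied to a quantity computed on three uniformly refined grids to estimate the observed order of accuracy and an extrapolated value; the grid values below are hypothetical and the snippet is not taken from the GBS verification itself.

```python
# Minimal sketch of solution verification by Richardson extrapolation: given a quantity
# computed on three uniformly refined grids, estimate the observed order of accuracy
# and an extrapolated "grid-free" value.
import math


def richardson(f_coarse, f_medium, f_fine, r=2.0):
    """r is the constant grid refinement ratio between successive grids."""
    p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)  # observed order
    f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1)                          # extrapolated value
    return p, f_exact


# Hypothetical grid-converged results for some simulation output.
order, extrapolated = richardson(f_coarse=1.120, f_medium=1.031, f_fine=1.008)
print(f"observed order ~ {order:.2f}, extrapolated value ~ {extrapolated:.4f}")
```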

  2. Advanced Test Reactor National Scientific User Facility (ATR NSUF) Monthly Report November 2014

    Energy Technology Data Exchange (ETDEWEB)

    Soelberg, Renae [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-11-01

    Advanced Test Reactor National Scientific User Facility (ATR NSUF) Monthly Report November 2014 Highlights Rory Kennedy and Sarah Robertson attended the American Nuclear Society Winter Meeting and Nuclear Technology Expo in Anaheim, California, Nov. 10-13. ATR NSUF exhibited at the technology expo where hundreds of meeting participants had an opportunity to learn more about ATR NSUF. Dr. Kennedy briefed the Nuclear Engineering Department Heads Organization (NEDHO) on the workings of the ATR NSUF. • Rory Kennedy, James Cole and Dan Ogden participated in a reactor instrumentation discussion with Jean-Francois Villard and Christopher Destouches of CEA and several members of the INL staff. • ATR NSUF received approval from the NE-20 office to start planning the annual Users Meeting. The meeting will be held at INL, June 22-25. • Mike Worley, director of the Office of Innovative Nuclear Research (NE-42), visited INL Nov. 4-5. Milestones Completed • Recommendations for the Summer Rapid Turnaround Experiment awards were submitted to DOE-HQ Nov. 12 (Level 2 milestone due Nov. 30). Major Accomplishments/Activities • The University of California, Santa Barbara 2 experiment was unloaded from the GE-2000 at HFEF. The experiment specimen packs will be removed and shipped to ORNL for PIE. • The Terrani experiment, one of three FY 2014 new awards, was completed utilizing the Advanced Photon Source MRCAT beamline. The experiment investigated the chemical state of Ag and Pd in SiC shell of irradiated TRISO particles via X-ray Absorption Fine Structure (XAFS) spectroscopy. Upcoming Meetings/Events • The ATR NSUF program review meeting will be held Dec. 9-10 at L’Enfant Plaza. In addition to NSUF staff and users, NE-4, NE-5 and NE-7 representatives will attend the meeting. Awarded Research Projects Boise State University Rapid Turnaround Experiments (14-485 and 14-486) Nanoindentation and TEM work on the T91, HT9, HCM12A and 9Cr ODS specimens has been completed at

  3. Beyond the RCT: Integrating Rigor and Relevance to Evaluate the Outcomes of Domestic Violence Programs

    Science.gov (United States)

    Goodman, Lisa A.; Epstein, Deborah; Sullivan, Cris M.

    2018-01-01

    Programs for domestic violence (DV) victims and their families have grown exponentially over the last four decades. The evidence demonstrating the extent of their effectiveness, however, often has been criticized as stemming from studies lacking scientific rigor. A core reason for this critique is the widespread belief that credible evidence can…

  4. English Language Test for Scientific Staff at D.U.T.

    NARCIS (Netherlands)

    Klaassen, R.G.; Bos, M.H.P.C.; Roubos, Tim; Veronesi, Daniela; Nickenig, Christoph

    2009-01-01

    Delft University of Technology (DUT) screened its (non-native English-speaking) scientific staff on their level of language proficiency over the 2006/2007 academic year. In this paper the large-scale operation, involving planning, policy decisions, assessment means, advice and training, is discussed. Results

  5. Nine Criteria for a Measure of Scientific Output

    Science.gov (United States)

    Kreiman, Gabriel; Maunsell, John H. R.

    2011-01-01

    Scientific research produces new knowledge, technologies, and clinical treatments that can lead to enormous returns. Often, the path from basic research to new paradigms and direct impact on society takes time. Precise quantification of scientific output in the short term is not an easy task but is critical for evaluating scientists, laboratories, departments, and institutions. While there have been attempts to quantify scientific output, we argue that current methods are not ideal and suffer from solvable difficulties. Here we propose criteria that a metric should have to be considered a good index of scientific output. Specifically, we argue that such an index should be quantitative, based on robust data, rapidly updated and retrospective, presented with confidence intervals, normalized by number of contributors, career stage and discipline, impractical to manipulate, and focused on quality over quantity. Such an index should be validated through empirical testing. The purpose of quantitatively evaluating scientific output is not to replace careful, rigorous review by experts but rather to complement those efforts. Because it has the potential to greatly influence the efficiency of scientific research, we have a duty to reflect upon and implement novel and rigorous ways of evaluating scientific output. The criteria proposed here provide initial steps toward the systematic development and validation of a metric to evaluate scientific output. PMID:22102840

  6. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: this edition contains new material relevant

  7. Development of rigor mortis is not affected by muscle volume.

    Science.gov (United States)

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  8. New rigorous asymptotic theorems for inverse scattering amplitudes

    International Nuclear Information System (INIS)

    Lomsadze, Sh.Yu.; Lomsadze, Yu.M.

    1984-01-01

    The rigorous asymptotic theorems both of integral and local types obtained earlier and establishing logarithmic and in some cases even power correlations between the real and imaginary parts of scattering amplitudes Fsub(+-) are extended to the inverse amplitudes 1/Fsub(+-). One also succeeds in establishing power correlations of a new type between the real and imaginary parts, both for the amplitudes themselves and for the inverse ones. All the obtained assertions are convenient to be tested in high energy experiments when the amplitudes show asymptotic behaviour

  9. Fast and Rigorous Assignment Algorithm Multiple Preference and Calculation

    Directory of Open Access Journals (Sweden)

    Ümit Çiftçi

    2010-03-01

    Full Text Available The goal of this paper is to develop an algorithm that evaluates students and then places them according to their desired choices and dependent preferences. The developed algorithm is also used to implement software. The success and accuracy of the software, as well as of the algorithm, are tested by applying it to the ability test at Beykent University. This ability test is repeated several times in order to fill all available places at the Fine Art Faculty departments in every academic year. It has been shown that this algorithm is very fast and rigorous after application in the 2008-2009 and 2009-2010 academic years. Key Words: Assignment algorithm, student placement, ability test
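
    The abstract gives few algorithmic details; purely as a rough illustration of the kind of score-ordered, preference-respecting placement it describes, the following minimal Python sketch assigns applicants to departments by descending test score while honouring each applicant's ordered choices and department capacities. All names, data and the scoring rule are hypothetical and not taken from the paper.

        # Minimal sketch of score-ordered, preference-respecting placement.
        # All data and rules are hypothetical; the paper's actual algorithm
        # (with dependent preferences) is not described in the abstract.

        def assign(applicants, capacities):
            """applicants: list of (name, score, ordered list of dept choices);
            capacities: dict mapping dept -> number of available places."""
            placements = {}
            remaining = dict(capacities)
            # Higher-scoring applicants are placed first.
            for name, score, choices in sorted(applicants, key=lambda a: -a[1]):
                for dept in choices:
                    if remaining.get(dept, 0) > 0:
                        placements[name] = dept
                        remaining[dept] -= 1
                        break
            return placements

        applicants = [
            ("A", 88.5, ["Painting", "Sculpture"]),
            ("B", 92.0, ["Painting", "Graphics"]),
            ("C", 75.0, ["Sculpture", "Painting"]),
        ]
        # B takes Painting, A takes Sculpture, C remains unplaced.
        print(assign(applicants, {"Painting": 1, "Sculpture": 1, "Graphics": 1}))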

  10. 76 FR 77833 - Scientific Information Request on CYP2C19 Variants and Platelet Reactivity Tests

    Science.gov (United States)

    2011-12-14

    ... program. AHRQ is not requesting and will not consider marketing material, health economics information, or... of alternative test-and-treat strategies (including a no-testing strategy) for therapeutic decision...? a. What is the comparative effectiveness of the following testing strategies on therapeutic decision...

  11. Rigor in Qualitative Supply Chain Management Research

    DEFF Research Database (Denmark)

    Goffin, Keith; Raja, Jawwad; Claes, Björn

    2012-01-01

    Purpose – The purpose of this paper is to share the authors' experiences of using the repertory grid technique in two supply chain management studies. The paper aims to demonstrate how the two studies provided insights into how qualitative techniques such as the repertory grid can be made more rigorous than in the past, and how results can be generated that are inaccessible using quantitative methods. Design/methodology/approach – This paper presents two studies undertaken using the repertory grid technique to illustrate its application in supply chain management research. Findings – The paper ... reliability, and theoretical saturation. Originality/value – It is the authors' contention that the addition of the repertory grid technique to the toolset of methods used by logistics and supply chain management researchers can only enhance insights and the building of robust theories. Qualitative studies ...

  12. Statistics for mathematicians a rigorous first course

    CERN Document Server

    Panaretos, Victor M

    2016-01-01

    This textbook provides a coherent introduction to the main concepts and methods of one-parameter statistical inference. Intended for students of Mathematics taking their first course in Statistics, the focus is on Statistics for Mathematicians rather than on Mathematical Statistics. The goal is not to focus on the mathematical/theoretical aspects of the subject, but rather to provide an introduction to the subject tailored to the mindset and tastes of Mathematics students, who are sometimes turned off by the informal nature of Statistics courses. This book can be used as the basis for an elementary semester-long first course on Statistics with a firm sense of direction that does not sacrifice rigor. The deeper goal of the text is to attract the attention of promising Mathematics students.

  13. A Delphi Technology Foresight Study: Mapping Social Construction of Scientific Evidence on Metagenomics Tests for Water Safety.

    Directory of Open Access Journals (Sweden)

    Stanislav Birko

    Full Text Available Access to clean water is a grand challenge in the 21st century. Water safety testing for pathogens currently depends on surrogate measures such as fecal indicator bacteria (e.g., E. coli). Metagenomics concerns high-throughput, culture-independent, unbiased shotgun sequencing of DNA from environmental samples that might transform water safety by detecting waterborne pathogens directly instead of their surrogates. Yet emerging innovations such as metagenomics are often fiercely contested. Innovations are subject to shaping/construction not only by technology but also by the social systems/values in which they are embedded, such as experts' attitudes towards new scientific evidence. We conducted a classic three-round Delphi survey, comprised of 107 questions. A multidisciplinary expert panel (n = 24) representing the continuum of discovery scientists and policymakers evaluated the emergence of metagenomics tests. To the best of our knowledge, we report here the first Delphi foresight study of experts' attitudes on (1) the top 10 priority evidentiary criteria for adoption of metagenomics tests for water safety, (2) the specific issues critical to governance of the metagenomics innovation trajectory where there is consensus or dissensus among experts, (3) the anticipated time lapse from discovery to practice of metagenomics tests, and (4) the role and timing of public engagement in development of metagenomics tests. The ability of a test to distinguish between harmful and benign waterborne organisms, analytical/clinical sensitivity, and reproducibility were the top three evidentiary criteria for adoption of metagenomics. Experts agree that metagenomic testing will provide novel information, but there is dissensus on whether metagenomics will replace the current water safety testing methods or impact the public health end points (e.g., reduction in boil water advisories). Interestingly, experts view the publics as relevant in a "downstream capacity" for adoption of

  14. Illusions of scientific legitimacy: misrepresented science in the direct-to-consumer genetic-testing marketplace.

    Science.gov (United States)

    Vashlishan Murray, Amy B; Carson, Michael J; Morris, Corey A; Beckwith, Jon

    2010-11-01

    Marketers of genetic tests often openly or implicitly misrepresent the utility of genetic information. Scientists who are well aware of the current limitations to the utility of such tests are best placed to publicly counter misrepresentations of the science. Copyright © 2010 Elsevier Ltd. All rights reserved.

  15. WASP (Write a Scientific Paper) using Excel - 8: t-Tests.

    Science.gov (United States)

    Grech, Victor

    2018-06-01

    t-Testing is a common component of inferential statistics when comparing two means. This paper explains the central limit theorem and the concept of the null hypothesis, as well as types of errors. On the practical side, this paper outlines how different t-tests may be performed in Microsoft Excel, for different purposes, both statically and dynamically, with Excel's functions. Copyright © 2018 Elsevier B.V. All rights reserved.
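
    The record performs the tests with Excel's built-in functions; as a language-neutral illustration of the same two-sample comparisons, here is a minimal Python sketch using scipy.stats. The sample data are invented and stand in for any two groups of measurements.

        # Two-sample t-tests on invented data, mirroring the comparisons the
        # record performs in Excel (equal-variance, Welch's, and paired).
        from scipy import stats

        group_a = [5.1, 4.9, 5.4, 5.0, 5.2, 4.8]
        group_b = [5.6, 5.8, 5.5, 5.9, 5.7, 5.4]

        t_eq, p_eq = stats.ttest_ind(group_a, group_b)                   # assumes equal variances
        t_w,  p_w  = stats.ttest_ind(group_a, group_b, equal_var=False)  # Welch's t-test
        t_p,  p_p  = stats.ttest_rel(group_a, group_b)                   # paired samples

        print(f"equal-variance: t={t_eq:.2f}, p={p_eq:.4f}")
        print(f"Welch:          t={t_w:.2f}, p={p_w:.4f}")
        print(f"paired:         t={t_p:.2f}, p={p_p:.4f}")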

  16. Rigorous theory of molecular orientational nonlinear optics

    International Nuclear Information System (INIS)

    Kwak, Chong Hoon; Kim, Gun Yeup

    2015-01-01

    Classical statistical mechanics of the molecular optics theory proposed by Buckingham [A. D. Buckingham and J. A. Pople, Proc. Phys. Soc. A 68, 905 (1955)] has been extended to describe the field induced molecular orientational polarization effects on nonlinear optics. In this paper, we present the generalized molecular orientational nonlinear optical processes (MONLO) through the calculation of the classical orientational averaging using the Boltzmann type time-averaged orientational interaction energy in the randomly oriented molecular system under the influence of applied electric fields. The focal points of the calculation are (1) the derivation of rigorous tensorial components of the effective molecular hyperpolarizabilities, (2) the molecular orientational polarizations and the electronic polarizations including the well-known third-order dc polarization, dc electric field induced Kerr effect (dc Kerr effect), optical Kerr effect (OKE), dc electric field induced second harmonic generation (EFISH), degenerate four wave mixing (DFWM) and third harmonic generation (THG). We also present some of the new predictive MONLO processes. For second-order MONLO, second-order optical rectification (SOR), Pockels effect and difference frequency generation (DFG) are described in terms of the anisotropic coefficients of first hyperpolarizability. And, for third-order MONLO, third-order optical rectification (TOR), dc electric field induced difference frequency generation (EFIDFG) and pump-probe transmission are presented

  17. Abrasion Testing of Products Containing Nanomaterials, SOP-R-2: Scientific Operating Procedure Series: Release (R)

    Science.gov (United States)

    2016-04-01

    The record text consists of report cover and glossary fragments: ISO nanotechnology terminology and definitions for nano-objects (nanoparticle, nanofibre and nanoplate); "abrasion" defined as wearing away; and the report identifier ERDC SR-16-2, Environmental Consequences of Nanotechnologies: Abrasion Testing of Products Containing Nanomaterials, SOP-R-2, April 2016, available via the ERDC online library at http://acwc.sdp.sirsi.net/client/default.

  18. Testing Foreign Language Impact on Engineering Students' Scientific Problem-Solving Performance

    Science.gov (United States)

    Tatzl, Dietmar; Messnarz, Bernd

    2013-01-01

    This article investigates the influence of English as the examination language on the solution of physics and science problems by non-native speakers in tertiary engineering education. For that purpose, a statistically significant total number of 96 students in four year groups from freshman to senior level participated in a testing experiment in…

  19. The fish embryo toxicity test as an animal alternative method in hazard and risk assessment and scientific research

    International Nuclear Information System (INIS)

    Embry, Michelle R.; Belanger, Scott E.; Braunbeck, Thomas A.; Galay-Burgos, Malyka; Halder, Marlies; Hinton, David E.; Leonard, Marc A.; Lillicrap, Adam; Norberg-King, Teresa; Whale, Graham

    2010-01-01

    Animal alternatives research has historically focused on human safety assessments and has only recently been extended to environmental testing. This is particularly true for those assays that involve the use of fish. A number of alternatives are being pursued by the scientific community including the fish embryo toxicity (FET) test, a proposed replacement alternative to the acute fish test. Discussion of the FET methodology and its application in environmental assessments on a global level was needed. With this emerging issue in mind, the ILSI Health and Environmental Sciences Institute (HESI) and the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) held an International Workshop on the Application of the Fish Embryo Test as an Animal Alternative Method in Hazard and Risk Assessment and Scientific Research in March 2008. The workshop included approximately 40 scientists and regulators representing government, industry, academia, and non-governmental organizations from North America, Europe, and Asia. The goal was to review the state of the science regarding the investigation of fish embryonic tests, pain and distress in fish, emerging approaches utilizing fish embryos, and the use of fish embryo toxicity test data in various types of environmental assessments (e.g., hazard, risk, effluent, and classification and labeling of chemicals). Some specific key outcomes included agreement that risk assessors need fish data for decision-making, that extending the FET to include eleutheroembryos was desirable, that relevant endpoints are being used, and that additional endpoints could facilitate additional uses beyond acute toxicity testing. The FET was, however, not yet considered validated sensu OECD. An important action step will be to provide guidance on how all fish tests can be used to assess chemical hazard and to harmonize the diverse terminology used in test guidelines adopted over the past decades. Use of the FET in the context of effluent assessments

  20. The fish embryo toxicity test as an animal alternative method in hazard and risk assessment and scientific research

    Energy Technology Data Exchange (ETDEWEB)

    Embry, Michelle R., E-mail: membry@ilsi.org [ILSI Health and Environmental Sciences Institute, 1156 15th Street, NW, Suite 200, Washington, DC 20005 (United States); Belanger, Scott E., E-mail: belanger.se@pg.com [Procter and Gamble, Central Product Safety, PO Box 538707, Miami Valley Innovation Center, Cincinnati, OH 45253-8707 (United States); Braunbeck, Thomas A., E-mail: braunbeck@zoo.uni-heidelberg.de [University of Heidelberg, Im Neuenheimer Feld 230, Heidelberg D -69120 (Germany); Galay-Burgos, Malyka, E-mail: malyka.galay-burgos@ecetoc.org [European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC), 4 Avenue E. Van Nieuwenhuyse B-1160, Brussels (Belgium); Halder, Marlies, E-mail: marlies.halder@jrc.ec.europa.eu [European Commission, Joint Research Centre, Institute for Health and Consumer Protection, In-Vitro Methods Unit TP-580 Ispra 21027 (Italy); Hinton, David E., E-mail: dhinton@duke.edu [Duke University, Nicholas School of the Environment, PO Box 90328, Durham, NC 27708, Unites States (United States); Leonard, Marc A., E-mail: mleonard@rd.loreal.com [L' Oreal Recherche Avancee, Unite d' Ecotoxicologie, 1 av. E. Schueller, 93601 Aulnay sous bois (France); Lillicrap, Adam, E-mail: Adam.lillicrap@niva.no [AstraZeneca, Freshwater Quarry, Brixham TQ5 8BA (United Kingdom); Norberg-King, Teresa, E-mail: norberg-king.teresa@epa.gov [U.S. EPA, Mid-Continent Ecology Division, 6201 Congdon Boulevard, Duluth, MN 55804-1636 (United States); Whale, Graham, E-mail: graham.whale@shell.com [Shell Global Solutions, Analytical Technology, P.O. Box 1, Chester CH1 3SH (United Kingdom)

    2010-04-15

    Animal alternatives research has historically focused on human safety assessments and has only recently been extended to environmental testing. This is particularly true for those assays that involve the use of fish. A number of alternatives are being pursued by the scientific community including the fish embryo toxicity (FET) test, a proposed replacement alternative to the acute fish test. Discussion of the FET methodology and its application in environmental assessments on a global level was needed. With this emerging issue in mind, the ILSI Health and Environmental Sciences Institute (HESI) and the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) held an International Workshop on the Application of the Fish Embryo Test as an Animal Alternative Method in Hazard and Risk Assessment and Scientific Research in March 2008. The workshop included approximately 40 scientists and regulators representing government, industry, academia, and non-governmental organizations from North America, Europe, and Asia. The goal was to review the state of the science regarding the investigation of fish embryonic tests, pain and distress in fish, emerging approaches utilizing fish embryos, and the use of fish embryo toxicity test data in various types of environmental assessments (e.g., hazard, risk, effluent, and classification and labeling of chemicals). Some specific key outcomes included agreement that risk assessors need fish data for decision-making, that extending the FET to include eleutheroembryos was desirable, that relevant endpoints are being used, and that additional endpoints could facilitate additional uses beyond acute toxicity testing. The FET was, however, not yet considered validated sensu OECD. An important action step will be to provide guidance on how all fish tests can be used to assess chemical hazard and to harmonize the diverse terminology used in test guidelines adopted over the past decades. Use of the FET in the context of effluent assessments

  1. Crick's gossip test and Watson's boredom principle: A pseudo-mathematical analysis of effort in scientific research.

    Science.gov (United States)

    Charlton, Bruce G

    2008-01-01

    Crick and Watson gave complementary advice to the aspiring scientist based on the insight that to do your best work you need to make your greatest possible effort. Crick made the positive suggestion to work on the subject which most deeply interests you, the thing about which you spontaneously gossip - Crick termed this 'the gossip test'. Watson made the negative suggestion of avoiding topics and activities that bore you - which I have termed 'the boredom principle'. This is good advice because science is tough and the easy things have already been done. Solving the harder problems that remain requires a lot of effort. But in modern biomedical science individual effort does not necessarily correlate with career success as measured by salary, status, job security, etc. This is because Crick and Watson are talking about revolutionary science - using Thomas Kuhn's distinction between paradigm-shifting 'revolutionary' science and incremental 'normal' science. There are two main problems with pursuing a career in revolutionary science. The first is that revolutionary science is intrinsically riskier than normal science, the second that even revolutionary success in a scientific backwater may be less career-enhancing than mundane work in a trendy field. So, if you pick your scientific problem using the gossip test and the boredom principle, you might also be committing career suicide. This may explain why so few people follow Crick and Watson's advice. The best hope for future biomedical science is that it will evolve towards a greater convergence between individual effort and career success.

  2. Scientific and technological basis for maintenance optimization, planning, testing and monitoring for NPP with WWER

    International Nuclear Information System (INIS)

    Kovrizhkin, Yu.L.; Skalozubov, V.I.; Kochneva, V.Yu.

    2009-01-01

    The main results of developments aimed at increasing the production efficiency of NPPs with WWER reactors through optimization of maintenance planning, testing and monitoring of equipment and systems are presented. Attention is paid to metal inspection during the maintenance period of the power unit. Methods for realizing the concept of transition to repair according to technical condition (condition-based maintenance) are presented

  3. Accuracy of the Thermo Fisher Scientific (Sensititre™) dry-form broth microdilution MIC product when testing ceftaroline.

    Science.gov (United States)

    Jones, Ronald N; Holliday, Nicole M; Critchley, Ian A

    2015-04-01

    Ceftaroline, the active metabolite of the ceftaroline fosamil pro-drug, was the first advanced-spectrum cephalosporin with potent activity against methicillin-resistant Staphylococcus aureus to be approved by the US Food and Drug Administration for acute bacterial skin and skin structure infections. After 4 years of clinical use, few ceftaroline commercial susceptibility testing devices other than agar diffusion methods (disks and stable gradient) are available. Here, we validate a broth microdilution product (Sensititre™; Thermo Fisher Scientific, Cleveland, OH, USA) that achieved 99.2% essential agreement (manual and automated reading) and 95.3-100.0% categorical agreement, with high reproducibility (98.0-100.0%). Sensititre™ MIC values for ceftaroline, however, were slightly skewed toward an elevated value (0.5 × log2 dilution step), greatest when testing for streptococci and Enterobacteriaceae. Copyright © 2015 Elsevier Inc. All rights reserved.
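
    For readers unfamiliar with the agreement statistics quoted above, the sketch below shows one common way of computing essential agreement (device MIC within one doubling dilution of the reference MIC) and categorical agreement (same susceptible/intermediate/resistant interpretation). The MIC values and breakpoints used here are invented for illustration, not those of the study.

        # Hypothetical paired MIC readings (mg/L) from a reference method and a test device.
        import math

        reference = [0.25, 0.5, 0.5, 1.0, 2.0, 0.25]
        device    = [0.25, 1.0, 0.5, 0.5, 2.0, 0.5]

        def category(mic, s_break=0.5, r_break=2.0):
            # Invented breakpoints: <= s_break susceptible, >= r_break resistant.
            if mic <= s_break:
                return "S"
            if mic >= r_break:
                return "R"
            return "I"

        # Essential agreement: device MIC within +/- one two-fold dilution of the reference.
        ea = [abs(math.log2(d) - math.log2(r)) <= 1 for d, r in zip(device, reference)]
        # Categorical agreement: same S/I/R interpretation.
        ca = [category(d) == category(r) for d, r in zip(device, reference)]

        print(f"essential agreement:   {100 * sum(ea) / len(ea):.1f}%")
        print(f"categorical agreement: {100 * sum(ca) / len(ca):.1f}%")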

  4. Testing Hypotheses on Risk Factors for Scientific Misconduct via Matched-Control Analysis of Papers Containing Problematic Image Duplications.

    Science.gov (United States)

    Fanelli, Daniele; Costas, Rodrigo; Fang, Ferric C; Casadevall, Arturo; Bik, Elisabeth M

    2018-02-19

    It is commonly hypothesized that scientists are more likely to engage in data falsification and fabrication when they are subject to pressures to publish, when they are not restrained by forms of social control, when they work in countries lacking policies to tackle scientific misconduct, and when they are male. Evidence to test these hypotheses, however, is inconclusive due to the difficulties of obtaining unbiased data. Here we report a pre-registered test of these four hypotheses, conducted on papers that were identified in a previous study as containing problematic image duplications through a systematic screening of the journal PLoS ONE. Image duplications were classified into three categories based on their complexity, with category 1 being most likely to reflect unintentional error and category 3 being most likely to reflect intentional fabrication. We tested multiple parameters connected to the hypotheses above with a matched-control paradigm, by collecting two controls for each paper containing duplications. Category 1 duplications were mostly not associated with any of the parameters tested, as was predicted based on the assumption that these duplications were mostly not due to misconduct. Categories 2 and 3, however, exhibited numerous statistically significant associations. Results of univariable and multivariable analyses support the hypotheses that academic culture, peer control, cash-based publication incentives and national misconduct policies might affect scientific integrity. No clear support was found for the "pressures to publish" hypothesis. Female authors were found to be equally likely to publish duplicated images compared to males. Country-level parameters generally exhibited stronger effects than individual-level parameters, because developing countries were significantly more likely to produce problematic image duplications. This suggests that promoting good research practices in all countries should be a priority for the international

  5. Rigorous derivation of porous-media phase-field equations

    Science.gov (United States)

    Schmuck, Markus; Kalliadasis, Serafim

    2017-11-01

    The evolution of interfaces in Complex heterogeneous Multiphase Systems (CheMSs) plays a fundamental role in a wide range of scientific fields such as thermodynamic modelling of phase transitions and materials science, and as a computational tool for interfacial flow studies or material design. Here, we focus on phase-field equations in CheMSs such as porous media. To the best of our knowledge, we present the first rigorous derivation of error estimates for fourth-order, upscaled, and nonlinear evolution equations. For CheMSs with heterogeneity ε, we obtain the convergence rate ε^(1/4), which governs the error between the solution of the new upscaled formulation and the solution of the microscopic phase-field problem. This error behaviour has recently been validated computationally. Due to the wide range of application of phase-field equations, we expect this upscaled formulation to allow for new modelling, analytic, and computational perspectives for interfacial transport and phase transformations in CheMSs. This work was supported by EPSRC, UK, through Grant Nos. EP/H034587/1, EP/L027186/1, EP/L025159/1, EP/L020564/1, EP/K008595/1, and EP/P011713/1 and from ERC via Advanced Grant No. 247031.
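
    Schematically, and only as a reader's aid (the precise norm and constants are not given in this record), an error estimate of the kind described takes the form

        \| u_\varepsilon - u_{\mathrm{upscaled}} \| \le C \, \varepsilon^{1/4},

    where u_\varepsilon denotes the solution of the microscopic phase-field problem, u_{\mathrm{upscaled}} the solution of the new upscaled formulation, and \varepsilon the heterogeneity of the porous medium.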

  6. KNODWAT: a scientific framework application for testing knowledge discovery methods for the biomedical domain.

    Science.gov (United States)

    Holzinger, Andreas; Zupan, Mario

    2013-06-13

    Professionals in the biomedical domain are confronted with an increasing mass of data. Developing methods to assist professional end users in the field of Knowledge Discovery to identify, extract, visualize and understand useful information from these huge amounts of data is a major challenge. However, there are so many diverse methods and methodologies available that, for biomedical researchers who are inexperienced in the use of even relatively popular knowledge discovery methods, it can be very difficult to select the most appropriate method for their particular research problem. A web application called KNODWAT (KNOwledge Discovery With Advanced Techniques) has been developed using Java on the Spring framework 3.1 and following a user-centered approach. The software runs on Java 1.6 and above and requires a web server such as Apache Tomcat and a database server such as the MySQL Server. For frontend functionality and styling, Twitter Bootstrap was used, as well as jQuery for interactive user interface operations. The framework presented is user-centric, highly extensible and flexible. Since it enables methods to be tested on existing data to assess their suitability and performance, it is especially suitable for inexperienced biomedical researchers who are new to the field of knowledge discovery and data mining. For testing purposes, two algorithms, CART and C4.5, were implemented using the WEKA data mining framework.
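
    KNODWAT itself is a Java/Spring web application wrapping WEKA implementations of CART and C4.5. Purely as an illustration of the "test a method on existing data" step it supports, here is a minimal decision-tree sketch using scikit-learn in Python; this is a stand-in, not the KNODWAT or WEKA API.

        # Stand-in illustration: train and test a CART-style decision tree on a
        # bundled dataset, analogous to the method-testing step KNODWAT offers
        # for WEKA's CART and C4.5 implementations.
        from sklearn.datasets import load_breast_cancer
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        X, y = load_breast_cancer(return_X_y=True)
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.3, random_state=0)

        tree = DecisionTreeClassifier(max_depth=4, random_state=0)  # CART-style classifier
        tree.fit(X_train, y_train)
        print(f"held-out accuracy: {tree.score(X_test, y_test):.3f}")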

  7. Student’s rigorous mathematical thinking based on cognitive style

    Science.gov (United States)

    Fitriyani, H.; Khasanah, U.

    2017-12-01

    The purpose of this research was to determine the rigorous mathematical thinking (RMT) of mathematics education students in solving math problems in terms of reflective and impulsive cognitive styles. The research used a descriptive qualitative approach. Subjects in this research were 4 students of the reflective and impulsive cognitive styles, each style represented by a male and a female subject. Data collection techniques used a problem-solving test and interviews. Analysis of the research data used the Miles and Huberman model, namely data reduction, data presentation, and conclusion drawing. The results showed that the impulsive male subject used all three levels of the cognitive functions required for RMT, namely qualitative thinking, quantitative thinking with precision, and relational thinking, while the other three subjects were only able to use cognitive functions at the qualitative thinking level of RMT. Therefore, the impulsive male subject has a better RMT ability than the other three research subjects.

  8. Rigor or mortis: best practices for preclinical research in neuroscience.

    Science.gov (United States)

    Steward, Oswald; Balice-Gordon, Rita

    2014-11-05

    Numerous recent reports document a lack of reproducibility of preclinical studies, raising concerns about potential lack of rigor. Examples of lack of rigor have been extensively documented and proposals for practices to improve rigor are appearing. Here, we discuss some of the details and implications of previously proposed best practices and consider some new ones, focusing on preclinical studies relevant to human neurological and psychiatric disorders. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. [Experimental study of restiffening of the rigor mortis].

    Science.gov (United States)

    Wang, X; Li, M; Liao, Z G; Yi, X F; Peng, X M

    2001-11-01

    To observe changes in the sarcomere length of rats during restiffening, we measured the sarcomere length of the quadriceps in 40 rats under different conditions by scanning electron microscopy. The sarcomere length in undisturbed rigor mortis is clearly shorter than that after restiffening. The sarcomere length is negatively correlated with the intensity of rigor mortis. Measuring sarcomere length can determine the intensity of rigor mortis and provide evidence for estimating the time since death.

  10. New Sensors for In-Pile Temperature Detection at the Advanced Test Reactor National Scientific User Facility

    International Nuclear Information System (INIS)

    Rempe, J.L.; Knudson, D.L.; Daw, J.E.; Condie, K.G.; Wilkins, S. Curtis

    2009-01-01

    The Department of Energy (DOE) designated the Advanced Test Reactor (ATR) as a National Scientific User Facility (NSUF) in April 2007 to support U.S. leadership in nuclear science and technology. As a user facility, the ATR is supporting new users from universities, laboratories, and industry, as they conduct basic and applied nuclear research and development to advance the nation's energy security needs. A key component of the ATR NSUF effort is to develop and evaluate new in-pile instrumentation techniques that are capable of providing measurements of key parameters during irradiation. This paper describes the strategy for determining what instrumentation is needed and the program for developing new or enhanced sensors that can address these needs. Accomplishments from this program are illustrated by describing new sensors now available and under development for in-pile detection of temperature at various irradiation locations in the ATR.

  11. Learning from Science and Sport - How we, Safety, "Engage with Rigor"

    Science.gov (United States)

    Herd, A.

    2012-01-01

    As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should then try to remind ourselves to be open to the possibility that data, knowledge or experience from outside of the spaceflight community may provide some constructive alternate perspectives. This paper will assess aspects from two seemingly tangential fields, science and sport, and align these with the world of safety. In doing so some useful insights will be given to the challenges we face and may provide solutions relevant in our everyday (of safety engineering). Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two (opposing) teams. Professional, accurately timed and positioned interaction for a desired outcome. These interactions, whilst an essential part of the game, are however not without their constraints. The rugby scrum has constraints as to the formation and engagement of the two teams. The controlled engagement provides for an interaction between the two teams in a safe manner. The constraints arising from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in academic papers presented for publication are valid, legitimate and credible. The scientific goal of the need for rigor may be expressed in the example of achieving a statistically relevant sample size, n, in order to assure analysis validity of the data pool. A failure to apply rigor could then place the entire study at risk of failing to have the respective paper published. This paper will consider the merits of these two different aspects, scientific rigor and sports engagement, and offer a reflective look at how this may provide a "modus operandi" for safety engineers at any level whether at their desks (creating or reviewing safety assessments) or in a

  12. Rigorous solution to Bargmann-Wigner equation for integer spin

    CERN Document Server

    Huang Shi Zhong; Wu Ning; Zheng Zhi Peng

    2002-01-01

    A rigorous method is developed to solve the Bargmann-Wigner equation for arbitrary integer spin in coordinate representation in a step-by-step way. The Bargmann-Wigner equation is first transformed to a form easier to solve, the new equations are then solved rigorously in coordinate representation, and the wave functions in a closed form are thus derived

  13. Using grounded theory as a method for rigorously reviewing literature

    NARCIS (Netherlands)

    Wolfswinkel, J.; Furtmueller-Ettinger, Elfriede; Wilderom, Celeste P.M.

    2013-01-01

    This paper offers guidance to conducting a rigorous literature review. We present this in the form of a five-stage process in which we use Grounded Theory as a method. We first probe the guidelines explicated by Webster and Watson, and then we show the added value of Grounded Theory for rigorously

  14. Evaluating Rigor in Qualitative Methodology and Research Dissemination

    Science.gov (United States)

    Trainor, Audrey A.; Graue, Elizabeth

    2014-01-01

    Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…

  15. Promoting Rigorous Validation Practice: An Applied Perspective

    Science.gov (United States)

    Mattern, Krista D.; Kobrin, Jennifer L.; Camara, Wayne J.

    2012-01-01

    As researchers at a testing organization concerned with the appropriate uses and validity evidence for our assessments, we provide an applied perspective related to the issues raised in the focus article. Newton's proposal for elaborating the consensus definition of validity is offered with the intention to reduce the risks of inadequate…

  16. The Revista Scientific

    Directory of Open Access Journals (Sweden)

    Oscar Antonio Martínez Molina

    2017-02-01

    Full Text Available The Revista Scientific aims to publish quality papers that include the perspective of analysis in educational settings. Together with www.indtec.com.ve, this electronic publication aims to promote and disseminate, with seriousness and rigor, the academic production in this field. Editorial of the new stage: Revista Scientific was created with the aim of constituting a reference space for scientific research in the field of research analysis carried out within the universities of Latin America, once the distribution list hosted on the INDTEC platform (http://www.indtec.com.ve) was consolidated as a space for dissemination and development of new ideas and initiatives. The first presentation of INDTEC Magazine was held in August 2016 in Venezuela. Thanks to the support of the INDTEC platform, SCIENTIFIC Magazine has been able to develop from the cooperative work of the people who make up its Editorial Committee, Academic Committee and Scientific Committee in Electronic Edition, and of the referees of each of the issues. Part of the success is due to the motivation of its co-editors, excellent professionals from different parts of the world (Argentina, Belgium, Colombia, Cuba, Ecuador, Spain, Mexico and Venezuela) who form the various committees and participate in this project (whose organizational structure is presented in this edition) with enthusiasm and joy, and the project continues in crescendo. Also, the strategy of editing a monographic issue from the various events organized within the universities has helped make SCIENTIFIC a valuable voice of intellectual progress in the field of education. SCIENTIFIC Magazine is currently indexed in ISI, International Scientific Indexing, Dubai - UAE; ROAD, the Directory of Open Access Scholarly Resources (ISSN International Center, France); REVENCYT-ULA, Venezuela; and Google Scholar (international index); and is published on Calaméo; ISSUU; Academia

  17. Experimental evaluation of rigor mortis. VI. Effect of various causes of death on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R

    1983-07-01

    The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to verify the eventual presence of factors that could play a role in the modification of its development.

  18. Experimental evaluation of rigor mortis. VII. Effect of ante- and post-mortem electrocution on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C

    1988-01-01

    The influence of electrocution on the evolution of rigor mortis was studied on rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, a complete rigor develops already 1 h post-mortem (p.m.) compared to 5 h p.m. for the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In case of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower as compared with the ante-mortem electrocution cases. The results are completed by two practical observations on human electrocution cases.

  19. Monitoring muscle optical scattering properties during rigor mortis

    Science.gov (United States)

    Xia, J.; Ranasinghesagara, J.; Ku, C. W.; Yao, G.

    2007-09-01

    Sarcomere is the fundamental functional unit in skeletal muscle for force generation. In addition, sarcomere structure is also an important factor that affects the eating quality of muscle food, the meat. The sarcomere structure is altered significantly during rigor mortis, which is the critical stage involved in transforming muscle to meat. In this paper, we investigated optical scattering changes during the rigor process in Sternomandibularis muscles. The measured optical scattering parameters were analyzed along with the simultaneously measured passive tension, pH value, and histology analysis. We found that the temporal changes of optical scattering, passive tension, pH value and fiber microstructures were closely correlated during the rigor process. These results suggested that sarcomere structure changes during rigor mortis can be monitored and characterized by optical scattering, which may find practical applications in predicting meat quality.

  20. Recent Development in Rigorous Computational Methods in Dynamical Systems

    OpenAIRE

    Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł

    2009-01-01

    We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...

  1. RIGOROUS PHOTOGRAMMETRIC PROCESSING OF CHANG'E-1 AND CHANG'E-2 STEREO IMAGERY FOR LUNAR TOPOGRAPHIC MAPPING

    OpenAIRE

    K. Di; Y. Liu; B. Liu; M. Peng

    2012-01-01

    Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of the landing and surface operation of the Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of the CE-1 and CE-2 CCD cameras based on the push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D c...

  2. The 10 basic requirements for a scientific paper reporting antioxidant, antimutagenic or anticarcinogenic potential of test substances in in vitro experiments and animal studies in vivo

    DEFF Research Database (Denmark)

    Verhagen, H.; Aruoma, O.I.; van Delft, J.H.M.

    2003-01-01

    There is increasing evidence that chemicals/test substances can not only have adverse effects, but that there are many substances that can (also) have a beneficial effect on health. As this journal regularly publishes papers in this area and has every intention of continuing to do so in the near ..., provided they can be justified on scientific grounds. The 10 basic requirements for a scientific paper reporting antioxidant, antimutagenic or anticarcinogenic potential of test substances in in vitro experiments and animal studies in vivo concern the following areas: (1) Hypothesis-driven study design; (2) The nature of the test substance; (3) Valid and invalid test systems; (4) The selection of dose levels and gender; (5) Reversal of the effects induced by oxidants, carcinogens and mutagens; (6) Route of administration; (7) Number and validity of test variables; (8) Repeatability and reproducibility; (9...

  3. Tenderness of pre- and post rigor lamb longissimus muscle.

    Science.gov (United States)

    Geesink, Geert; Sujang, Sadi; Koohmaraie, Mohammad

    2011-08-01

    Lamb longissimus muscle (n=6) sections were cooked at different times post mortem (prerigor, at rigor, 1 day p.m., and 7 days p.m.) using two cooking methods. Using a boiling water bath, samples were either cooked to a core temperature of 70 °C or boiled for 3 h. The latter method was meant to reflect the traditional cooking method employed in countries where preparation of prerigor meat is practiced. The time postmortem at which the meat was prepared had a large effect on the tenderness (shear force) of the meat (P < ...). Cooking prerigor and at rigor meat to 70 °C resulted in higher shear force values than their post rigor counterparts at 1 and 7 days p.m. (9.4 and 9.6 vs. 7.2 and 3.7 kg, respectively). The differences in tenderness between the treatment groups could be largely explained by a difference in the contraction status of the meat after cooking and the effect of ageing on tenderness. Cooking pre and at rigor meat resulted in severe muscle contraction, as evidenced by the differences in sarcomere length of the cooked samples. Mean sarcomere lengths in the pre and at rigor samples ranged from 1.05 to 1.20 μm. The mean sarcomere length in the post rigor samples was 1.44 μm. Cooking for 3 h at 100 °C did improve the tenderness of pre and at rigor prepared meat as compared to cooking to 70 °C, but not to the extent that ageing did. It is concluded that additional intervention methods are needed to improve the tenderness of prerigor cooked meat. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Scientifically defensible fish conservation and recovery plans: Addressing diffuse threats and developing rigorous adaptive management plans

    Science.gov (United States)

    Maas-Hebner, Kathleen G.; Schreck, Carl B.; Hughes, Robert M.; Yeakley, Alan; Molina, Nancy

    2016-01-01

    We discuss the importance of addressing diffuse threats to long-term species and habitat viability in fish conservation and recovery planning. In the Pacific Northwest, USA, salmonid management plans have typically focused on degraded freshwater habitat, dams, fish passage, harvest rates, and hatchery releases. However, such plans inadequately address threats related to human population and economic growth, intra- and interspecific competition, and changes in climate, ocean, and estuarine conditions. Based on reviews conducted on eight conservation and/or recovery plans, we found that though threats resulting from such changes are difficult to model and/or predict, they are especially important for wide-ranging diadromous species. Adaptive management is also a critical but often inadequately constructed component of those plans. Adaptive management should be designed to respond to evolving knowledge about the fish and their supporting ecosystems; if done properly, it should help improve conservation efforts by decreasing uncertainty regarding known and diffuse threats. We conclude with a general call for environmental managers and planners to reinvigorate the adaptive management process in future management plans, including more explicitly identifying critical uncertainties, implementing monitoring programs to reduce those uncertainties, and explicitly stating what management actions will occur when pre-identified trigger points are reached.

  5. Geometric knowledge and scientific rigor of digital photography: the case of nodal photography

    Directory of Open Access Journals (Sweden)

    Marco Carpiceci

    2013-10-01

    Full Text Available In the past, the formation of the photographic image was almost exclusively delegated to a process of shooting, developing and printing or projecting. Today the picture has so many possibilities that it is difficult to delineate a clear and focused operative boundary. In digital photography, every step offers the opportunity for transformation. However, the multiple possibilities offered by digital photography imply a required knowledge of all those activities in which automatisms can prevent the user from controlling the realization process. As an emblem of this general cognitive problem, we analyze a significant application field that we define as "nodal photography". It is based on a technique arising from the development of electronics and computing, and it encompasses many aspects of the technological innovation we are experiencing.

  6. Building an Evidence Base to Inform Interventions for Pregnant and Parenting Adolescents: A Call for Rigorous Evaluation

    Science.gov (United States)

    Burrus, Barri B.; Scott, Alicia Richmond

    2012-01-01

    Adolescent parents and their children are at increased risk for adverse short- and long-term health and social outcomes. Effective interventions are needed to support these young families. We studied the evidence base and found a dearth of rigorously evaluated programs. Strategies from successful interventions are needed to inform both intervention design and policies affecting these adolescents. The lack of rigorous evaluations may be attributable to inadequate emphasis on and sufficient funding for evaluation, as well as to challenges encountered by program evaluators working with this population. More rigorous program evaluations are urgently needed to provide scientifically sound guidance for programming and policy decisions. Evaluation lessons learned have implications for other vulnerable populations. PMID:22897541

  7. Estimation of the breaking of rigor mortis by myotonometry.

    Science.gov (United States)

    Vain, A; Kauppila, R; Vuori, E

    1996-05-31

    Myotonometry was used to detect breaking of rigor mortis. The myotonometer is a new instrument which measures the decaying oscillations of a muscle after a brief mechanical impact. The method gives two numerical parameters for rigor mortis, namely the period and the decrement of the oscillations, both of which depend on the time elapsed after death. When rigor mortis was broken by muscle lengthening, both the oscillation period and the decrement decreased, whereas shortening the muscle caused the opposite changes. Fourteen hours after breaking, the stiffness characteristics of the right and left m. biceps brachii, i.e. the oscillation periods, had become similar. However, the values for the decrement of the muscle, reflecting the dissipation of mechanical energy, maintained their differences.
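
    The two myotonometric parameters can be pictured as properties of a damped oscillation: the period of the decaying waveform and its logarithmic decrement. Purely as an illustration, assuming a simple exponentially damped cosine model rather than the instrument's actual signal processing, the parameters can be extracted by curve fitting:

        # Illustrative only: fit a damped cosine to a synthetic oscillation trace and
        # report the period and logarithmic decrement, the two parameters the
        # myotonometer records (the real instrument's processing may differ).
        import numpy as np
        from scipy.optimize import curve_fit

        def damped(t, amp, lam, period, phase):
            return amp * np.exp(-lam * t) * np.cos(2 * np.pi * t / period + phase)

        t = np.linspace(0, 0.5, 500)                     # 0.5 s of synthetic data
        signal = damped(t, 1.0, 6.0, 0.04, 0.2) + np.random.normal(0, 0.01, t.size)

        popt, _ = curve_fit(damped, t, signal, p0=[1.0, 5.0, 0.05, 0.0])
        amp, lam, period, phase = popt
        print(f"period    = {period * 1000:.1f} ms")
        print(f"decrement = {lam * period:.3f}")         # log decrement = damping rate x period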

  8. Physiological studies of muscle rigor mortis in the fowl

    International Nuclear Information System (INIS)

    Nakahira, S.; Kaneko, K.; Tanaka, K.

    1990-01-01

    A simple system was developed for continuous measurement of muscle contraction during rigor mortis. Longitudinal muscle strips dissected from the Peroneus Longus were suspended in a plastic tube containing liquid paraffin. Mechanical activity was transmitted to a strain-gauge transducer connected to a potentiometric pen-recorder. At the onset of measurement, 1.2 g was loaded on the muscle strip. This model was used to study the muscle response to various treatments during rigor mortis. All measurements were carried out under the anaerobic condition at 17°C, except where otherwise stated. 1. The present system was found to be quite useful for continuous measurement of the course of muscle rigor. 2. Muscle contraction under the anaerobic condition at 17°C reached a peak about 2 hours after the onset of measurement and thereafter relaxed at a slow rate. In contrast, the aerobic condition under high humidity resulted in a strong rigor, about three times stronger than that in the anaerobic condition. 3. Ultrasonic treatment (37,000-47,000 Hz) at 25°C for 10 minutes resulted in a moderate muscle rigor. 4. Treatment of the muscle strip with 2 mM EGTA at 30°C for 30 minutes led to a relaxation of the muscle. 5. The muscle from birds killed during anesthesia with pentobarbital sodium showed a slow rate of rigor, whereas birds killed one day after hypophysectomy showed a quick muscle rigor, as seen in intact controls. 6. A slight muscle rigor was observed when the muscle strip was placed in a refrigerator at 0°C for 18.5 hours and thereafter the temperature was kept at 17°C. (author)

  9. RIGOROUS GEOREFERENCING OF ALSAT-2A PANCHROMATIC AND MULTISPECTRAL IMAGERY

    Directory of Open Access Journals (Sweden)

    I. Boukerch

    2013-04-01

    Full Text Available The exploitation of the full geometric capabilities of High-Resolution Satellite Imagery (HRSI) requires the development of an appropriate sensor orientation model. Several authors have studied this problem; generally there are two categories of geometric models: physical and empirical models. Based on the analysis of the metadata provided with Alsat-2A, a rigorous pushbroom camera model can be developed. This model has been successfully applied to many very high resolution imagery systems. The relation between image and ground coordinates, given by the time-dependent collinearity equations involving several coordinate systems, has been tested. The interior orientation parameters must be integrated in the model; these parameters can be estimated from the viewing angles corresponding to the pointing directions of any detector, and their values are derived from cubic polynomials provided in the metadata. The developed model integrates all the necessary elements with 33 unknowns. All the approximate values of the 33 unknown parameters may be derived from the information contained in the metadata files provided with the imagery technical specifications, or they are simply fixed to zero; the condition equations are then linearized and solved using SVD in a least-squares sense in order to correct the initial values using a suitable number of well-distributed GCPs. Using Alsat-2A images over the town of Toulouse in the south west of France, three experiments were done. The first concerns 2D accuracy analysis using several sets of parameters. The second concerns GCP number and distribution. The third experiment concerns georeferencing the multispectral image by applying the model calculated from the panchromatic image.
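
    The final adjustment step described above, linearizing the collinearity condition equations and solving for orientation corrections by SVD in a least-squares sense, can be sketched generically as follows. The design matrix and misclosure vector here are random placeholders, not the Alsat-2A model's actual 33-parameter Jacobian.

        # Generic sketch of the adjustment step: solve the linearized condition
        # equations A * dx = b for parameter corrections via SVD least squares.
        # A and b are placeholders standing in for the Jacobian of the collinearity
        # equations at well-distributed GCPs and the corresponding misclosures.
        import numpy as np

        rng = np.random.default_rng(1)
        n_obs, n_par = 120, 33                 # e.g. 60 GCPs x 2 image coordinates, 33 unknowns
        A = rng.normal(size=(n_obs, n_par))
        b = rng.normal(size=n_obs)

        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        dx = Vt.T @ ((U.T @ b) / s)            # pseudo-inverse solution; small s could be truncated

        print("correction vector shape:", dx.shape)
        print("residual RMS:", np.sqrt(np.mean((A @ dx - b) ** 2)))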

  10. Rigorous covariance propagation of geoid errors to geodetic MDT estimates

    Science.gov (United States)

    Pail, R.; Albertella, A.; Fecher, T.; Savcenko, R.

    2012-04-01

    The mean dynamic topography (MDT) is defined as the difference between the mean sea surface (MSS) derived from satellite altimetry, averaged over several years, and the static geoid. Assuming geostrophic conditions, the ocean surface velocities, an important component of the global ocean circulation, can be derived from the MDT. Due to the availability of GOCE gravity field models, for the very first time the MDT can now be derived solely from satellite observations (altimetry and gravity) down to spatial length-scales of 100 km and even below. Global gravity field models, parameterized in terms of spherical harmonic coefficients, are complemented by the full variance-covariance matrix (VCM). Therefore, for the geoid component a realistic statistical error estimate is available, while the error description of the altimetric component is still an open issue and is, if at all, attacked empirically. In this study we make the attempt to perform, based on the full gravity VCM, rigorous error propagation to the derived geostrophic surface velocities, thus also considering all correlations. For the definition of the static geoid we use the third release of the time-wise GOCE model, as well as the satellite-only combination model GOCO03S. In detail, we will investigate the velocity errors resulting from the geoid component as a function of the harmonic degree, and the impact of using/not using covariances on the MDT errors and their correlations. When deriving an MDT, it is spectrally filtered to a certain maximum degree, which is usually driven by the signal content of the geoid model, by applying isotropic or non-isotropic filters. Since this filtering also acts on the geoid component, the consistent integration of this filter process into the covariance propagation shall be performed, and its impact shall be quantified. The study will be performed for MDT estimates in specific test areas of particular oceanographic interest.
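
    Rigorous propagation of a full variance-covariance matrix through a linear functional of the coefficients follows the standard rule Cy = J Cx J^T. The sketch below shows this rule on a small random example and contrasts it with a variances-only propagation; the Jacobian and covariance here are placeholders, not the actual GOCE/GOCO03S matrices or the geoid-to-velocity operator.

        # Sketch of full covariance propagation: if y = J x for a coefficient vector x
        # with covariance Cx, then Cy = J Cx J^T (all correlations retained).
        # J and Cx are small random placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        n_coef, n_out = 50, 5
        J = rng.normal(size=(n_out, n_coef))          # linear functional (e.g. filtered velocity operator)
        L = rng.normal(size=(n_coef, n_coef))
        Cx = L @ L.T                                  # symmetric positive semi-definite covariance

        Cy_full = J @ Cx @ J.T                        # rigorous propagation with correlations
        Cy_diag = J @ np.diag(np.diag(Cx)) @ J.T      # variances-only propagation, for comparison

        print("std. dev. (full covariance):", np.sqrt(np.diag(Cy_full)))
        print("std. dev. (variances only): ", np.sqrt(np.diag(Cy_diag)))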

  11. Abstracts of 3. International scientific-practical conference 'Semipalatinsk Test Site. Radiation Legacy and Non-proliferation Issues'

    International Nuclear Information System (INIS)

    2008-01-01

    The Conference gathered representatives of more than 25 countries and international organizations. Along with actual problems of the current environmental conditions in Kazakhstan, the Conference considered promising trends in the field of radiation protection, radioecological and radiobiological research, issues of international co-operation in support of the non-proliferation regime, and other advanced scientific projects [ru]

  12. Reconciling the Rigor-Relevance Dilemma in Intellectual Capital Research

    Science.gov (United States)

    Andriessen, Daniel

    2004-01-01

    This paper raises the issue of research methodology for intellectual capital and other types of management research by focusing on the dilemma of rigour versus relevance. The more traditional explanatory approach to research often leads to rigorous results that are not of much help to solve practical problems. This paper describes an alternative…

  13. Paper 3: Content and Rigor of Algebra Credit Recovery Courses

    Science.gov (United States)

    Walters, Kirk; Stachel, Suzanne

    2014-01-01

    This paper describes the content, organization and rigor of the f2f and online summer algebra courses that were delivered in summers 2011 and 2012. Examining the content of both types of courses is important because research suggests that algebra courses with certain features may be better than others in promoting success for struggling students.…

  14. A rigorous treatment of uncertainty quantification for Silicon damage metrics

    International Nuclear Information System (INIS)

    Griffin, P.

    2016-01-01

    This report summarizes the contributions made by Sandia National Laboratories in support of the International Atomic Energy Agency (IAEA) Nuclear Data Section (NDS) Technical Meeting (TM) on Nuclear Reaction Data and Uncertainties for Radiation Damage. This work focused on a rigorous treatment of the uncertainties affecting the characterization of the displacement damage seen in silicon semiconductors. (author)

  15. Effects of post mortem temperature on rigor tension, shortening and ...

    African Journals Online (AJOL)

    Fully developed rigor mortis in muscle is characterised by maximum loss of extensibility. The course of post mortem changes in ostrich muscle was studied by following isometric tension, shortening and change in pH during the first 24 h post mortem within muscle strips from the muscularis gastrocnemius, pars interna at ...

  16. Characterization of rigor mortis of longissimus dorsi and triceps ...

    African Journals Online (AJOL)

    24 h) of the longissimus dorsi (LD) and triceps brachii (TB) muscles as well as the shear force (meat tenderness) and colour were evaluated, aiming at characterizing the rigor mortis in the meat during industrial processing. Data statistic treatment demonstrated that carcass temperature and pH decreased gradually during ...

  17. Rigor, vigor, and the study of health disparities.

    Science.gov (United States)

    Adler, Nancy; Bush, Nicole R; Pantell, Matthew S

    2012-10-16

    Health disparities research spans multiple fields and methods and documents strong links between social disadvantage and poor health. Associations between socioeconomic status (SES) and health are often taken as evidence for the causal impact of SES on health, but alternative explanations, including the impact of health on SES, are plausible. Studies showing the influence of parents' SES on their children's health provide evidence for a causal pathway from SES to health, but have limitations. Health disparities researchers face tradeoffs between "rigor" and "vigor" in designing studies that demonstrate how social disadvantage becomes biologically embedded and results in poorer health. Rigorous designs aim to maximize precision in the measurement of SES and health outcomes through methods that provide the greatest control over temporal ordering and causal direction. To achieve precision, many studies use a single SES predictor and single disease. However, doing so oversimplifies the multifaceted, entwined nature of social disadvantage and may overestimate the impact of that one variable and underestimate the true impact of social disadvantage on health. In addition, SES effects on overall health and functioning are likely to be greater than effects on any one disease. Vigorous designs aim to capture this complexity and maximize ecological validity through more complete assessment of social disadvantage and health status, but may provide less-compelling evidence of causality. Newer approaches to both measurement and analysis may enable enhanced vigor as well as rigor. Incorporating both rigor and vigor into studies will provide a fuller understanding of the causes of health disparities.

  18. A rigorous proof for the Landauer-Büttiker formula

    DEFF Research Database (Denmark)

    Cornean, Horia Decebal; Jensen, Arne; Moldoveanu, V.

    Recently, Avron et al. shed new light on the question of quantum transport in mesoscopic samples coupled to particle reservoirs by semi-infinite leads. They rigorously treat the case when the sample undergoes an adiabatic evolution, thus generating a current through the leads, and prove the so call...

  19. Rigorous simulation: a tool to enhance decision making

    Energy Technology Data Exchange (ETDEWEB)

    Neiva, Raquel; Larson, Mel; Baks, Arjan [KBC Advanced Technologies plc, Surrey (United Kingdom)

    2012-07-01

    The world refining industries continue to be challenged by population growth (increased demand), regional market changes and the pressure of regulatory requirements to operate a 'green' refinery. Environmental regulations are reducing the value and use of heavy fuel oils, driving refiners to convert more of the heavier products, or even heavier crudes, into lighter products while meeting increasingly stringent transportation fuel specifications. As a result, actions are required to establish a sustainable advantage for future success. Rigorous simulation provides a key advantage, improving the timing and efficient use of capital investment and maximizing profitability. Sustainably maximizing profit through rigorous modeling is achieved through enhanced performance monitoring and improved Linear Programme (LP) model accuracy. This paper contains examples of these two items; the combination of both increases overall rates of return. As refiners consider optimizing existing assets and expanding projects, the process agreed to achieve these goals is key to successful profit improvement. Rigorous kinetic simulation with detailed fractionation allows existing asset utilization to be optimized while focusing the capital investment on the new unit(s), thereby optimizing the overall strategic plan and return on investment. Monitoring of individual process units works as a mechanism for validating and optimizing plant performance. Unit monitoring is important to rectify poor performance and increase profitability. The key to a good LP relies upon the accuracy of the data used to generate the LP sub-model data. The value of rigorous unit monitoring is that the results are heat- and mass-balanced consistently and are unique to a refiner's unit/refinery. With the improved match of the refinery operation, the rigorous simulation models will allow capturing more accurately the non-linearity of those process units and therefore provide correct

  20. SALTON SEA SCIENTIFIC DRILLING PROJECT: SCIENTIFIC PROGRAM.

    Science.gov (United States)

    Sass, J.H.; Elders, W.A.

    1986-01-01

    The Salton Sea Scientific Drilling Project was spudded on 24 October 1985, and reached a total depth of 10,564 ft. (3.2 km) on 17 March 1986. There followed a period of logging, a flow test, and downhole scientific measurements. The scientific goals were integrated smoothly with the engineering and economic objectives of the program and the ideal of 'science driving the drill' in continental scientific drilling projects was achieved in large measure. The principal scientific goals of the project were to study the physical and chemical processes involved in an active, magmatically driven hydrothermal system. To facilitate these studies, high priority was attached to four areas of sample and data collection, namely: (1) core and cuttings, (2) formation fluids, (3) geophysical logging, and (4) downhole physical measurements, particularly temperatures and pressures.

  1. Scientific impact: opportunity and necessity.

    Science.gov (United States)

    Cohen, Marlene Z; Alexander, Gregory L; Wyman, Jean F; Fahrenwald, Nancy L; Porock, Davina; Wurzbach, Mary E; Rawl, Susan M; Conn, Vicki S

    2010-08-01

    Recent National Institutes of Health changes have focused attention on the potential scientific impact of research projects. Research with the excellent potential to change subsequent science or health care practice may have high scientific impact. Only rigorous studies that address highly significant problems can generate change. Studies with high impact may stimulate new research approaches by changing understanding of a phenomenon, informing theory development, or creating new research methods that allow a field of science to move forward. Research with high impact can transition health care to more effective and efficient approaches. Studies with high impact may propel new policy developments. Research with high scientific impact typically has both immediate and sustained influence on the field of study. The article includes ideas to articulate potential scientific impact in grant applications as well as possible dissemination strategies to enlarge the impact of completed projects.

  2. Einstein's Theory A Rigorous Introduction for the Mathematically Untrained

    CERN Document Server

    Grøn, Øyvind

    2011-01-01

    This book provides an introduction to the theory of relativity and the mathematics used in its processes. Three elements of the book make it stand apart from previously published books on the theory of relativity. First, the book starts at a lower mathematical level than standard books with tensor calculus of sufficient maturity to make it possible to give detailed calculations of relativistic predictions of practical experiments. Self-contained introductions are given, for example, to vector calculus, differential calculus and integration. Second, in-between calculations have been included, making it possible for the non-technical reader to follow step-by-step calculations. Thirdly, the conceptual development is gradual and rigorous in order to provide the inexperienced reader with a philosophically satisfying understanding of the theory.  Einstein's Theory: A Rigorous Introduction for the Mathematically Untrained aims to provide the reader with a sound conceptual understanding of both the special and genera...

  3. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1986-01-01

    Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first- and second-order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first-order formulation satisfies the conditions of the Hille-Yosida theorem. A foundation is laid thereby within which the domains associated with the first- and second-order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0

  4. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1985-05-01

    Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first and second order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first order formulation satisfies the conditions of the Hille-Yosida theorem. A foundation is laid thereby within which the domains associated with the first and second order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0

  5. Rigorous results on measuring the quark charge below color threshold

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1979-01-01

    Rigorous theorems are presented showing that contributions from a color nonsinglet component of the current to matrix elements of a second order electromagnetic transition are suppressed by factors inversely proportional to the energy of the color threshold. Parton models which obtain matrix elements proportional to the color average of the square of the quark charge are shown to neglect terms of the same order of magnitude as terms kept. (author)

  6. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

    ... This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, formal architecture design, including... behavioral semantics, and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite...

  7. Striation Patterns of Ox Muscle in Rigor Mortis

    Science.gov (United States)

    Locker, Ronald H.

    1959-01-01

    Ox muscle in rigor mortis offers a selection of myofibrils fixed at varying degrees of contraction from sarcomere lengths of 3.7 to 0.7 µ. A study of this material by phase contrast and electron microscopy has revealed four distinct successive patterns of contraction, including besides the familiar relaxed and contracture patterns, two intermediate types (2.4 to 1.9 µ, 1.8 to 1.5 µ) not previously well described. PMID:14417790

  8. Rigorous Analysis of a Randomised Number Field Sieve

    OpenAIRE

    Lee, Jonathan; Venkatesan, Ramarathnam

    2018-01-01

    Factorisation of integers $n$ is of number theoretic and cryptographic significance. The Number Field Sieve (NFS) introduced circa 1990, is still the state of the art algorithm, but no rigorous proof that it halts or generates relationships is known. We propose and analyse an explicitly randomised variant. For each $n$, we show that these randomised variants of the NFS and Coppersmith's multiple polynomial sieve find congruences of squares in expected times matching the best-known heuristic e...
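
    The "congruences of squares" that the sieve searches for are the classical route to a factorization: if x² ≡ y² (mod n) with x ≢ ±y (mod n), then gcd(x − y, n) is a non-trivial factor of n. The toy sketch below illustrates only this final step with a naive search; it is nothing like the Number Field Sieve itself, and the example modulus is arbitrary.

```python
from math import gcd, isqrt

def factor_by_congruent_squares(n, search_limit=100_000):
    """Toy illustration: find x with x^2 mod n a perfect square y^2 and
    x != +-y (mod n); then gcd(x - y, n) splits n.  The NFS constructs such
    congruences far more cleverly via algebraic number fields."""
    for x in range(isqrt(n) + 1, isqrt(n) + search_limit):
        residue = x * x % n
        y = isqrt(residue)
        if y * y == residue and x % n != y % n and x % n != (n - y) % n:
            d = gcd(x - y, n)
            if 1 < d < n:
                return d, n // d
    return None

print(factor_by_congruent_squares(5959))   # (59, 101), since 80^2 - 21^2 = 5959
```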

  9. Reciprocity relations in transmission electron microscopy: A rigorous derivation.

    Science.gov (United States)

    Krause, Florian F; Rosenauer, Andreas

    2017-01-01

    A concise derivation of the principle of reciprocity applied to realistic transmission electron microscopy setups is presented making use of the multislice formalism. The equivalence of images acquired in conventional and scanning mode is thereby rigorously shown. The conditions for the applicability of the derived reciprocity relations are discussed. Furthermore the positions of apertures in relation to the corresponding lenses are considered, a subject which scarcely has been addressed in previous publications. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry.

    Science.gov (United States)

    Morse, Janice M

    2015-09-01

    Criteria for determining the trustworthiness of qualitative research were introduced by Guba and Lincoln in the 1980s when they replaced terminology for achieving rigor, reliability, validity, and generalizability with dependability, credibility, and transferability. Strategies for achieving trustworthiness were also introduced. This landmark contribution to qualitative research remains in use today, with only minor modifications in format. Despite the significance of this contribution over the past four decades, the strategies recommended to achieve trustworthiness have not been critically examined. Recommendations for where, why, and how to use these strategies have not been developed, and how well they achieve their intended goal has not been examined. We do not know, for example, what impact these strategies have on the completed research. In this article, I critique these strategies. I recommend that qualitative researchers return to the terminology of social sciences, using rigor, reliability, validity, and generalizability. I then make recommendations for the appropriate use of the strategies recommended to achieve rigor: prolonged engagement, persistent observation, and thick, rich description; inter-rater reliability, negative case analysis; peer review or debriefing; clarifying researcher bias; member checking; external audits; and triangulation. © The Author(s) 2015.

  11. Development of ASTM Standard for SiC-SiC Joint Testing Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Jacobsen, George [General Atomics, San Diego, CA (United States); Back, Christina [General Atomics, San Diego, CA (United States)

    2015-10-30

    As the nuclear industry moves to advanced ceramic based materials for cladding and core structural materials for a variety of advanced reactors, new standards and test methods are required for material development and licensing purposes. For example, General Atomics (GA) is actively developing silicon carbide (SiC) based composite cladding (SiC-SiC) for its Energy Multiplier Module (EM2), a high efficiency gas cooled fast reactor. Through DOE funding via the advanced reactor concept program, GA developed a new test method, called the endplug pushout (EPPO) test, for the nominal joint strength of an endplug sealed to advanced ceramic tubes (Fig. 1-1) at ambient and elevated temperatures. This test utilizes widely available universal mechanical testers coupled with clam shell heaters, and the specimen size is relatively small, making it a viable post-irradiation test method. The culmination of this effort was a draft of an ASTM test standard that will be submitted for approval to the ASTM C28 ceramics committee. Once the standard has been vetted by the ceramics test community, an industry-wide standard methodology to test joined tubular ceramic components will be available to the entire nuclear materials community.

  12. Rigor force responses of permeabilized fibres from fast and slow skeletal muscles of aged rats.

    Science.gov (United States)

    Plant, D R; Lynch, G S

    2001-09-01

    1. Ageing is generally associated with a decline in skeletal muscle mass and strength and a slowing of muscle contraction, factors that impact upon the quality of life for the elderly. The mechanisms underlying this age-related muscle weakness have not been fully resolved. The purpose of the present study was to determine whether the decrease in muscle force as a consequence of age could be attributed partly to a decrease in the number of cross-bridges participating during contraction. 2. Given that the rigor force is proportional to the approximate total number of interacting sites between the actin and myosin filaments, we tested the null hypothesis that the rigor force of permeabilized muscle fibres from young and old rats would not be different. 3. Permeabilized fibres from the extensor digitorum longus (fast-twitch; EDL) and soleus (predominantly slow-twitch) muscles of young (6 months of age) and old (27 months of age) male F344 rats were activated in Ca2+-buffered solutions to determine force-pCa characteristics (where pCa = -log(10)[Ca2+]) and then in solutions lacking ATP and Ca2+ to determine rigor force levels. 4. The rigor forces for EDL and soleus muscle fibres were not different between young and old rats, indicating that the approximate total number of cross-bridges that can be formed between filaments did not decline with age. We conclude that the age-related decrease in force output is more likely attributed to a decrease in the force per cross-bridge and/or decreases in the efficiency of excitation-contraction coupling.

  13. On the Possibility of a Scientific Theory of Scientific Method.

    Science.gov (United States)

    Nola, Robert

    1999-01-01

    Discusses the philosophical strengths and weaknesses of Laudan's normative naturalism, which understands the principles of scientific method to be akin to scientific hypotheses, and therefore open to test like any principle of science. Contains 19 references. (Author/WRM)

  14. Evaluation of the scientific underpinnings for identifying estrogenic chemicals in non-mammalian taxa using mammalian test systems

    Science.gov (United States)

    A major challenge in chemical risk assessment is extrapolation of toxicity data from tested to untested species. Successful cross-species extrapolation involves understanding similarities and differences in toxicokinetic and toxicodynamic processes among species. Herein we consi...

  15. Depth and breadth: Bridging the gap between scientific inquiry and high-stakes testing with diverse junior high school students

    Science.gov (United States)

    Kang, Jee Sun Emily

    This study explored how inquiry-based teaching and learning processes occurred in two teachers' diverse 8th grade Physical Science classrooms in a Program Improvement junior high school within the context of high-stakes standardized testing. Instructors for the courses examined included not only the two 8th grade science teachers, but also graduate fellows from a nearby university. The study drew on research on inquiry-based instruction in science education, the achievement gap, and the high-stakes testing movement, as well as on situated learning theory, to understand how opportunities for inquiry were negotiated within the diverse classroom context. Transcripts of taped class sessions, student work samples, interviews of teachers and students, and scores from the California Standards Test in science were collected and analyzed. Findings indicated that the teachers provided structured inquiry in order to support their students in learning about forces and to prepare them for the standardized test. Teachers also supported students in generating evidence-based explanations, connecting inquiry-based investigations with content on forces, proficiently using science vocabulary, and connecting concepts about forces to their daily lives. Findings from classroom data revealed constraints to student learning: students' limited language proficiency, peer counter culture, and limited time. Supports were evidenced as well: graduate fellows' support during investigations, teachers' guided questioning, standardized test preparation, literacy support, and home-school connections. There was no statistical difference in achievement on the Forces Unit test or science standardized test between classes with graduate fellows and without fellows. There was also no statistical difference in student performance between the two teachers' classrooms, even though their teaching styles were very different. However, there was a strong correlation between students' achievement on the chapter test and

  16. The effect of rigor mortis on the passage of erythrocytes and fluid through the myocardium of isolated dog hearts.

    Science.gov (United States)

    Nevalainen, T J; Gavin, J B; Seelye, R N; Whitehouse, S; Donnell, M

    1978-07-01

    The effect of normal and artificially induced rigor mortis on the vascular passage of erythrocytes and fluid through isolated dog hearts was studied. Increased rigidity of 6-mm thick transmural sections through the centre of the posterior papillary muscle was used as an indication of rigor. The perfusibility of the myocardium was tested by injecting 10 ml of 1% sodium fluorescein in Hanks solution into the circumflex branch of the left coronary artery. In prerigor hearts (20 minute incubation) fluorescein perfused the myocardium evenly whether or not it was preceded by an injection of 10 ml of heparinized dog blood. Rigor mortis developed in all hearts after 90 minutes incubation or within 20 minutes of perfusing the heart with 50 ml of 5 mM iodoacetate in Hanks solution. Fluorescein injected into hearts in rigor did not enter the posterior papillary muscle and adjacent subendocardium whether or not it was preceded by heparinized blood. Thus the vascular occlusion caused by rigor in the dog heart appears to be so effective that it prevents flow into the subendocardium of small soluble ions such as fluorescein.

  17. Sonoelasticity to monitor mechanical changes during rigor and ageing.

    Science.gov (United States)

    Ayadi, A; Culioli, J; Abouelkaram, S

    2007-06-01

    We propose the use of sonoelasticity as a non-destructive method to monitor changes in the resistance of muscle fibres, unaffected by connective tissue. Vibrations were applied at low frequency to induce oscillations in soft tissues and an ultrasound transducer was used to detect the motions. The experiments were carried out on the M. biceps femoris muscles of three beef cattle. In addition to the sonoelasticity measurements, the changes in meat during rigor and ageing were followed by measurements of both the mechanical resistance of myofibres and pH. The variations of mechanical resistance and pH were compared to those of the sonoelastic variables (velocity and attenuation) at two frequencies. The relationships between pH and velocity or attenuation and between the velocity or attenuation and the stress at 20% deformation were highly correlated. We concluded that sonoelasticity is a non-destructive method that can be used to monitor mechanical changes in muscle fibres during rigor mortis and ageing.

  18. Rigorous quantum limits on monitoring free masses and harmonic oscillators

    Science.gov (United States)

    Roy, S. M.

    2018-03-01

    There are heuristic arguments proposing that the accuracy of monitoring position of a free mass m is limited by the standard quantum limit (SQL): σ²(X(t)) ≥ σ²(X(0)) + (t²/m²)σ²(P(0)) ≥ ℏt/m, where σ²(X(t)) and σ²(P(t)) denote variances of the Heisenberg representation position and momentum operators. Yuen [Phys. Rev. Lett. 51, 719 (1983), 10.1103/PhysRevLett.51.719] discovered that there are contractive states for which this result is incorrect. Here I prove universally valid rigorous quantum limits (RQL), viz. rigorous upper and lower bounds on σ²(X(t)) in terms of σ²(X(0)) and σ²(P(0)), given by Eq. (12) for a free mass and by Eq. (36) for an oscillator. I also obtain the maximally contractive and maximally expanding states which saturate the RQL, and use the contractive states to set up an Ozawa-type measurement theory with accuracies respecting the RQL but beating the standard quantum limit. The contractive states for oscillators improve on the Schrödinger coherent states of constant variance and may be useful for gravitational wave detection and optical communication.
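
    The heuristic behind the quoted standard quantum limit can be written out in one line of free evolution. The block below sketches only that textbook argument, not the rigorous bounds of Eqs. (12) and (36) established in the paper.

```latex
% Heuristic SQL argument for a free mass (textbook sketch, not the paper's
% rigorous bounds).  Free evolution gives X(t) = X(0) + (t/m) P(0);
% dropping the X--P cross-covariance,
\[
\sigma^2\bigl(X(t)\bigr)
  \;\approx\; \sigma^2\bigl(X(0)\bigr) + \frac{t^2}{m^2}\,\sigma^2\bigl(P(0)\bigr)
  \;\ge\; \frac{2t}{m}\,\sigma\bigl(X(0)\bigr)\,\sigma\bigl(P(0)\bigr)
  \;\ge\; \frac{\hbar t}{m},
\]
% using the AM--GM inequality and the uncertainty relation
% $\sigma(X(0))\,\sigma(P(0)) \ge \hbar/2$.  Contractive states exploit a
% negative cross-covariance, which this sketch drops -- which is why the SQL
% is not a true limit and rigorous bounds are needed.
```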

  19. Experimental evaluation of rigor mortis IX. The influence of the breaking (mechanical solution) on the development of rigor mortis.

    Science.gov (United States)

    Krompecher, Thomas; Gilles, André; Brandt-Casadevall, Conception; Mangin, Patrice

    2008-04-07

    Objective measurements were carried out to study the possible re-establishment of rigor mortis on rats after "breaking" (mechanical solution). Our experiments showed that: (1) cadaveric rigidity can re-establish after breaking; (2) a significant rigidity can reappear if the breaking occurs before the process is complete; (3) rigidity will be considerably weaker after the breaking; and (4) the time course of the intensity does not change in comparison to the controls: the re-establishment begins immediately after the breaking, maximal values are reached at the same time as in the controls, and the course of the resolution is the same as in the controls.

  20. The interpretation of forensic biochemical expert test made in human body fluids: scientific - legal analysis in the research on sexual offenses

    International Nuclear Information System (INIS)

    Chaves Carballo, Diana

    2014-01-01

    The contributions of science and technology now reach every aspect of human life, and relationships of coexistence are even found among the various disciplines of knowledge through forensic legal science. It is therefore increasingly imperative that law enforcement agents be interdisciplinary professionals, with knowledge beyond the strictly legal, enabling them to make the most of scientific knowledge in judicial proceedings. Among the natural sciences applied to law, forensic biochemistry has contributed extremely relevant evidence for the investigation of various sexual offenses; so much so that the Organismo de Investigacion Judicial of Costa Rica has, within its Departamento de Laboratorios de Ciencias Forenses, sections specialized in this discipline. A variety of presumptive and confirmatory examinations are performed for the presence of biological fluids, sexually transmitted diseases and identification of DNA by genetic markers. Updated information is given on the correct interpretation of the forensic biochemical examinations available for the identification of semen, blood and human saliva in the investigation of sexual offenses, using a scientific and legal language that allows the most to be made of this information in the criminal process. The main objective has been to interpret, legally and scientifically, forensic biochemical expert evidence performed on human body fluids during the investigation of sexual offenses. A legal, doctrinal and scientific review is presented, together with a compilation of related jurisprudence and an analysis of the criminology reports issued by the Seccion de Bioquimica of the Departamento de Laboratorios Forenses of the Organismo de Investigacion Judicial during the investigation of sexual offenses. Two types of examinations are available for the identification of biological fluids, each with a different binding force. In addition, it has become clear, owing to the lexicon employed when drafting a forensic biochemical opinion, that to make a proper

  1. Scientific Misconduct.

    Science.gov (United States)

    Goodstein, David

    2002-01-01

    Explores scientific fraud, asserting that while few scientists actually falsify results, the field has become so competitive that many are misbehaving in other ways; an example would be unreasonable criticism by anonymous peer reviewers. (EV)

  2. Radon exhalation of cementitious materials made with coal fly ash: Part 1 - scientific background and testing of the cement and fly ash emanation

    International Nuclear Information System (INIS)

    Kovler, K.; Perevalov, A.; Steiner, V.; Metzger, L.A.

    2005-01-01

    Increased interest in measuring radionuclides and radon concentrations in fly ash, cement and other components of building products is due to the concern of health hazards of naturally occurring radioactive materials (NORM). The current work focuses on studying the influence of fly ash (FA) on radon-exhalation rate (radon flux) from cementitious materials. The tests were carried out on cement paste specimens with different FA contents. The first part of the paper presents the scientific background and describes the experiments, which we designed for testing the radon emanation of the raw materials used in the preparation of the cement-FA pastes. It is found that despite the higher 226 Ra content in FA (more than 3 times, compared with Portland cement) the radon emanation is significantly lower in FA (7.65% for cement vs. 0.52% only for FA)
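
    The comparison reported above can be made concrete with a back-of-the-envelope number: what matters for exhalation is roughly the product of the ²²⁶Ra activity concentration and the emanation coefficient (the "emanating radium"). The sketch below reuses only the ratios quoted in the abstract; the absolute activity value is a made-up placeholder, and the product is a simplification of the full exhalation-rate expression.

```python
# Rough illustration (assumed numbers except for the ratios quoted in the abstract):
# radon source strength ~ Ra-226 activity concentration x emanation coefficient.
C_RA_CEMENT = 50.0                     # Bq/kg, placeholder Ra-226 activity of cement
C_RA_FLYASH = 3.0 * C_RA_CEMENT        # "more than 3 times" higher Ra-226 in fly ash

EMANATION_CEMENT = 0.0765              # 7.65 % emanation for cement (abstract)
EMANATION_FLYASH = 0.0052              # 0.52 % emanation for fly ash (abstract)

emanating_ra_cement = C_RA_CEMENT * EMANATION_CEMENT   # ~3.8 Bq/kg "available" radon
emanating_ra_flyash = C_RA_FLYASH * EMANATION_FLYASH   # ~0.8 Bq/kg "available" radon

# Ratio is independent of the placeholder activity: 0.0765 / (3 * 0.0052) ~ 4.9,
# i.e. the cement contributes roughly five times more emanating radium than the FA.
print(round(emanating_ra_cement / emanating_ra_flyash, 1))
```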

  3. Preliminary Feasibility, Design, and Hazard Analysis of a Boiling Water Test Loop Within the Idaho National Laboratory Advanced Test Reactor National Scientific User Facility

    International Nuclear Information System (INIS)

    Gerstner, Douglas M.

    2009-01-01

    The Advanced Test Reactor (ATR) is a pressurized light-water reactor with a design thermal power of 250 MW. The principal function of the ATR is to provide a high neutron flux for testing reactor fuels and other materials. The ATR and its support facilities are located at the Idaho National Laboratory (INL). A Boiling Water Test Loop (BWTL) is being designed for one of the irradiation test positions within the ATR. The objective of the new loop will be to simulate boiling water reactor (BWR) conditions to support clad corrosion and related reactor material testing. Further, it will accommodate power ramping tests of candidate high burn-up fuels and fuel pins/rods for the commercial BWR utilities. The BWTL will be much like the pressurized water loops already in service in 5 of the 9 'flux traps' (regions of enhanced neutron flux) in the ATR. The loop coolant will be isolated from the primary coolant system so that the loop's temperature, pressure, flow rate, and water chemistry can be independently controlled. This paper presents the proposed general design of the in-core and auxiliary BWTL systems; the preliminary results of the neutronics and thermal hydraulics analyses; and the preliminary hazard analysis for safe normal and transient BWTL and ATR operation.

  4. Parent Management Training-Oregon Model: Adapting Intervention with Rigorous Research.

    Science.gov (United States)

    Forgatch, Marion S; Kjøbli, John

    2016-09-01

    Parent Management Training-Oregon Model (PMTO(®) ) is a set of theory-based parenting programs with status as evidence-based treatments. PMTO has been rigorously tested in efficacy and effectiveness trials in different contexts, cultures, and formats. Parents, the presumed agents of change, learn core parenting practices, specifically skill encouragement, limit setting, monitoring/supervision, interpersonal problem solving, and positive involvement. The intervention effectively prevents and ameliorates children's behavior problems by replacing coercive interactions with positive parenting practices. Delivery format includes sessions with individual families in agencies or families' homes, parent groups, and web-based and telehealth communication. Mediational models have tested parenting practices as mechanisms of change for children's behavior and found support for the theory underlying PMTO programs. Moderating effects include children's age, maternal depression, and social disadvantage. The Norwegian PMTO implementation is presented as an example of how PMTO has been tailored to reach diverse populations as delivered by multiple systems of care throughout the nation. An implementation and research center in Oslo provides infrastructure and promotes collaboration between practitioners and researchers to conduct rigorous intervention research. Although evidence-based and tested within a wide array of contexts and populations, PMTO must continue to adapt to an ever-changing world. © 2016 Family Process Institute.

  5. Cyber warfare building the scientific foundation

    CERN Document Server

    Jajodia, Sushil; Subrahmanian, VS; Swarup, Vipin; Wang, Cliff

    2015-01-01

    This book features a wide spectrum of the latest computer science research relating to cyber warfare, including military and policy dimensions. It is the first book to explore the scientific foundation of cyber warfare and features research from the areas of artificial intelligence, game theory, programming languages, graph theory and more. The high-level approach and emphasis on scientific rigor provides insights on ways to improve cyber warfare defense worldwide. Cyber Warfare: Building the Scientific Foundation targets researchers and practitioners working in cyber security, especially gove

  6. Reframing Rigor: A Modern Look at Challenge and Support in Higher Education

    Science.gov (United States)

    Campbell, Corbin M.; Dortch, Deniece; Burt, Brian A.

    2018-01-01

    This chapter describes the limitations of the traditional notions of academic rigor in higher education, and brings forth a new form of rigor that has the potential to support student success and equity.

  7. EarthLabs Modules: Engaging Students In Extended, Rigorous Investigations Of The Ocean, Climate and Weather

    Science.gov (United States)

    Manley, J.; Chegwidden, D.; Mote, A. S.; Ledley, T. S.; Lynds, S. E.; Haddad, N.; Ellins, K.

    2016-02-01

    EarthLabs, envisioned as a national model for high school Earth or Environmental Science lab courses, is adaptable for both undergraduate and middle school students. The collection includes ten online modules that combine to feature a global view of our planet as a dynamic, interconnected system, engaging learners in extended investigations. The EarthLabs modules support state and national guidelines, including the NGSS, for science content. Four modules directly guide students to discover vital aspects of the oceans, while five other modules incorporate ocean sciences in order to complete an understanding of Earth's climate system. By interacting with scientific research data, satellite imagery, numerical data, computer visualizations, experiments, and video tutorials, students gain a broad perspective on the key role oceans play in the fishing industry, droughts, coral reefs, hurricanes, the carbon cycle, and life on land and in the seas, as well as in driving our changing climate. Students explore Earth system processes and build quantitative skills that enable them to objectively evaluate scientific findings for themselves as they move through ordered sequences that guide the learning. As a robust collection, the EarthLabs modules engage students in extended, rigorous investigations allowing a deeper understanding of the ocean, climate and weather. This presentation provides an overview of the ten curriculum modules that comprise the EarthLabs collection developed by TERC and found at http://serc.carleton.edu/earthlabs/index.html. Evaluation data on the effectiveness and use in secondary education classrooms will be summarized.

  8. Rigorous force field optimization principles based on statistical distance minimization

    Energy Technology Data Exchange (ETDEWEB)

    Vlcek, Lukas, E-mail: vlcekl1@ornl.gov [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States); Joint Institute for Computational Sciences, University of Tennessee, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6173 (United States); Chialvo, Ariel A. [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States)

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
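
    The idea of fitting a model by minimizing a statistical distance to target data can be illustrated on a deliberately simple case. The sketch below is not the authors' method, which operates on statistical-mechanical ensembles and force-field parameters; it merely fits the two parameters of a Gaussian "model" to "target" samples by minimizing the Hellinger distance between binned distributions, and every name and number in it is invented for the example.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
target_samples = rng.normal(loc=1.5, scale=0.7, size=50_000)   # stand-in "experiment"
bins = np.linspace(-2.0, 5.0, 81)
p_target, _ = np.histogram(target_samples, bins=bins)
p_target = p_target / p_target.sum()                           # discrete target distribution

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def model_distribution(params):
    """Analytic bin probabilities of the two-parameter Gaussian 'model'."""
    mu, sigma = params
    cdf = norm.cdf(bins, loc=mu, scale=abs(sigma))
    q = np.diff(cdf)
    return q / q.sum()

def objective(params):
    return hellinger(p_target, model_distribution(params))

result = minimize(objective, x0=[0.0, 1.0], method="Nelder-Mead")
print(np.round(result.x, 2))   # should land near (1.5, 0.7)
```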

  9. From everyday communicative figurations to rigorous audience news repertoires

    DEFF Research Database (Denmark)

    Kobbernagel, Christian; Schrøder, Kim Christian

    2016-01-01

    In the last couple of decades there has been an unprecedented explosion of news media platforms and formats, as a succession of digital and social media have joined the ranks of legacy media. We live in a ‘hybrid media system’ (Chadwick, 2013), in which people build their cross-media news... repertoires from the ensemble of old and new media available. This article presents an innovative mixed-method approach with considerable explanatory power to the exploration of patterns of news media consumption. This approach tailors Q-methodology in the direction of a qualitative study of news consumption..., in which a card sorting exercise serves to translate the participants’ news media preferences into a form that enables the researcher to undertake a rigorous factor-analytical construction of their news consumption repertoires. This interpretive, factor-analytical procedure, which results in the building

  10. Rigorous Quantum Field Theory A Festschrift for Jacques Bros

    CERN Document Server

    Monvel, Anne Boutet; Iagolnitzer, Daniel; Moschella, Ugo

    2007-01-01

    Jacques Bros has greatly advanced our present understanding of rigorous quantum field theory through numerous fundamental contributions. This book arose from an international symposium held in honour of Jacques Bros on the occasion of his 70th birthday, at the Department of Theoretical Physics of the CEA in Saclay, France. The impact of the work of Jacques Bros is evident in several articles in this book. Quantum fields are regarded as genuine mathematical objects, whose various properties and relevant physical interpretations must be studied in a well-defined mathematical framework. The key topics in this volume include analytic structures of Quantum Field Theory (QFT), renormalization group methods, gauge QFT, stability properties and extension of the axiomatic framework, QFT on models of curved spacetimes, QFT on noncommutative Minkowski spacetime. Contributors: D. Bahns, M. Bertola, R. Brunetti, D. Buchholz, A. Connes, F. Corbetta, S. Doplicher, M. Dubois-Violette, M. Dütsch, H. Epstein, C.J. Fewster, K....

  11. Desarrollo constitucional, legal y jurisprudencia del principio de rigor subsidiario

    Directory of Open Access Journals (Sweden)

    Germán Eduardo Cifuentes Sandoval

    2013-09-01

    Full Text Available In Colombia, the state's environmental administration is carried out through the National Environmental System (SINA). SINA is made up of state entities that coexist under a mixed scheme of centralization and decentralization. The decentralization of SINA expresses itself at the administrative and territorial levels, and the entities that operate under this structure are expected to act in a coordinated way in order to reach the objectives set out in the national environmental policy. To achieve coordinated environmental administration among the entities that make up SINA, Colombian environmental legislation includes three basic principles: (1) the principle of regional harmony ("armonía regional"); (2) the principle of normative gradation ("gradación normativa"); and (3) the principle of subsidiary rigor ("rigor subsidiario"). These principles belong to Article 63 of Law 99 of 1993, and even though equivalents of the first two can be found in other norms of the Colombian legal system, this is not the case for subsidiary rigor, because its elements are unique to environmental law and are not similar to those of the principle of subsidiarity contained in Article 288 of the Political Constitution. Subsidiary rigor gives decentralized entities a special power to modify the current environmental legislation in order to protect the local ecological patrimony. It is an administrative power grounded in decentralized autonomy that allows them to take the place of the regulatory power of the legislative branch, on the condition that the new rules be more demanding than those issued at the central level.

  12. Rigorous patient-prosthesis matching of Perimount Magna aortic bioprosthesis.

    Science.gov (United States)

    Nakamura, Hiromasa; Yamaguchi, Hiroki; Takagaki, Masami; Kadowaki, Tasuku; Nakao, Tatsuya; Amano, Atsushi

    2015-03-01

    Severe patient-prosthesis mismatch, defined as effective orifice area index ≤0.65 cm(2) m(-2), has demonstrated poor long-term survival after aortic valve replacement. Reported rates of severe mismatch involving the Perimount Magna aortic bioprosthesis range from 4% to 20% in patients with a small annulus. Between June 2008 and August 2011, 251 patients (mean age 70.5 ± 10.2 years; mean body surface area 1.55 ± 0.19 m(2)) underwent aortic valve replacement with a Perimount Magna bioprosthesis, with or without concomitant procedures. We performed our procedure with rigorous patient-prosthesis matching to implant a valve appropriately sized to each patient, and carried out annular enlargement when a 19-mm valve did not fit. The bioprosthetic performance was evaluated by transthoracic echocardiography predischarge and at 1 and 2 years after surgery. Overall hospital mortality was 1.6%. Only 5 (2.0%) patients required annular enlargement. The mean follow-up period was 19.1 ± 10.7 months with a 98.4% completion rate. Predischarge data showed a mean effective orifice area index of 1.21 ± 0.20 cm(2) m(-2). Moderate mismatch, defined as effective orifice area index ≤0.85 cm(2) m(-2), developed in 4 (1.6%) patients. None developed severe mismatch. Data at 1 and 2 years showed only two cases of moderate mismatch; neither was severe. Rigorous patient-prosthesis matching maximized the performance of the Perimount Magna, and no severe mismatch resulted in this Japanese population of aortic valve replacement patients. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
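
    The mismatch definitions used in the abstract reduce to a single indexed quantity, as in the small sketch below; the thresholds are those quoted above, while the example valve effective orifice area and body surface area are invented for illustration.

```python
def effective_orifice_area_index(eoa_cm2, bsa_m2):
    """Effective orifice area index (cm^2/m^2) = valve EOA / body surface area."""
    return eoa_cm2 / bsa_m2

def classify_mismatch(eoai):
    """Thresholds as used in the abstract: <=0.65 severe, <=0.85 moderate."""
    if eoai <= 0.65:
        return "severe patient-prosthesis mismatch"
    if eoai <= 0.85:
        return "moderate patient-prosthesis mismatch"
    return "no significant mismatch"

# Hypothetical example: a 1.9 cm^2 prosthesis in a patient with BSA 1.55 m^2
eoai = effective_orifice_area_index(1.9, 1.55)
print(round(eoai, 2), "-", classify_mismatch(eoai))   # 1.23 - no significant mismatch
```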

  13. Scientific communication

    Directory of Open Access Journals (Sweden)

    Aleksander Kobylarek

    2017-09-01

    Full Text Available The article tackles the problem of models of communication in science. The formal division of communication processes into oral and written does not resolve the problem of attitude. The author defines successful communication as a win-win game, based on the respect and equality of the partners, regardless of their position in the world of science. The core characteristics of the process of scientific communication are indicated, such as openness, fairness, support, and creation. The task of creating the right atmosphere for science communication belongs to moderators, who should not allow privilege and differentiation of position to affect scientific communication processes.

  14. Scientific millenarianism

    International Nuclear Information System (INIS)

    Weinberg, A.M.

    1997-01-01

    Today, for the first time, scientific concerns are seriously being addressed that span future times--hundreds, even thousands, or more years in the future. One is witnessing what the author calls scientific millenarianism. Are such concerns for the distant future exercises in futility, or are they real issues that, to the everlasting gratitude of future generations, this generation has identified, warned about and even suggested how to cope with in the distant future? Can the four potential catastrophes--bolide impact, CO 2 warming, radioactive wastes and thermonuclear war--be avoided by technical fixes, institutional responses, religion, or by doing nothing? These are the questions addressed in this paper

  15. Scientific meetings

    International Nuclear Information System (INIS)

    1973-01-01

    One of the main aims of the IAEA is to foster the exchange of scientific and technical information and one of the main ways of doing this is to convene international scientific meetings. They range from large international conferences bringing together several hundred scientists, smaller symposia attended by an average of 150 to 250 participants and seminars designed to instruct rather than inform, to smaller panels and study groups of 10 to 30 experts brought together to advise on a particular programme or to develop a set of regulations. The topics of these meetings cover every part of the Agency's activities and form a backbone of many of its programmes. (author)

  16. Listening into the Dark: An Essay Testing the Validity and Efficacy of Collaborative Developmental Action Inquiry for Describing and Encouraging Transformations of Self, Society, and Scientific Inquiry

    Directory of Open Access Journals (Sweden)

    William R. Torbert

    2013-06-01

    Full Text Available Collaborative Developmental Action Inquiry (CDAI is introduced as a meta-paradigmatic approach to social science and social action that encompasses seven other more familiar paradigms (e.g., Behaviorism, Empirical Positivism, and Postmodern Interpretivism and that triangulates among third-person, objectivity-seeking social scientific inquiry, second-person, transformational, mutuality-seeking political inquiry, and first-person, adult, spiritual inquiry and consciousness development in the emerging present. CDAI tests findings, not only against third-person criteria of validity as do quantitative, positivist studies and qualitative, interpretive studies, but also against first- and second-person criteria of validity, as well as criteria of efficacy in action. CDAI introduces the possibility of treating, not just formal third-person studies, but any and all activities in one’s daily life in an inquiring manner. The aim of this differently-scientific approach is not only theoretical, generalizable knowledge, but also knowledge that generates increasingly timely action in particular cases in the relationships that mean the most to the inquirer. To illustrate and explain why the CDAI approach can explain unusually high percentages of the variance in whether or not organizations actually transform, all three types of validity-testing are applied to a specific study of intended transformation in ten organizations. The ten-organization study found that adding together the performance of each organization’s CEO and lead consultant on a reliable, well-validated measure of developmental action-logic predicted 59% of the variance, beyond the .01 level, in whether and how the organization transformed (as rated by three scorers who achieved between .90 and 1.0 reliability. The essay concludes with a comparison between the Empirical Positivist paradigm of inquiry and the Collaborative Developmental Action Inquiry paradigm.

  17. Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.

    Science.gov (United States)

    Lee, Jae Hee; Jung, Koo Young

    2012-07-01

    Instantaneous rigor as muscle stiffening occurring in the moment of death (or cardiac arrest) can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of securing a surgical airway quickly on the occurrence of trismus. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Study Design Rigor in Animal-Experimental Research Published in Anesthesia Journals.

    Science.gov (United States)

    Hoerauf, Janine M; Moss, Angela F; Fernandez-Bustamante, Ana; Bartels, Karsten

    2018-01-01

    Lack of reproducibility of preclinical studies has been identified as an impediment for translation of basic mechanistic research into effective clinical therapies. Indeed, the National Institutes of Health has revised its grant application process to require more rigorous study design, including sample size calculations, blinding procedures, and randomization steps. We hypothesized that the reporting of such metrics of study design rigor has increased over time for animal-experimental research published in anesthesia journals. PubMed was searched for animal-experimental studies published in 2005, 2010, and 2015 in primarily English-language anesthesia journals. A total of 1466 publications were graded on the performance of sample size estimation, randomization, and blinding. Cochran-Armitage test was used to assess linear trends over time for the primary outcome of whether or not a metric was reported. Interrater agreement for each of the 3 metrics (power, randomization, and blinding) was assessed using the weighted κ coefficient in a 10% random sample of articles rerated by a second investigator blinded to the ratings of the first investigator. A total of 1466 manuscripts were analyzed. Reporting for all 3 metrics of experimental design rigor increased over time (2005 to 2010 to 2015): for power analysis, from 5% (27/516), to 12% (59/485), to 17% (77/465); for randomization, from 41% (213/516), to 50% (243/485), to 54% (253/465); and for blinding, from 26% (135/516), to 38% (186/485), to 47% (217/465). The weighted κ coefficients and 98.3% confidence interval indicate almost perfect agreement between the 2 raters beyond that which occurs by chance alone (power, 0.93 [0.85, 1.0], randomization, 0.91 [0.85, 0.98], and blinding, 0.90 [0.84, 0.96]). Our hypothesis that reported metrics of rigor in animal-experimental studies in anesthesia journals have increased during the past decade was confirmed. More consistent reporting, or explicit justification for absence
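
    The linear-trend analysis named in the abstract can be reproduced on the reported counts with a few lines. The sketch below implements a plain two-sided Cochran-Armitage trend test with equally spaced year scores, applied to the power-analysis reporting rates; it is an illustration of the method only, not a re-analysis of the authors' data set, and the choice of scores is an assumption.

```python
import numpy as np
from scipy.stats import norm

def cochran_armitage_trend(successes, totals, scores=None):
    """Two-sided Cochran-Armitage test for a linear trend in proportions."""
    r = np.asarray(successes, dtype=float)
    n = np.asarray(totals, dtype=float)
    t = np.arange(len(r), dtype=float) if scores is None else np.asarray(scores, float)
    p_bar = r.sum() / n.sum()                                   # pooled proportion
    stat = np.sum(t * (r - n * p_bar))
    var = p_bar * (1 - p_bar) * (np.sum(n * t**2) - np.sum(n * t) ** 2 / n.sum())
    z = stat / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))

# Power-analysis reporting in 2005, 2010, 2015 (counts quoted in the abstract)
z, p = cochran_armitage_trend([27, 59, 77], [516, 485, 465])
print(round(z, 2), p)   # positive z (~5.7) with a very small p: reporting increased
```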

  19. Using Project Complexity Determinations to Establish Required Levels of Project Rigor

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Thomas D.

    2015-10-01

    This presentation discusses the project complexity determination process that was developed by National Security Technologies, LLC, for the U.S. Department of Energy, National Nuclear Security Administration Nevada Field Office for implementation at the Nevada National Security Site (NNSS). The complexity determination process was developed to address the diversity of NNSS project types, sizes, and complexity; to fill the need for a single procedure with provision for tailoring the level of rigor to the project type, size, and complexity; to provide consistent, repeatable, effective application of project management processes across the enterprise; and to achieve higher levels of efficiency in project delivery. These needs are illustrated by the wide diversity of NNSS projects: Defense Experimentation, Global Security, weapons tests, military training areas, sensor development and testing, training in realistic environments, intelligence community support, environmental restoration/waste management, and disposal of radioactive waste, among others.

  20. Memory sparing, fast scattering formalism for rigorous diffraction modeling

    Science.gov (United States)

    Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.

    2017-07-01

    The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
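
    The "multiple reflection series" and its implicit reformulation can be illustrated on the simplest possible case of two scalar interfaces. The snippet below is a schematic analogue, not the S-vector algorithm or the generalized source method: it combines two scatterers once by the closed-form composition rule and once by summing the reflection series term by term, showing that the two agree when the series converges. All coefficient values are made up.

```python
def combine_closed_form(r1, t1, r1b, t1b, r2, t2):
    """Closed-form composition of two scalar scatterers (Redheffer-type rule).
    'b' marks coefficients for illumination from the back side of scatterer 1."""
    denom = 1.0 - r1b * r2
    R = r1 + t1b * r2 * t1 / denom     # total front-side reflection
    T = t2 * t1 / denom                # total transmission
    return R, T

def combine_by_reflection_series(r1, t1, r1b, t1b, r2, t2, n_terms=60):
    """Same composition via the multiple-reflection (Neumann) series:
    1 / (1 - r1b*r2) = sum_k (r1b*r2)**k, summed iteratively."""
    series = sum((r1b * r2) ** k for k in range(n_terms))
    R = r1 + t1b * r2 * t1 * series
    T = t2 * t1 * series
    return R, T

# Toy amplitude coefficients for two nearly lossless interfaces (made-up numbers)
coeffs = dict(r1=0.3, t1=0.95, r1b=-0.3, t1b=0.95, r2=0.5, t2=0.86)
print(combine_closed_form(**coeffs))
print(combine_by_reflection_series(**coeffs))   # converges to the same values
```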

  1. Rigorous Results for the Distribution of Money on Connected Graphs

    Science.gov (United States)

    Lanchier, Nicolas; Reed, Stephanie

    2018-05-01

    This paper is concerned with general spatially explicit versions of three stochastic models for the dynamics of money that have been introduced and studied numerically by statistical physicists: the uniform reshuffling model, the immediate exchange model and the model with saving propensity. All three models consist of systems of economical agents that consecutively engage in pairwise monetary transactions. Computer simulations performed in the physics literature suggest that, when the number of agents and the average amount of money per agent are large, the limiting distribution of money as time goes to infinity approaches the exponential distribution for the first model, the gamma distribution with shape parameter two for the second model and a distribution similar but not exactly equal to a gamma distribution whose shape parameter depends on the saving propensity for the third model. The main objective of this paper is to give rigorous proofs of these conjectures and also extend these conjectures to generalizations of the first two models and a variant of the third model that include local rather than global interactions, i.e., instead of choosing the two interacting agents uniformly at random from the system, the agents are located on the vertex set of a general connected graph and can only interact with their neighbors.
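
    The uniform reshuffling model described above is easy to simulate, which is how the conjectures arose in the physics literature in the first place. The sketch below runs the mean-field (complete-graph) version and compares the empirical wealth distribution with the conjectured exponential law; the number of agents, the average money, and the number of transactions are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
N_AGENTS = 1000            # number of economical agents
MEAN_MONEY = 10.0          # average amount of money per agent
N_STEPS = 500_000          # pairwise transactions

money = np.full(N_AGENTS, MEAN_MONEY)

for _ in range(N_STEPS):
    i, j = rng.choice(N_AGENTS, size=2, replace=False)   # pick two distinct agents
    pot = money[i] + money[j]
    u = rng.random()
    money[i], money[j] = u * pot, (1.0 - u) * pot         # uniform reshuffling rule

# Compare with the conjectured exponential limit: P(m) = exp(-m/T) / T, T = MEAN_MONEY
hist, edges = np.histogram(money, bins=30, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
exponential = np.exp(-centers / MEAN_MONEY) / MEAN_MONEY
print(np.round(hist[:5], 4))
print(np.round(exponential[:5], 4))   # the two rows should be close
```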

  2. Rigorous vector wave propagation for arbitrary flat media

    Science.gov (United States)

    Bos, Steven P.; Haffert, Sebastiaan Y.; Keller, Christoph U.

    2017-08-01

    Precise modelling of the (off-axis) point spread function (PSF) to identify geometrical and polarization aberrations is important for many optical systems. In order to characterise the PSF of the system in all Stokes parameters, an end-to-end simulation of the system has to be performed in which Maxwell's equations are rigorously solved. We present the first results of a python code that we are developing to perform multiscale end-to-end wave propagation simulations that include all relevant physics. Currently we can handle plane-parallel near- and far-field vector diffraction effects of propagating waves in homogeneous isotropic and anisotropic materials, refraction and reflection of flat parallel surfaces, interference effects in thin films and unpolarized light. We show that the code has a numerical precision on the order of 10^-16 for non-absorbing isotropic and anisotropic materials. For absorbing materials the precision is on the order of 10^-8. The capabilities of the code are demonstrated by simulating a converging beam reflecting from a flat aluminium mirror at normal incidence.
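
    One ingredient listed above, interference in a thin film between flat parallel surfaces, has a compact closed form for a single layer. The sketch below is a generic textbook calculation for s-polarization, not part of the authors' code; refractive indices, layer thickness, and wavelength are arbitrary illustrative numbers.

```python
import numpy as np

def fresnel_rs(n1, n2, theta1):
    """Fresnel amplitude reflection coefficient for s-polarization."""
    theta2 = np.arcsin(n1 * np.sin(theta1) / n2 + 0j)   # Snell's law (complex-safe)
    return (n1 * np.cos(theta1) - n2 * np.cos(theta2)) / \
           (n1 * np.cos(theta1) + n2 * np.cos(theta2))

def thin_film_reflectance_s(n1, n2, n3, d, wavelength, theta1=0.0):
    """Reflectance of a single homogeneous layer (n2, thickness d) between n1 and n3."""
    theta2 = np.arcsin(n1 * np.sin(theta1) / n2 + 0j)
    r12 = fresnel_rs(n1, n2, theta1)
    r23 = fresnel_rs(n2, n3, theta2)
    beta = 2.0 * np.pi * n2 * d * np.cos(theta2) / wavelength   # phase thickness
    r = (r12 + r23 * np.exp(2j * beta)) / (1.0 + r12 * r23 * np.exp(2j * beta))
    return float(np.abs(r) ** 2)

# Quarter-wave MgF2-like coating on glass in air at 550 nm (illustrative numbers)
R = thin_film_reflectance_s(1.0, 1.38, 1.52, d=550e-9 / (4 * 1.38), wavelength=550e-9)
print(round(R, 3))   # ~0.013, compared with ~0.043 for bare glass
```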

  3. Dynamics of harmonically-confined systems: Some rigorous results

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Zhigang, E-mail: zwu@physics.queensu.ca; Zaremba, Eugene, E-mail: zaremba@sparky.phy.queensu.ca

    2014-03-15

    In this paper we consider the dynamics of harmonically-confined atomic gases. We present various general results which are independent of particle statistics, interatomic interactions and dimensionality. Of particular interest is the response of the system to external perturbations which can be either static or dynamic in nature. We prove an extended Harmonic Potential Theorem which is useful in determining the damping of the centre of mass motion when the system is prepared initially in a highly nonequilibrium state. We also study the response of the gas to a dynamic external potential whose position is made to oscillate sinusoidally in a given direction. We show in this case that either the energy absorption rate or the centre of mass dynamics can serve as a probe of the optical conductivity of the system. -- Highlights: •We derive various rigorous results on the dynamics of harmonically-confined atomic gases. •We derive an extension of the Harmonic Potential Theorem. •We demonstrate the link between the energy absorption rate in a harmonically-confined system and the optical conductivity.
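
    As a schematic baseline for the theorem discussed above (a sketch only, not the extended version proved in the paper), the centre-of-mass coordinate of N particles of mass m in an isotropic harmonic trap of frequency \omega_0 obeys

        M \ddot{\mathbf{R}}_{\mathrm{cm}}(t) = -M \omega_0^2 \, \mathbf{R}_{\mathrm{cm}}(t) + \mathbf{F}_{\mathrm{ext}}(t), \qquad M = N m,

    so that, in the absence of external driving, the centre of mass oscillates at the bare trap frequency irrespective of interactions, statistics and dimensionality.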

  4. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    Science.gov (United States)

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.
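
    For orientation, a minimal ensemble Kalman filter analysis step with perturbed observations is sketched below; it is a generic textbook formulation with made-up dimensions, observation operator and data, not the forecast model or filter configuration analysed in the article.

        # Minimal ensemble Kalman filter analysis step with perturbed observations.
        # Generic textbook formulation; all dimensions and data are made-up placeholders.
        import numpy as np

        rng = np.random.default_rng(1)
        n_state, n_obs, n_ens = 4, 2, 20

        H = rng.standard_normal((n_obs, n_state))         # linear observation operator
        R = 0.1 * np.eye(n_obs)                           # observation error covariance
        ensemble = rng.standard_normal((n_state, n_ens))  # forecast ensemble (columns = members)
        y = rng.standard_normal(n_obs)                    # observation vector

        X = ensemble - ensemble.mean(axis=1, keepdims=True)     # ensemble anomalies
        P = X @ X.T / (n_ens - 1)                               # sample forecast covariance
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)            # Kalman gain

        y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
        analysis = ensemble + K @ (y_pert - H @ ensemble)       # member-by-member update
        print(analysis.mean(axis=1))                            # analysis ensemble mean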

  5. PRO development: rigorous qualitative research as the crucial foundation.

    Science.gov (United States)

    Lasch, Kathryn Eilene; Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen

    2010-10-01

    Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give little for the reader to evaluate the content validity of the measures and the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity.

  6. Rigorous time slicing approach to Feynman path integrals

    CERN Document Server

    Fujiwara, Daisuke

    2017-01-01

    This book proves that Feynman's original definition of the path integral actually converges to the fundamental solution of the Schrödinger equation at least in the short term if the potential is differentiable sufficiently many times and its derivatives of order equal to or higher than two are bounded. The semi-classical asymptotic formula up to the second term of the fundamental solution is also proved by a method different from that of Birkhoff. A bound of the remainder term is also proved. The Feynman path integral is a method of quantization using the Lagrangian function, whereas Schrödinger's quantization uses the Hamiltonian function. These two methods are believed to be equivalent. But equivalence is not fully proved mathematically, because, compared with Schrödinger's method, there is still much to be done concerning rigorous mathematical treatment of Feynman's method. Feynman himself defined a path integral as the limit of a sequence of integrals over finite-dimensional spaces which is obtained by...
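
    Written schematically in standard notation (and without the book's precise hypotheses), the time-sliced definition referred to above is

        K(x, y; T) = \lim_{N \to \infty} \int_{\mathbb{R}^{N-1}} \prod_{j=1}^{N} \Bigl( \tfrac{m}{2 \pi i \hbar \Delta t} \Bigr)^{1/2} \exp\Bigl( \tfrac{i}{\hbar} \sum_{j=1}^{N} \Delta t \, L\bigl( \tfrac{x_j - x_{j-1}}{\Delta t}, x_j \bigr) \Bigr) \, dx_1 \cdots dx_{N-1},

    with \Delta t = T/N, x_0 = y and x_N = x; the book establishes that this limit converges to the fundamental solution of the Schrödinger equation under the stated conditions on the potential.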

  7. Upgrading geometry conceptual understanding and strategic competence through implementing rigorous mathematical thinking (RMT)

    Science.gov (United States)

    Nugraheni, Z.; Budiyono, B.; Slamet, I.

    2018-03-01

    To reach higher-order thinking skills, conceptual understanding and strategic competence need to be mastered, as they are two basic parts of higher-order thinking skills (HOTS). RMT is a unique realization of the cognitive conceptual construction approach based on Feuerstein's theory of Mediated Learning Experience (MLE) and Vygotsky's sociocultural theory. This was a quasi-experimental study that compared an experimental class taught with Rigorous Mathematical Thinking (RMT) as the learning method and a control class taught with Direct Learning (DL) as the conventional learning activity. The study examined whether the two learning models had different effects on the conceptual understanding and strategic competence of junior high school students. The data were analyzed using Multivariate Analysis of Variance (MANOVA), which showed a significant difference between the experimental and control classes when mathematics conceptual understanding and strategic competence were considered jointly (Wilk’s Λ = 0.84). Further, independent t-tests showed significant differences between the two classes on both mathematical conceptual understanding and strategic competence. These results indicate that Rigorous Mathematical Thinking (RMT) had a positive impact on mathematics conceptual understanding and strategic competence.
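
    To make the reported statistic concrete, the sketch below computes Wilks' lambda for a two-group, two-outcome design (conceptual understanding and strategic competence) from within-group and total scatter matrices; the scores are synthetic placeholders, not the study's data.

        # Wilks' lambda for two groups and two outcome variables.
        # The scores are synthetic placeholders, not the study's measurements.
        import numpy as np

        rng = np.random.default_rng(0)
        experimental = rng.normal(loc=[75, 70], scale=8, size=(40, 2))   # RMT class (illustrative)
        control = rng.normal(loc=[68, 65], scale=8, size=(40, 2))        # DL class (illustrative)

        def scatter(x):
            d = x - x.mean(axis=0)
            return d.T @ d

        W = scatter(experimental) + scatter(control)        # within-group scatter
        T = scatter(np.vstack([experimental, control]))     # total scatter (within + between)
        wilks_lambda = np.linalg.det(W) / np.linalg.det(T)  # values near 1 indicate little group effect
        print(wilks_lambda)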

  8. Can homeopathy withstand scientific testing?

    NARCIS (Netherlands)

    Sluijs, F.J. van

    2004-01-01

    What is the importance of homeopathy in veterinary practice? The television appearance of E.L. Ellinger (DVM) during the Foot and Mouth (FMD) crisis in 2001 gave an insight into the views held by veterinarians practicing homeopathy. And we can be confident these views were representative because

  9. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Prowell, Stacy J [ORNL; Symons, Christopher T [ORNL

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  10. Stochastic Geometry and Quantum Gravity: Some Rigorous Results

    Science.gov (United States)

    Zessin, H.

    The aim of these lectures is a short introduction into some recent developments in stochastic geometry which have one of its origins in simplicial gravity theory (see Regge Nuovo Cimento 19: 558-571, 1961). The aim is to define and construct rigorously point processes on spaces of Euclidean simplices in such a way that the configurations of these simplices are simplicial complexes. The main interest then is concentrated on their curvature properties. We illustrate certain basic ideas from a mathematical point of view. An excellent representation of this area can be found in Schneider and Weil (Stochastic and Integral Geometry, Springer, Berlin, 2008. German edition: Stochastische Geometrie, Teubner, 2000). In Ambjørn et al. (Quantum Geometry Cambridge University Press, Cambridge, 1997) you find a beautiful account from the physical point of view. More recent developments in this direction can be found in Ambjørn et al. ("Quantum gravity as sum over spacetimes", Lect. Notes Phys. 807. Springer, Heidelberg, 2010). After an informal axiomatic introduction into the conceptual foundations of Regge's approach the first lecture recalls the concepts and notations used. It presents the fundamental zero-infinity law of stochastic geometry and the construction of cluster processes based on it. The second lecture presents the main mathematical object, i.e. Poisson-Delaunay surfaces possessing an intrinsic random metric structure. The third and fourth lectures discuss their ergodic behaviour and present the two-dimensional Regge model of pure simplicial quantum gravity. We terminate with the formulation of basic open problems. Proofs are given in detail only in a few cases. In general the main ideas are developed. Sufficiently complete references are given.

  11. A rigorous derivation of gravitational self-force

    International Nuclear Information System (INIS)

    Gralla, Samuel E; Wald, Robert M

    2008-01-01

    There is general agreement that the MiSaTaQuWa equations should describe the motion of a 'small body' in general relativity, taking into account the leading order self-force effects. However, previous derivations of these equations have made a number of ad hoc assumptions and/or contain a number of unsatisfactory features. For example, all previous derivations have invoked, without proper justification, the step of 'Lorenz gauge relaxation', wherein the linearized Einstein equation is written in the form appropriate to the Lorenz gauge, but the Lorenz gauge condition is then not imposed-thereby making the resulting equations for the metric perturbation inequivalent to the linearized Einstein equations. (Such a 'relaxation' of the linearized Einstein equations is essential in order to avoid the conclusion that 'point particles' move on geodesics.) In this paper, we analyze the issue of 'particle motion' in general relativity in a systematic and rigorous way by considering a one-parameter family of metrics, g ab (λ), corresponding to having a body (or black hole) that is 'scaled down' to zero size and mass in an appropriate manner. We prove that the limiting worldline of such a one-parameter family must be a geodesic of the background metric, g ab (λ = 0). Gravitational self-force-as well as the force due to coupling of the spin of the body to curvature-then arises as a first-order perturbative correction in λ to this worldline. No assumptions are made in our analysis apart from the smoothness and limit properties of the one-parameter family of metrics, g ab (λ). Our approach should provide a framework for systematically calculating higher order corrections to gravitational self-force, including higher multipole effects, although we do not attempt to go beyond first-order calculations here. The status of the MiSaTaQuWa equations is explained
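
    Schematically (a sketch of the structure described above, not the paper's precise statement), the limiting worldline and its first correction can be written as

        z^{a}(\tau, \lambda) = z_{0}^{a}(\tau) + \lambda \, z_{1}^{a}(\tau) + O(\lambda^{2}), \qquad u_{0}^{b} \nabla_{b} u_{0}^{a} = 0,

    i.e. the zeroth-order worldline is a geodesic of the background metric g_ab(λ = 0), while the first-order deviation z_1^a is governed by the gravitational self-force together with the spin-curvature coupling.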

  12. Study of a spherical torus based volumetric neutron source for nuclear technology testing and development. Final report of a scientific research supported by the USDOE/SBIR program

    International Nuclear Information System (INIS)

    Cheng, E.T.

    1999-01-01

    A plasma-based, deuterium and tritium (DT) fueled, volumetric 14 MeV neutron source (VNS) has been considered as a possible facility to support the development of the demonstration fusion power reactor (DEMO). It can be used to test and develop the necessary fusion blanket and divertor components and provide a sufficient database, particularly on the reliability of nuclear components necessary for DEMO. The VNS device complements ITER by reducing the cost and risk in the development of DEMO. A low-cost, scientifically attractive, and technologically feasible volumetric neutron source based on the spherical torus (ST) concept has been conceived. The ST-VNS, which has a major radius of 1.07 m, aspect ratio 1.4, and plasma elongation 3, can produce a neutron wall loading from 0.5 to 5 MW/m² at the outboard test section with a modest fusion power level from 38 to 380 MW. It can be used to test necessary nuclear technologies for a fusion power reactor and to develop fusion core components, including the divertor, first wall, and power blanket. Using staged operation leading to high neutron wall loading and optimistic availability, a neutron fluence of more than 30 MW·y/m² is obtainable within 20 years of operation. This will permit assessment of the lifetime and reliability of promising fusion core components in a reactor-relevant environment. A full-scale demonstration of power reactor fusion core components is also made possible because of the high neutron wall loading capability. Tritium breeding in such a full-scale demonstration can be very useful to ensure the self-sufficiency of the fuel cycle for a candidate power blanket concept.

  13. 1995 Scientific Report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-11-01

    This annual scientific report of SCK-CEN presents comprehensive coverage of research activities in the fields of (a) waste and site restoration, (b) reactor safety and radiation protection, (c) operation of the BR2 Materials Testing Reactor, and (d) services provided by the center (analysis for characterization of waste packages, nuclear measurements, and low-level radioactivity measurements).

  14. Experimental evaluation of rigor mortis. III. Comparative study of the evolution of rigor mortis in different sized muscle groups in rats.

    Science.gov (United States)

    Krompecher, T; Fryc, O

    1978-01-01

    The use of new methods and an appropriate apparatus has allowed us to make successive measurements of rigor mortis and a study of its evolution in the rat. By a comparative examination on the front and hind limbs, we have determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time and this state is maintained for 2 hours. (4) Resolution of rigor mortis is accelerated in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.

  15. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.

    Science.gov (United States)

    Meltzer, S J; Auer, J

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions-nearly equimolecular to "physiological" solutions of sodium chloride-are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.

  16. Pearce element ratios: A paradigm for testing hypotheses

    Science.gov (United States)

    Russell, J. K.; Nicholls, Jim; Stanley, Clifford R.; Pearce, T. H.

    Science moves forward with the development of new ideas that are encapsulated by hypotheses whose aim is to explain the structure of data sets or to expand existing theory. These hypotheses remain conjecture until they have been tested. In fact, Karl Popper advocated that a scientist's job does not finish with the creation of an idea but, rather, begins with the testing of the related hypotheses. In Popper's [1959] advocation it is implicit that there be tools with which we can test our hypotheses. Consequently, the development of rigorous tests for conceptual models plays a major role in maintaining the integrity of scientific endeavor [e.g., Greenwood, 1989].

  17. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  18. Rigorous upper bounds for transport due to passive advection by inhomogeneous turbulence

    International Nuclear Information System (INIS)

    Krommes, J.A.; Smith, R.A.

    1987-05-01

    A variational procedure, due originally to Howard and explored by Busse and others for self-consistent turbulence problems, is employed to determine rigorous upper bounds for the advection of a passive scalar through an inhomogeneous turbulent slab with arbitrary generalized Reynolds number R and Kubo number K. In the basic version of the method, the steady-state energy balance is used as a constraint; the resulting bound, though rigorous, is independent of K. A pedagogical reference model (one dimension, K = ∞) is described in detail; the bound compares favorably with the exact solution. The direct-interaction approximation is also worked out for this model; it is somewhat more accurate than the bound, but requires considerably more labor to solve. For the basic bound, a general formalism is presented for several dimensions, finite correlation length, and reasonably general boundary conditions. Part of the general method, in which a Green's function technique is employed, applies to self-consistent as well as to passive problems, and thereby generalizes previous results in the fluid literature. The formalism is extended for the first time to include time-dependent constraints, and a bound is deduced which explicitly depends on K and has the correct physical scalings in all regimes of R and K. Two applications from the theory of turbulent plasmas are described: flux in velocity space, and test particle transport in stochastic magnetic fields. For the velocity space problem the simplest bound reproduces Dupree's original scaling for the strong turbulence diffusion coefficient. For the case of stochastic magnetic fields, the scaling of the bounds is described for the magnetic diffusion coefficient as well as for the particle diffusion coefficient in the so-called collisionless, fluid, and double-streaming regimes.

  19. Wedding Rigorous Scientific Methodology and Ancient Herbal Wisdom to Benefit Cancer Patients: The Development of PHY906.

    Science.gov (United States)

    Chu, Edward

    2018-02-15

    Our research group has extensively characterized the preclinical and clinical activities of PHY906, a traditional Chinese herbal medicine, as a modulator of irinotecan-based chemotherapy for the treatment of colorectal cancer. This article reviews the critical issues of quality control and standardization of PHY906 and highlights the importance of high-quality material for the conduct of preclinical and clinical studies. Studies to investigate the potential biological mechanisms of action using a systems biology approach play a pivotal role in providing the preclinical rationale to move forward with clinical studies. For early-phase clinical studies, translational biomarkers should be incorporated to characterize the biological effects of the herbal medicine. These biomarkers include tumor mutational load, cytokine/chemokine expression, metabolomic profiling, and the presence of key herbal metabolites. Sophisticated bioinformatic approaches are critical for mining the data and identifying those biomarkers that can define the subset of patients who will benefit from PHY906 or any other herbal medicine, in terms of reduced treatment toxicity, improved quality of life, and/or enhanced clinical activity of treatment.

  20. Aviation Flight Test

    Data.gov (United States)

    Federal Laboratory Consortium — Redstone Test Center provides an expert workforce and technologically advanced test equipment to conduct the rigorous testing necessary for U.S. Army acquisition and...

  1. Rigorous Photogrammetric Processing of CHANG'E-1 and CHANG'E-2 Stereo Imagery for Lunar Topographic Mapping

    Science.gov (United States)

    Di, K.; Liu, Y.; Liu, B.; Peng, M.

    2012-07-01

    Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of CE-1 and CE-2 CCD cameras based on push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinate of a ground point in lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points are different from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining EOPs by correcting the attitude angle bias, 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high precision DEM (Digital Elevation Model) and DOM (Digital Ortho Map) are automatically generated.
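
    To make the back-projection residual concrete, the sketch below projects one ground point through a simplified frame-camera model and differences the result against a measured image coordinate; the orientation parameters and coordinates are made-up placeholders, and the actual CE-1/CE-2 model is a push-broom model with per-line exterior orientation.

        # Back-projection residual for one ground point through a simplified frame-camera model.
        # All parameters and coordinates are placeholders; the real CE-1/CE-2 sensor model is a
        # push-broom (line-scanner) model with per-line exterior orientation parameters.
        import numpy as np

        f = 0.05                                    # focal length [m] (placeholder)
        pixel_size = 1.0e-5                         # [m/pixel] (placeholder)
        R = np.diag([1.0, -1.0, -1.0])              # nadir-pointing attitude (placeholder)
        C = np.array([0.0, 0.0, 1.837e6])           # camera position in LBF coordinates [m]

        ground_point = np.array([1.2e3, -0.8e3, 1.737e6])    # 3D point in LBF [m] (placeholder)
        measured_px = np.array([60.02, 39.97])                # measured image coordinates [pixel]

        p_cam = R @ (ground_point - C)                        # point in the camera frame
        projected_px = (f / p_cam[2]) * p_cam[:2] / pixel_size

        residual = measured_px - projected_px                 # back-projection residual [pixel]
        print(residual)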

  2. RIGOROUS PHOTOGRAMMETRIC PROCESSING OF CHANG'E-1 AND CHANG'E-2 STEREO IMAGERY FOR LUNAR TOPOGRAPHIC MAPPING

    Directory of Open Access Journals (Sweden)

    K. Di

    2012-07-01

    Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of CE-1 and CE-2 CCD cameras based on push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinate of a ground point in lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points are different from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining EOPs by correcting the attitude angle bias, 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high precision DEM (Digital Elevation Model) and DOM (Digital Ortho Map) are automatically generated.

  3. ATP, IMP, and glycogen in cod muscle at onset and during development of rigor mortis depend on the sampling location

    DEFF Research Database (Denmark)

    Cappeln, Gertrud; Jessen, Flemming

    2002-01-01

    Variation in glycogen, ATP, and IMP contents within individual cod muscles was studied in ice-stored fish during the progress of rigor mortis. Rigor index was determined before muscle samples for chemical analyses were taken at 16 different positions on the fish. During development of rigor, the contents of glycogen and ATP decreased differently in relation to rigor index, depending on sampling location. Although fish were considered to be in strong rigor according to the rigor index method, parts of the muscle were not in rigor, as high ATP concentrations were found in dorsal and tail muscle.

  4. Trends in Methodological Rigor in Intervention Research Published in School Psychology Journals

    Science.gov (United States)

    Burns, Matthew K.; Klingbeil, David A.; Ysseldyke, James E.; Petersen-Brown, Shawna

    2012-01-01

    Methodological rigor in intervention research is important for documenting evidence-based practices and has been a recent focus in legislation, including the No Child Left Behind Act. The current study examined the methodological rigor of intervention research in four school psychology journals since the 1960s. Intervention research has increased…

  5. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles.

    Science.gov (United States)

    Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). An increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, NaCl concentrations of 3% and 4% showed no great differences in the physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle.

  6. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles

    Science.gov (United States)

    Choi, Yun-Sang

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). An increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, NaCl concentrations of 3% and 4% showed no great differences in the physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle. PMID:26761884

  7. Rigorous bounds on the free energy of electron-phonon models

    NARCIS (Netherlands)

    Raedt, Hans De; Michielsen, Kristel

    1997-01-01

    We present a collection of rigorous upper and lower bounds to the free energy of electron-phonon models with linear electron-phonon interaction. These bounds are used to compare different variational approaches. It is shown rigorously that the ground states corresponding to the sharpest bounds do

  8. The Relationship between Project-Based Learning and Rigor in STEM-Focused High Schools

    Science.gov (United States)

    Edmunds, Julie; Arshavsky, Nina; Glennie, Elizabeth; Charles, Karen; Rice, Olivia

    2016-01-01

    Project-based learning (PjBL) is an approach often favored in STEM classrooms, yet some studies have shown that teachers struggle to implement it with academic rigor. This paper explores the relationship between PjBL and rigor in the classrooms of ten STEM-oriented high schools. Utilizing three different data sources reflecting three different…

  9. Moving beyond Data Transcription: Rigor as Issue in Representation of Digital Literacies

    Science.gov (United States)

    Hagood, Margaret Carmody; Skinner, Emily Neil

    2015-01-01

    Rigor in qualitative research has been based upon criteria of credibility, dependability, confirmability, and transferability. Drawing upon articles published during our editorship of the "Journal of Adolescent & Adult Literacy," we illustrate how the use of digital data in research study reporting may enhance these areas of rigor,…

  10. A new look at the statistical assessment of approximate and rigorous methods for the estimation of stabilized formation temperatures in geothermal and petroleum wells

    International Nuclear Information System (INIS)

    Espinoza-Ojeda, O M; Santoyo, E; Andaverde, J

    2011-01-01

    Approximate and rigorous solutions of seven heat transfer models were statistically examined, for the first time, to estimate stabilized formation temperatures (SFT) of geothermal and petroleum boreholes. Constant linear and cylindrical heat source models were used to describe the heat flow (either conductive or conductive/convective) involved during a borehole drilling. A comprehensive statistical assessment of the major error sources associated with the use of these models was carried out. The mathematical methods (based on approximate and rigorous solutions of heat transfer models) were thoroughly examined by using four statistical analyses: (i) the use of linear and quadratic regression models to infer the SFT; (ii) the application of statistical tests of linearity to evaluate the actual relationship between bottom-hole temperatures and time function data for each selected method; (iii) the comparative analysis of SFT estimates between the approximate and rigorous predictions of each analytical method using a β ratio parameter to evaluate the similarity of both solutions, and (iv) the evaluation of accuracy in each method using statistical tests of significance, and deviation percentages between 'true' formation temperatures and SFT estimates (predicted from approximate and rigorous solutions). The present study also enabled us to determine the sensitivity parameters that should be considered for a reliable calculation of SFT, as well as to define the main physical and mathematical constraints where the approximate and rigorous methods could provide consistent SFT estimates
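
    As a concrete instance of the kind of approximate method being assessed (sketched here with the familiar Horner-type time function and synthetic bottom-hole temperatures, not data or a specific model from the study), the SFT is read off as the intercept of a linear regression:

        # Horner-plot style estimate of the stabilized formation temperature (SFT).
        # Synthetic bottom-hole temperatures (BHT); not measurements from the study.
        import numpy as np

        t_c = 5.0                                          # circulation time [h] (placeholder)
        dt = np.array([6.0, 12.0, 18.0, 24.0, 36.0])       # shut-in times [h]
        bht = np.array([92.1, 98.5, 102.0, 104.3, 107.2])  # BHT [deg C] (synthetic)

        # Horner dimensionless time function; BHT is modelled as linear in it, and the
        # intercept (time function -> 0, i.e. infinite shut-in time) estimates the SFT.
        x = np.log((t_c + dt) / dt)
        slope, intercept = np.polyfit(x, bht, 1)
        print("estimated SFT [deg C]:", intercept)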

  11. Onset of rigor mortis is earlier in red muscle than in white muscle.

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H

    2000-01-01

    Rigor mortis is thought to be related to falling ATP levels in muscles postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles in liquid paraffin kept at 37 degrees C or 25 degrees C--two red muscles, red gastrocnemius (RG) and soleus (SO) and one white muscle, white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG both at 37 degrees C and at 25 degrees C even though RG and WG were portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.

  12. High and low rigor temperature effects on sheep meat tenderness and ageing.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J

    2002-02-01

    Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LT's was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C (P<0.05); the shear force values for the two rigor temperatures at each subsequent ageing time were significantly different (P<0.05), and the values for 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.

  13. Semantically-Rigorous Systems Engineering Modeling Using Sysml and OWL

    Science.gov (United States)

    Jenkins, J. Steven; Rouquette, Nicolas F.

    2012-01-01

    The Systems Modeling Language (SysML) has found wide acceptance as a standard graphical notation for the domain of systems engineering. SysML subsets and extends the Unified Modeling Language (UML) to define conventions for expressing structural, behavioral, and analytical elements, and relationships among them. SysML-enabled modeling tools are available from multiple providers, and have been used for diverse projects in military aerospace, scientific exploration, and civil engineering. The Web Ontology Language (OWL) has found wide acceptance as a standard notation for knowledge representation. OWL-enabled modeling tools are available from multiple providers, as well as auxiliary assets such as reasoners and application programming interface libraries, etc. OWL has been applied to diverse projects in a wide array of fields. While the emphasis in SysML is on notation, SysML inherits (from UML) a semantic foundation that provides for limited reasoning and analysis. UML's partial formalization (FUML), however, does not cover the full semantics of SysML, which is a substantial impediment to developing high confidence in the soundness of any conclusions drawn therefrom. OWL, by contrast, was developed from the beginning on formal logical principles, and consequently provides strong support for verification of consistency and satisfiability, extraction of entailments, conjunctive query answering, etc. This emphasis on formal logic is counterbalanced by the absence of any graphical notation conventions in the OWL standards. Consequently, OWL has had only limited adoption in systems engineering. The complementary strengths and weaknesses of SysML and OWL motivate an interest in combining them in such a way that we can benefit from the attractive graphical notation of SysML and the formal reasoning of OWL. This paper describes an approach to achieving that combination.

  14. Scientific progress report 1980

    International Nuclear Information System (INIS)

    1981-01-01

    The R+D projects in this field and the infrastructural tasks mentioned are handled in seven working groups and two project groups: computer systems; numerical and applied mathematics; software development; process calculation systems hardware; nuclear electronics, measuring and automatic control techniques; research on component parts and irradiation tests; central data processing; processing of process data in medicine; and co-operation in the BERNET project at the 'Wissenschaftliches Rechenzentrum Berlin (WRB)' (scientific computer center in Berlin). (orig./WB)

  15. Criteria and tools for scientific software quality measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tseng, M Y [Previse Inc., Willowdale ON (Canada)

    1995-12-01

    Not all software used in the nuclear industry needs the rigorous formal verification, reliability testing and quality assessment that are being applied to safety critical software. Recently, however, there is increasing recognition that systematic and objective quality assessment of the scientific software used in design and safety analyses of nuclear facilities is necessary to support safety and licensing decisions. Because of the complexity and large size of these programs and the resource constraints faced by the AECB reviewer, it is desirable that appropriate automated tools are used wherever practical. To objectively assess the quality of software, a set of attributes of a software product by which its quality is described and evaluated must be established. These attributes must be relevant to the application domain of software under evaluation. To effectively assess the quality of software, metrics defining quantitative scale and method appropriate to determine the value of attributes need to be applied. To cost-effectively perform the evaluation, use of suitable automated tools is desirable. In this project, criteria for evaluating the quality of scientific software are presented; metrics for which those criteria can be evaluated are identified; a survey of automated tools to measure those metrics was conducted and the most appropriate tool (QA Fortran) was acquired; and the tool usage was demonstrated on three sample programs. (author) 5 refs.

  16. Criteria and tools for scientific software quality measurements

    International Nuclear Information System (INIS)

    Tseng, M.Y.

    1995-12-01

    Not all software used in the nuclear industry needs the rigorous formal verification, reliability testing and quality assessment that are being applied to safety critical software. Recently, however, there is increasing recognition that systematic and objective quality assessment of the scientific software used in design and safety analyses of nuclear facilities is necessary to support safety and licensing decisions. Because of the complexity and large size of these programs and the resource constraints faced by the AECB reviewer, it is desirable that appropriate automated tools are used wherever practical. To objectively assess the quality of software, a set of attributes of a software product by which its quality is described and evaluated must be established. These attributes must be relevant to the application domain of software under evaluation. To effectively assess the quality of software, metrics defining quantitative scale and method appropriate to determine the value of attributes need to be applied. To cost-effectively perform the evaluation, use of suitable automated tools is desirable. In this project, criteria for evaluating the quality of scientific software are presented; metrics for which those criteria can be evaluated are identified; a survey of automated tools to measure those metrics was conducted and the most appropriate tool (QA Fortran) was acquired; and the tool usage was demonstrated on three sample programs. (author) 5 refs

  17. Computation of Quasiperiodic Normally Hyperbolic Invariant Tori: Rigorous Results

    Science.gov (United States)

    Canadell, Marta; Haro, Àlex

    2017-12-01

    The development of efficient methods for detecting quasiperiodic oscillations and computing the corresponding invariant tori is a subject of great importance in dynamical systems and their applications in science and engineering. In this paper, we prove the convergence of a new Newton-like method for computing quasiperiodic normally hyperbolic invariant tori carrying quasiperiodic motion in smooth families of real-analytic dynamical systems. The main result is stated as an a posteriori KAM-like theorem that allows controlling the inner dynamics on the torus with appropriate detuning parameters, in order to obtain a prescribed quasiperiodic motion. The Newton-like method leads to several fast and efficient computational algorithms, which are discussed and tested in a companion paper (Canadell and Haro in J Nonlinear Sci, 2017. doi: 10.1007/s00332-017-9388-z), in which new mechanisms of breakdown are presented.
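
    The central object of such computations can be summarized (schematically, for a discrete-time system, and not in the paper's full generality) by the invariance equation

        F_{\mu}\bigl(K(\theta)\bigr) = K(\theta + \omega), \qquad \theta \in \mathbb{T}^{d},

    where K parameterizes the torus, ω is the prescribed frequency vector and μ collects the detuning parameters; each Newton-like step solves the linearization of this equation for corrections (ΔK, Δμ), and the a posteriori theorem guarantees convergence when the initial error is sufficiently small.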

  18. Differential rigor development in red and white muscle revealed by simultaneous measurement of tension and stiffness.

    Science.gov (United States)

    Kobayashi, Masahiko; Takemori, Shigeru; Yamaguchi, Maki

    2004-02-10

    Based on the molecular mechanism of rigor mortis, we have proposed that stiffness (elastic modulus evaluated with tension response against minute length perturbations) can be a suitable index of post-mortem rigidity in skeletal muscle. To trace the developmental process of rigor mortis, we measured stiffness and tension in both red and white rat skeletal muscle kept in liquid paraffin at 37 and 25 degrees C. White muscle (in which type IIB fibres predominate) developed stiffness and tension significantly more slowly than red muscle, except for soleus red muscle at 25 degrees C, which showed disproportionately slow rigor development. In each of the examined muscles, stiffness and tension developed more slowly at 25 degrees C than at 37 degrees C. In each specimen, tension always reached its maximum level earlier than stiffness, and then decreased more rapidly and markedly than stiffness. These phenomena may account for the sequential progress of rigor mortis in human cadavers.

  19. Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).

    Science.gov (United States)

    Suzutani, T; Ishibashi, H; Takatori, T

    1978-11-01

    The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.

  20. 75 FR 29732 - Career and Technical Education Program-Promoting Rigorous Career and Technical Education Programs...

    Science.gov (United States)

    2010-05-27

    ... rigorous knowledge and skills in English-language arts and mathematics that employers and colleges expect... specialists and to access the student outcome data needed to meet annual evaluation and reporting requirements...

  1. Rigorous derivation from Landau-de Gennes theory to Ericksen-Leslie theory

    OpenAIRE

    Wang, Wei; Zhang, Pingwen; Zhang, Zhifei

    2013-01-01

    Starting from Beris-Edwards system for the liquid crystal, we present a rigorous derivation of Ericksen-Leslie system with general Ericksen stress and Leslie stress by using the Hilbert expansion method.
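
    Schematically (a sketch of the method named above, not the paper's precise statement), the Hilbert expansion posits

        Q^{\varepsilon}(x, t) = Q_{0}(x, t) + \varepsilon \, Q_{1}(x, t) + \varepsilon^{2} Q_{2}(x, t) + \cdots,

    substitutes it into the Beris-Edwards system and matches powers of ε; at leading order Q_0 is forced to be a uniaxial tensor determined by a director field, and that director is shown to satisfy the Ericksen-Leslie system with the general Ericksen and Leslie stresses.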

  2. Realism, instrumentalism, and scientific symbiosis: psychological theory as a search for truth and the discovery of solutions.

    Science.gov (United States)

    Cacioppo, John T; Semin, Gün R; Berntson, Gary G

    2004-01-01

    Scientific realism holds that scientific theories are approximations of universal truths about reality, whereas scientific instrumentalism posits that scientific theories are intellectual structures that provide adequate predictions of what is observed and useful frameworks for answering questions and solving problems in a given domain. These philosophical perspectives have different strengths and weaknesses and have been regarded as incommensurate: Scientific realism fosters theoretical rigor, verifiability, parsimony, and debate, whereas scientific instrumentalism fosters theoretical innovation, synthesis, generativeness, and scope. The authors review the evolution of scientific realism and instrumentalism in psychology and propose that the categorical distinction between the 2 is overstated as a prescription for scientific practice. The authors propose that the iterative deployment of these 2 perspectives, just as the iterative application of inductive and deductive reasoning in science, may promote more rigorous, integrative, cumulative, and useful scientific theories.

  3. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    ...establishing unit level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and... cues) ideally should meet or exceed effective rigor (based on analytical process). To accomplish this, decision makers should not be left to their...

  4. Lyme disease: a rigorous review of diagnostic criteria and treatment.

    Science.gov (United States)

    Borchers, Andrea T; Keen, Carl L; Huntley, Arthur C; Gershwin, M Eric

    2015-02-01

    Lyme disease was originally identified in Lyme, Connecticut, based upon an unusual cluster of what appeared to be patients with juvenile rheumatoid arthritis. It was subsequently identified as a new clinical entity originally called Lyme arthritis based on the observation that arthritis was a major clinical feature. However, Lyme arthritis is now called Lyme disease based upon the understanding that the clinical features include not only arthritis, but also potential cardiac, dermatologic and neurologic findings. Lyme disease typically begins with an erythematous rash called erythema migrans (EM). Approximately 4-8% of patients develop cardiac involvement, 11% develop neurologic involvement, and 45-60% of patients manifest arthritis. The disease is transmitted following the bite of a tick carrying a spirochete, in a genetically susceptible host. There is considerable data on spirochetes, including Borrelia burgdorferi (Bb), the original bacteria identified in this disease. Lyme disease, if an organism had not been identified, would be considered as a classic autoimmune disease and indeed the effector mechanisms are similar to many human diseases manifest as loss of tolerance. The clinical diagnosis is highly likely based upon appropriate serology and clinical manifestations. However, the serologic features are often misinterpreted and may have false positives if confirmatory laboratory testing is not performed. Antibiotics are routinely and typically used to treat patients with Lyme disease, but there is no evidence that prolonged or recurrent treatment with antibiotics changes the natural history of Lyme disease. Although there are animal models of Lyme disease, there is no system that faithfully recapitulates the human disease. The effector mechanisms that lead to pathology in some individuals should be further explored to develop more specific therapy. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Total integrated dose testing of solid-state scientific CD4011, CD4013, and CD4060 devices by irradiation with CO-60 gamma rays

    Science.gov (United States)

    Dantas, A. R. V.; Gauthier, M. K.; Coss, J. R.

    1985-01-01

    The total integrated dose response of three CMOS devices manufactured by Solid State Scientific has been measured using CO-60 gamma rays. Key parameter measurements were made and compared for each device type. The data show that the CD4011, CD4013, and CD4060 produced by this manufacturer should not be used in any environments where radiation levels might exceed 1,000 rad(Si).

  6. Rigorous Training of Dogs Leads to High Accuracy in Human Scent Matching-To-Sample Performance.

    Directory of Open Access Journals (Sweden)

    Sophie Marchal

    Human scent identification is based on a matching-to-sample task in which trained dogs are required to compare a scent sample collected from an object found at a crime scene to that of a suspect. Based on dogs' greater olfactory ability to detect and process odours, this method has been used in forensic investigations to identify the odour of a suspect at a crime scene. The excellent reliability and reproducibility of the method largely depend on rigor in dog training. The present study describes the various steps of training that lead to high sensitivity scores, with dogs matching samples with 90% efficiency when the complexity of the scents presented during the task in the sample is similar to that presented in the lineups, and specificity reaching a ceiling, with no false alarms in human scent matching-to-sample tasks. This high level of accuracy ensures reliable results in judicial human scent identification tests. Also, our data should convince law enforcement authorities to use these results as official forensic evidence when dogs are trained appropriately.

  7. Study designs for identifying risk compensation behavior among users of biomedical HIV prevention technologies: balancing methodological rigor and research ethics.

    Science.gov (United States)

    Underhill, Kristen

    2013-10-01

    The growing evidence base for biomedical HIV prevention interventions - such as oral pre-exposure prophylaxis, microbicides, male circumcision, treatment as prevention, and eventually prevention vaccines - has given rise to concerns about the ways in which users of these biomedical products may adjust their HIV risk behaviors based on the perception that they are prevented from infection. Known as risk compensation, this behavioral adjustment draws on the theory of "risk homeostasis," which has previously been applied to phenomena as diverse as Lyme disease vaccination, insurance mandates, and automobile safety. Little rigorous evidence exists to answer risk compensation concerns in the biomedical HIV prevention literature, in part because the field has not systematically evaluated the study designs available for testing these behaviors. The goals of this Commentary are to explain the origins of risk compensation behavior in risk homeostasis theory, to reframe risk compensation as a testable response to the perception of reduced risk, and to assess the methodological rigor and ethical justification of study designs aiming to isolate risk compensation responses. Although the most rigorous methodological designs for assessing risk compensation behavior may be unavailable due to ethical flaws, several strategies can help investigators identify potential risk compensation behavior during Phase II, Phase III, and Phase IV testing of new technologies. Where concerns arise regarding risk compensation behavior, empirical evidence about the incidence, types, and extent of these behavioral changes can illuminate opportunities to better support the users of new HIV prevention strategies. This Commentary concludes by suggesting a new way to conceptualize risk compensation behavior in the HIV prevention context. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Rigorous Performance Evaluation of Smartphone GNSS/IMU Sensors for ITS Applications

    Directory of Open Access Journals (Sweden)

    Vassilis Gikas

    2016-08-01

    With the rapid growth in smartphone technologies and improvement in their navigation sensors, an increasing amount of location information is now available, opening the road to the provision of new Intelligent Transportation System (ITS) services. Current smartphone devices embody miniaturized Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU) and other sensors capable of providing user position, velocity and attitude. However, it is hard to characterize their actual positioning and navigation performance capabilities due to the disparate sensor and software technologies adopted among manufacturers and the high influence of environmental conditions, and therefore, a unified certification process is missing. This paper presents the analysis results obtained from the assessment of two modern smartphones regarding their positioning accuracy (i.e., precision and trueness) capabilities (i.e., potential and limitations) based on a practical but rigorous methodological approach. Our investigation relies on the results of several vehicle tracking (i.e., cruising and maneuvering) tests realized through comparing smartphone-obtained trajectories and kinematic parameters to those derived using a high-end GNSS/IMU system and advanced filtering techniques. Performance testing is undertaken for the HTC One S (Android) and iPhone 5s (iOS). Our findings indicate that the deviation of the smartphone locations from ground truth (trueness) deteriorates by a factor of two in obscured environments compared to those derived in open sky conditions. Moreover, it appears that iPhone 5s produces relatively smaller and less dispersed error values compared to those computed for HTC One S. Also, the navigation solution of the HTC One S appears to adapt faster to changes in environmental conditions, suggesting a somewhat different data filtering approach for the iPhone 5s. Testing the accuracy of the accelerometer and gyroscope sensors for a number of
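
    A minimal sketch of the trueness computation described above (per-epoch horizontal deviations of time-matched smartphone fixes from the reference trajectory, summarized as an RMSE) could look like the following; the coordinates are invented placeholders in a local east/north frame, not test data.

        # Horizontal deviation of smartphone fixes from a time-matched reference trajectory.
        # Coordinates are invented placeholders in a local metric (east, north) frame.
        import numpy as np

        smartphone = np.array([[0.8, 1.2], [2.1, 2.9], [3.9, 5.2], [6.2, 7.8]])   # [m]
        reference  = np.array([[1.0, 1.0], [2.0, 3.0], [4.0, 5.0], [6.0, 8.0]])   # [m]

        deviation = np.linalg.norm(smartphone - reference, axis=1)   # per-epoch horizontal error
        rmse = np.sqrt(np.mean(deviation ** 2))                      # trueness summary statistic
        print(deviation, rmse)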

  9. Annual scientific report 1974

    International Nuclear Information System (INIS)

    Billiau, R.; Bobin, K.; Michiels, G.; Proost, J.

    1975-01-01

    The main activities of SCK/CEN during 1974 are reported in individual summaries. Fields of research are the following: sodium cooled fast reactors, gas cooled reactors, light water reactors, applied nuclear research (including waste disposal, safeguards and fusion research), basic and exploratory research (including materials science, nuclear physics and radiobiology). The BR2 Materials testing reactor and associated facilities are described. The technical and administrative support activities are also presented. A list of publications issued by the SCK/CEN Scientific staff is given

  10. Annual scientific report 1975

    International Nuclear Information System (INIS)

    Billiau, R.; Bobin, K.; Michiels, G.; Proost, J.

    1976-01-01

    The main activities of SCK/CEN during 1975 are reported in individual summaries. Fields of research are the following: sodium cooled fast reactors, gas cooled reactors, light water reactors, applied nuclear research (including waste disposal, safeguards and fusion research), basic and exploratory research (including materials science, nuclear physics and radiobiology). The BR2 Materials testing reactor and associated facilities are described. The technical and administrative support activities are also presented. A list of publications issued by the SCK/CEN Scientific staff is given

  11. A Qualitative Study of Anticipated Decision Making around Type 2 Diabetes Genetic Testing: the Role of Scientifically Concordant and Discordant Expectations.

    Science.gov (United States)

    Carmichael, Alicia G; Hulswit, Bailey B; Moe, Emily J; Jayaratne, Toby Epstein; Yashar, Beverly M

    2017-06-01

    Type 2 diabetes mellitus (T2DM) genetic testing is undergoing clinical trials to measure the efficacy of genetic counseling for behavior-based risk reduction. The expectations patients bring to the testing process may play an important role in individual outcomes. We conducted a qualitative exploration of anticipated decision-making and expectations around T2DM genetic testing. Semi-structured interviews were completed with Mexican Americans (n = 34), non-Hispanic Black Americans (n = 39), and non-Hispanic White Americans (n = 39) at risk for T2DM. Transcripts were analyzed for themes. Most participants would accept T2DM genetic testing in order to motivate risk-reducing behaviors or apprise family members of their risk. Participants who would decline testing wished to avoid emotional distress or believed the test would not reveal new risk information. Non-Hispanic Whites and those with college education declined genetic testing more often than other groups. Those without college education were more likely to have testing expectations that were discordant with current science, such as conflating genetic testing with common 'blood tests.' Understanding expectations and decision-making factors around T2DM genetic testing will better prepare healthcare professionals to counsel their patients. This may lead to a higher efficacy of T2DM genetic testing and counseling.

  12. Rigor or Reliability and Validity in Qualitative Research: Perspectives, Strategies, Reconceptualization, and Recommendations.

    Science.gov (United States)

    Cypress, Brigitte S

    Issues are still raised, even now in the 21st century, by the persistent concern with achieving rigor in qualitative research. There is also a continuing debate about the analogous terms reliability and validity in naturalistic inquiries as opposed to quantitative investigations. This article presents the concept of rigor in qualitative research using a phenomenological study as an exemplar to further illustrate the process. Elaborating on epistemological and theoretical conceptualizations by Lincoln and Guba, strategies congruent with the qualitative perspective for ensuring validity to establish the credibility of the study are described. A synthesis of the historical development of validity criteria evident in the literature over the years is explored. Recommendations are made for using the term rigor instead of trustworthiness and for the reconceptualization and renewed use of the concepts of reliability and validity in qualitative research; for building strategies for ensuring rigor into the qualitative research process rather than evaluating rigor only after the inquiry; and for qualitative researchers and students alike to be proactive and take responsibility for ensuring the rigor of a research study. The insights garnered here will move novice researchers and doctoral students to a better conceptual grasp of the complexity of reliability and validity and their ramifications for qualitative inquiry.

  13. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    Science.gov (United States)

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Scientific instruments, scientific progress and the cyclotron

    International Nuclear Information System (INIS)

    Baird, David; Faust, Thomas

    1990-01-01

    Philosophers speak of science in terms of theory and experiment, yet when they speak of the progress of scientific knowledge they speak in terms of theory alone. In this article it is claimed that scientific knowledge consists of, among other things, scientific instruments and instrumental techniques and not simply of some kind of justified beliefs. It is argued that one aspect of scientific progress can be characterized relatively straightforwardly - the accumulation of new scientific instruments. The development of the cyclotron is taken to illustrate this point. Eight different activities which promoted the successful completion of the cyclotron are recognised. The importance is in the machine rather than the experiments which could be run on it and the focus is on how the cyclotron came into being, not how it was subsequently used. The completed instrument is seen as a useful unit of scientific progress in its own right. (UK)

  15. The effect of temperature on the mechanical aspects of rigor mortis in a liquid paraffin model.

    Science.gov (United States)

    Ozawa, Masayoshi; Iwadate, Kimiharu; Matsumoto, Sari; Asakura, Kumiko; Ochiai, Eriko; Maebashi, Kyoko

    2013-11-01

    Rigor mortis is an important phenomenon to estimate the postmortem interval in forensic medicine. Rigor mortis is affected by temperature. We measured stiffness of rat muscles using a liquid paraffin model to monitor the mechanical aspects of rigor mortis at five temperatures (37, 25, 10, 5 and 0°C). At 37, 25 and 10°C, the progression of stiffness was slower in cooler conditions. At 5 and 0°C, the muscle stiffness increased immediately after the muscles were soaked in cooled liquid paraffin and then muscles gradually became rigid without going through a relaxed state. This phenomenon suggests that it is important to be careful when estimating the postmortem interval in cold seasons. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. Singularity hypotheses a scientific and philosophical assessment

    CERN Document Server

    Moor, James; Søraker, Johnny; Steinhart, Eric

    2012-01-01

    Singularity Hypotheses: A Scientific and Philosophical Assessment offers authoritative, jargon-free essays and critical commentaries on accelerating technological progress and the notion of technological singularity. It focuses on conjectures about the intelligence explosion, transhumanism, and whole brain emulation. Recent years have seen a plethora of forecasts about the profound, disruptive impact that is likely to result from further progress in these areas. Many commentators however doubt the scientific rigor of these forecasts, rejecting them as speculative and unfounded. We therefore invited prominent computer scientists, physicists, philosophers, biologists, economists and other thinkers to assess the singularity hypotheses. Their contributions go beyond speculation, providing deep insights into the main issues and a balanced picture of the debate.

  17. Quality properties of pre- and post-rigor beef muscle after interventions with high frequency ultrasound.

    Science.gov (United States)

    Sikes, Anita L; Mawson, Raymond; Stark, Janet; Warner, Robyn

    2014-11-01

    The delivery of a consistent quality product to the consumer is vitally important for the food industry. The aim of this study was to investigate the potential for using high frequency ultrasound applied to pre- and post-rigor beef muscle on the metabolism and subsequent quality. High frequency ultrasound (600 kHz at 48 kPa and 65 kPa acoustic pressure) applied to post-rigor beef striploin steaks resulted in no significant effect on the texture (peak force value) of cooked steaks as measured by a Tenderometer. There was no added benefit of ultrasound treatment above that of the normal ageing process after ageing of the steaks for 7 days at 4°C. Ultrasound treatment of post-rigor beef steaks resulted in a darkening of fresh steaks but after ageing for 7 days at 4°C, the ultrasound-treated steaks were similar in colour to that of the aged, untreated steaks. High frequency ultrasound (2 MHz at 48 kPa acoustic pressure) applied to pre-rigor beef neck muscle had no effect on the pH, but the calculated exhaustion factor suggested that there was some effect on metabolism and actin-myosin interaction. However, the resultant texture of cooked, ultrasound-treated muscle was lower in tenderness compared to the control sample. After ageing for 3 weeks at 0°C, the ultrasound-treated samples had the same peak force value as the control. High frequency ultrasound had no significant effect on the colour parameters of pre-rigor beef neck muscle. This proof-of-concept study showed no effect of ultrasound on quality but did indicate that the application of high frequency ultrasound to pre-rigor beef muscle shows potential for modifying ATP turnover and further investigation is warranted. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  18. Disciplining Bioethics: Towards a Standard of Methodological Rigor in Bioethics Research

    Science.gov (United States)

    Adler, Daniel; Shaul, Randi Zlotnik

    2012-01-01

    Contemporary bioethics research is often described as multi- or interdisciplinary. Disciplines are characterized, in part, by their methods. Thus, when bioethics research draws on a variety of methods, it crosses disciplinary boundaries. Yet each discipline has its own standard of rigor—so when multiple disciplinary perspectives are considered, what constitutes rigor? This question has received inadequate attention, as there is considerable disagreement regarding the disciplinary status of bioethics. This disagreement has presented five challenges to bioethics research. Addressing them requires consideration of the main types of cross-disciplinary research, and consideration of proposals aiming to ensure rigor in bioethics research. PMID:22686634

  19. Mathematical framework for fast and rigorous track fit for the ZEUS detector

    Energy Technology Data Exchange (ETDEWEB)

    Spiridonov, Alexander

    2008-12-15

    In this note we present a mathematical framework for a rigorous approach to a common track fit for trackers located in the inner region of the ZEUS detector. The approach makes use of the Kalman filter and offers a rigorous treatment of magnetic field inhomogeneity, multiple scattering and energy loss. We describe mathematical details of the implementation of the Kalman filter technique with a reduced amount of computations for a cylindrical drift chamber, barrel and forward silicon strip detectors and a forward straw drift chamber. Options with homogeneous and inhomogeneous field are discussed. The fitting of tracks in one ZEUS event takes about 20 ms on a standard PC. (orig.)
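    For readers unfamiliar with the technique, the sketch below shows one predict/update cycle of a generic linear Kalman filter in Python. It is emphatically not the ZEUS implementation (which works with track parameters, field inhomogeneity, multiple scattering and energy loss); it only illustrates the recursive structure such a fit builds on, and all matrices (F, H, Q, R) are placeholder assumptions.

      import numpy as np

      def kalman_step(x, P, z, F, H, Q, R):
          """One predict/update cycle of a generic linear Kalman filter (illustrative only).
          x, P : current state estimate and covariance
          z    : new measurement
          F, H : state-transition and measurement matrices (placeholders)
          Q, R : process and measurement noise covariances (placeholders)
          """
          # Predict the state forward
          x_pred = F @ x
          P_pred = F @ P @ F.T + Q
          # Update with the new measurement
          S = H @ P_pred @ H.T + R              # innovation covariance
          K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
          x_new = x_pred + K @ (z - H @ x_pred)
          P_new = (np.eye(len(x)) - K @ H) @ P_pred
          return x_new, P_new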

  20. Electrocardiogram artifact caused by rigors mimicking narrow complex tachycardia: a case report.

    Science.gov (United States)

    Matthias, Anne Thushara; Indrakumar, Jegarajah

    2014-02-04

    The electrocardiogram (ECG) is useful in the diagnosis of cardiac and non-cardiac conditions. Rigors due to shivering can cause electrocardiogram artifacts mimicking various cardiac rhythm abnormalities. We describe an 80-year-old Sri Lankan man with an abnormal electrocardiogram mimicking narrow complex tachycardia during the immediate post-operative period. Electrocardiogram changes caused by muscle tremor during rigors could mimic a narrow complex tachycardia. Identification of muscle tremor as a cause of electrocardiogram artifact can avoid unnecessary pharmacological and non-pharmacological intervention to prevent arrhythmias.

  1. Self-testing for contact sensitization to hair dyes--scientific considerations and clinical concerns of an industry-led screening programme

    DEFF Research Database (Denmark)

    Thyssen, Jacob P; Søsted, Heidi; Uter, Wolfgang

    2012-01-01

    The cosmetic industry producing hair dyes has, for many years, recommended that their consumers perform 'a hair dye allergy self-test' or similar prior to hair dyeing, to identify individuals who are likely to react upon subsequent hair dyeing. This review offers important information...... on the requirements for correct validation of screening tests, and concludes that, in its present form, the hair dye self-test has severe limitations: (i) it is not a screening test but a diagnostic test; (ii) it has not been validated according to basic criteria defined by scientists; (iii) it has been evaluated...... in the wrong population group; (iv) skin reactions have been read by dermatologists and not by the targeted group (consumers and hairdressers); (v) hair dyes contain strong and extreme sensitizers that are left on the skin in high concentrations, potentially resulting in active sensitization; and (vi...

  2. Role of human neurobehavioural tests in regulatory activity on chemicals

    Science.gov (United States)

    Stephens, R.; Barker, P.

    1998-01-01

    Psychological performance tests have been used since the mid-1960s in occupational and environmental health toxicology. The interpretation of significantly different test scores in neurobehavioural studies is not straightforward in the regulation of chemicals. This paper sets out some issues which emerged from discussions at an international workshop, organised by the United Kingdom Health and Safety Executive (HSE), to discuss differences in interpretation of human neurobehavioural test data in regulatory risk assessments. The difficulties encountered by regulators confronted with neurobehavioural studies seem to be twofold; some studies lack scientific rigor; other studies, although scientifically sound, are problematic because it is not clear what interpretation to place on the results. Issues relating to each of these points are discussed. Next, scenarios within which to consider the outcomes of neurobehavioural studies are presented. Finally, conclusions and recommendations for further work are put forward.   PMID:9624273

  3. Experimental evaluation of rigor mortis. VIII. Estimation of time since death by repeated measurements of the intensity of rigor mortis on rats.

    Science.gov (United States)

    Krompecher, T

    1994-10-21

    The development of the intensity of rigor mortis was monitored in nine groups of rats. The measurements were initiated after 2, 4, 5, 6, 8, 12, 15, 24, and 48 h post mortem (p.m.) and lasted 5-9 h, which ideally should correspond to the usual procedure after the discovery of a corpse. The experiments were carried out at an ambient temperature of 24 degrees C. Measurements initiated early after death resulted in curves with a rising portion, a plateau, and a descending slope. Delaying the initial measurement translated into shorter rising portions, and curves initiated 8 h p.m. or later are comprised of a plateau and/or a downward slope only. Three different phases were observed suggesting simple rules that can help estimate the time since death: (1) if an increase in intensity was found, the initial measurements were conducted not later than 5 h p.m.; (2) if only a decrease in intensity was observed, the initial measurements were conducted not earlier than 7 h p.m.; and (3) at 24 h p.m., the resolution is complete, and no further changes in intensity should occur. Our results clearly demonstrate that repeated measurements of the intensity of rigor mortis allow a more accurate estimation of the time since death of the experimental animals than the single measurement method used earlier. A critical review of the literature on the estimation of time since death on the basis of objective measurements of the intensity of rigor mortis is also presented.
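    The three phases above amount to a simple decision rule that can be written down directly. The sketch below is only an illustration of that logic under the reported experimental conditions (rats, ambient temperature 24 degrees C); the function name and the returned wording are ours, and it is not a validated forensic tool.

      def bracket_time_since_death(intensity_increased, intensity_decreased_only, resolution_complete):
          """Rough bracketing of time since death (hours post mortem, hpm) from repeated
          rigor-intensity measurements, following the three phases quoted in the abstract."""
          if intensity_increased:
              return "initial measurement made no later than about 5 hpm"
          if intensity_decreased_only:
              return "initial measurement made no earlier than about 7 hpm"
          if resolution_complete:
              return "resolution complete: about 24 hpm or later"
          return "pattern not covered by the reported rules"

      # Example: only a decrease in intensity was observed across repeated measurements
      print(bracket_time_since_death(False, True, False))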

  4. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    Science.gov (United States)

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone meat was significantly more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  5. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings

    Science.gov (United States)

    Kline, Joshua C.

    2014-01-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152

  6. Rigorous lower bound on the dynamic critical exponent of some multilevel Swendsen-Wang algorithms

    International Nuclear Information System (INIS)

    Li, X.; Sokal, A.D.

    1991-01-01

    We prove the rigorous lower bound z_exp ≥ α/ν for the dynamic critical exponent of a broad class of multilevel (or 'multigrid') variants of the Swendsen-Wang algorithm. This proves that such algorithms do suffer from critical slowing down. We conjecture that such algorithms in fact lie in the same dynamic universality class as the standard Swendsen-Wang algorithm

  7. Rigorous modelling of light's intensity angular-profile in Abbe refractometers with absorbing homogeneous fluids

    International Nuclear Information System (INIS)

    García-Valenzuela, A; Contreras-Tello, H; Márquez-Islas, R; Sánchez-Pérez, C

    2013-01-01

    We derive an optical model for the light intensity distribution around the critical angle in a standard Abbe refractometer when used on absorbing homogenous fluids. The model is developed using rigorous electromagnetic optics. The obtained formula is very simple and can be used suitably in the analysis and design of optical sensors relying on Abbe type refractometry.

  8. Rigorous approximation of stationary measures and convergence to equilibrium for iterated function systems

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Monge, Maurizio; Nisoli, Isaia

    2016-01-01

    We study the problem of the rigorous computation of the stationary measure and of the rate of convergence to equilibrium of an iterated function system described by a stochastic mixture of two or more dynamical systems that are either all uniformly expanding on the interval, or all contracting. In the expanding case, the associated transfer operators satisfy a Lasota–Yorke inequality; we show how to compute a rigorous approximation of the stationary measure in the L¹ norm and an estimate for the rate of convergence. The rigorous computation requires a computer-aided proof of the contraction of the transfer operators for the maps, and we show that this property propagates to the transfer operators of the IFS. In the contracting case we perform a rigorous approximation of the stationary measure in the Wasserstein–Kantorovich distance and rate of convergence, using the same functional analytic approach. We show that a finite computation can produce a realistic computation of all contraction rates for the whole parameter space. We conclude with a description of the implementation and numerical experiments. (paper)
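    For orientation, the object being approximated can be visualized with a quick, non-rigorous Monte Carlo sketch: iterate a random mixture of two contracting interval maps and histogram the orbit. The maps and weights below are arbitrary illustrative assumptions; the point of the paper is precisely to replace this kind of empirical estimate with a certified approximation carrying explicit error bounds.

      import numpy as np

      # Two contractions on [0, 1] whose equal mixture has the middle-thirds Cantor set as attractor
      maps = [lambda x: x / 3.0, lambda x: x / 3.0 + 2.0 / 3.0]
      weights = [0.5, 0.5]

      rng = np.random.default_rng(42)
      n_steps = 200_000
      choices = rng.choice(len(maps), size=n_steps, p=weights)

      x = 0.3                      # arbitrary starting point
      orbit = np.empty(n_steps)
      for i, k in enumerate(choices):
          x = maps[k](x)
          orbit[i] = x

      # Empirical (non-rigorous) approximation of the stationary measure
      hist, _ = np.histogram(orbit[1000:], bins=81, range=(0.0, 1.0), density=True)
      print(hist[:9])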

  9. Double phosphorylation of the myosin regulatory light chain during rigor mortis of bovine Longissimus muscle.

    Science.gov (United States)

    Muroya, Susumu; Ohnishi-Kameyama, Mayumi; Oe, Mika; Nakajima, Ikuyo; Shibata, Masahiro; Chikuni, Koichi

    2007-05-16

    To investigate changes in myosin light chains (MyLCs) during postmortem aging of the bovine longissimus muscle, we performed two-dimensional gel electrophoresis followed by identification with matrix-assisted laser desorption ionization time-of-flight mass spectrometry. The results of fluorescent differential gel electrophoresis showed that two spots of the myosin regulatory light chain (MyLC2) at pI values of 4.6 and 4.7 shifted toward those at pI values of 4.5 and 4.6, respectively, by 24 h postmortem when rigor mortis was completed. Meanwhile, the MyLC1 and MyLC3 spots did not change during the 14 days postmortem. Phosphoprotein-specific staining of the gels demonstrated that the MyLC2 proteins at pI values of 4.5 and 4.6 were phosphorylated. Furthermore, possible N-terminal region peptides containing one and two phosphoserine residues were detected in each mass spectrum of the MyLC2 spots at pI values of 4.5 and 4.6, respectively. These results demonstrated that MyLC2 became doubly phosphorylated during rigor formation of the bovine longissimus, suggesting involvement of the MyLC2 phosphorylation in the progress of beef rigor mortis. Bovine; myosin regulatory light chain (RLC, MyLC2); phosphorylation; rigor mortis; skeletal muscle.

  10. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    Energy Technology Data Exchange (ETDEWEB)

    Botelho, Luiz C.L. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Dept. de Matematica Aplicada]. E-mail: botelho.luiz@superig.com.br

    2008-07-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R^∞, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  11. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    International Nuclear Information System (INIS)

    Botelho, Luiz C.L.

    2008-01-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R^∞, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  12. Community historians and the dilemma of rigor vs relevance : A comment on Danziger and van Rappard

    NARCIS (Netherlands)

    Dehue, Trudy

    1998-01-01

    Since the transition from finalism to contextualism, the history of science seems to be caught up in a basic dilemma. Many historians fear that with the new contextualist standards of rigorous historiography, historical research can no longer be relevant to working scientists themselves. The present

  13. A plea for rigorous conceptual analysis as central method in transnational law design

    NARCIS (Netherlands)

    Rijgersberg, R.; van der Kaaij, H.

    2013-01-01

    Although shared problems are generally easily identified in transnational law design, it is considerably more difficult to design frameworks that transcend the peculiarities of local law in a univocal fashion. The following exposition is a plea for giving more prominence to rigorous conceptual analysis as a central method in transnational law design.

  14. College Readiness in California: A Look at Rigorous High School Course-Taking

    Science.gov (United States)

    Gao, Niu

    2016-01-01

    Recognizing the educational and economic benefits of a college degree, education policymakers at the federal, state, and local levels have made college preparation a priority. There are many ways to measure college readiness, but one key component is rigorous high school coursework. California has not yet adopted a statewide college readiness…

  15. Scientific integrity in Brazil.

    Science.gov (United States)

    Lins, Liliane; Carvalho, Fernando Martins

    2014-09-01

    This article focuses on scientific integrity and the identification of predisposing factors to scientific misconduct in Brazil. Brazilian scientific production has increased in the last ten years, but the quality of the articles has decreased. Pressure on researchers and students for increasing scientific production may contribute to scientific misconduct. Cases of misconduct in science have been recently denounced in the country. Brazil has important institutions for controlling ethical and safety aspects of human research, but there is a lack of specific offices to investigate suspected cases of misconduct and policies to deal with scientific dishonesty.

  16. Evaluation of physical dimension changes as nondestructive measurements for monitoring rigor mortis development in broiler muscles.

    Science.gov (United States)

    Cavitt, L C; Sams, A R

    2003-07-01

    Studies were conducted to develop a non-destructive method for monitoring the rate of rigor mortis development in poultry and to evaluate the effectiveness of electrical stimulation (ES). In the first study, 36 male broilers in each of two trials were processed at 7 wk of age. After being bled, half of the birds received electrical stimulation (400 to 450 V, 400 to 450 mA, for seven pulses of 2 s on and 1 s off), and the other half were designated as controls. At 0.25 and 1.5 h postmortem (PM), carcasses were evaluated for the angles of the shoulder, elbow, and wing tip and the distance between the elbows. Breast fillets were harvested at 1.5 h PM (after chilling) from all carcasses. Fillet samples were excised and frozen for later measurement of pH and R-value, and the remainder of each fillet was held on ice until 24 h postmortem. In ES carcasses, shear value and pH means were significantly lower and R-value means were higher, consistent with acceleration of rigor mortis by ES. The physical dimensions of the shoulder and elbow changed significantly with rigor mortis development and with ES. These results indicate that physical measurements of the wings may be useful as a nondestructive indicator of rigor development and for monitoring the effectiveness of ES. In the second study, 60 male broilers in each of two trials were processed at 7 wk of age. At 0.25, 1.5, 3.0, and 6.0 h PM, carcasses were evaluated for the distance between the elbows. At each time point, breast fillets were harvested from each carcass. Fillet samples were excised and frozen for later measurement of pH and sarcomere length, whereas the remainder of each fillet was held on ice until 24 h PM. Shear value and pH means decreased significantly with rigor mortis development. Elbow distance decreased during rigor development and was correlated with these indicators of rigor mortis development in broiler carcasses.

  17. Access to the scientific literature

    Science.gov (United States)

    Albarède, Francis

    The Public Library of Science Open Letter (http://www.publiclibraryofscience.org) is a very generous initiative, but, as most similar initiatives since the advent of electronic publishing, it misses the critical aspects of electronic publishing.Ten years ago, a Publisher would be in charge of running a system called a “scientific journal.” In such a system, the presence of an Editor and peer Reviewers secures the strength of the science and the rigor of writing; the Publisher guarantees the professional quality of printing, efficient dissemination, and long-term archiving. Publishing used to be in everyone's best interest, or nearly everyone. The Publisher, because he/she is financially motivated, ensures widespread dissemination of the journal amongst libraries and individual subscribers. The interest of the Author is that the system guarantees a broad potential readership. The interest of the Reader is that a line is drawn between professionally edited literature, presumably of better quality, and gray literature or home publishing, so that he/she does not waste time going through ‘low yield’ ungraded information. The Publisher could either be a private company, an academic institution, or a scholarly society. My experience is that, when page charges and subscription rates are compounded, journals published by scholarly societies are not necessarily cheaper. The difference between these cases is not the cost of running an office with rents, wages, printing, postage, advertisement, and archiving, but that a private Publisher pays shareholders. Shareholders have the bad habit of minding their own business and, therefore, they may interfere negatively with scientific publishing. Nevertheless, while the stranglehold imposed by private Publishers on our libraries over the last 10 years by increasing subscription rates may in part be due to shareholders' greed, this is true only in part. The increases are also a consequence of the booming number of pages being

  18. The Red-Attractiveness Effect, Applying the Ioannidis and Trikalinos (2007b) Test, and the Broader Scientific Context: A Reply to Francis (2013)

    Science.gov (United States)

    Elliot, Andrew J.; Maier, Markus A.

    2013-01-01

    Francis (2013) tested for and found evidence of publication bias in 1 of the 3 focal relations examined in Elliot et al. (2010), that between red and attractiveness. He then called into question the research as a whole and the field of experimental psychology more generally. Our reply has 3 foci. First, we attend to the bottom line regarding the…

  19. The Scientific Enterprise

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 13; Issue 9. The Scientific Enterprise - Assumptions, Problems, and Goals in the Modern Scientific Framework. V V Raman. Reflections Volume 13 Issue 9 September 2008 pp 885-894 ...

  20. Extensional scientific realism vs. intensional scientific realism.

    Science.gov (United States)

    Park, Seungbae

    2016-10-01

    Extensional scientific realism is the view that each believable scientific theory is supported by the unique first-order evidence for it and that if we want to believe that it is true, we should rely on its unique first-order evidence. In contrast, intensional scientific realism is the view that all believable scientific theories have a common feature and that we should rely on it to determine whether a theory is believable or not. Fitzpatrick argues that extensional realism is immune, while intensional realism is not, to the pessimistic induction. I reply that if extensional realism overcomes the pessimistic induction at all, that is because it implicitly relies on the theoretical resource of intensional realism. I also argue that extensional realism, by nature, cannot embed a criterion for distinguishing between believable and unbelievable theories. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. [On the evolution of scientific thought].

    Science.gov (United States)

    de Micheli, Alfredo; Iturralde Torres, Pedro

    2015-01-01

    The Nominalists of the XIV century, precursors of modern science, thought that science's object was not the general, vague and indeterminate but the particular, which is real and can be known directly. About the middle of the XVII Century the bases of the modern science became established thanks to a revolution fomented essentially by Galileo, Bacon and Descartes. During the XVIII Century, parallel to the development of the great current of English Empiricism, a movement of scientific renewal also arose in continental Europe following the discipline of the Dutch Physicians and of Boerhaave. In the XIX Century, Claude Bernard dominated the scientific medicine but his rigorous determinism impeded him from taking into account the immense and unforeseeable field of the random. Nowadays, we approach natural science and medicine, from particular groups of facts; that is, from the responses of Nature to specific questions, but not from the general laws. Furthermore, in recent epistemology, the concept that experimental data are not pure facts, but rather, facts interpreted within a hermeneutical context has been established. Finally a general tendency to retrieve philosophical questions concerning the understanding of essence and existence can frequently be seen in scientific inquiry. In the light of the evolution of medical thought, it is possible to establish the position of scientific medicine within the movement of ideas dominating in our time. Copyright © 2014 Instituto Nacional de Cardiología Ignacio Chávez. Published by Masson Doyma México S.A. All rights reserved.

  2. [A new formula for the measurement of rigor mortis: the determination of the FRR-index (author's transl)].

    Science.gov (United States)

    Forster, B; Ropohl, D; Raule, P

    1977-07-05

    The manual examination of rigor mortis as currently used, with its often subjective evaluation, frequently produces highly incorrect deductions. It is therefore desirable that such inaccuracies be replaced by objective measurement of rigor mortis at the extremities. To that purpose, a method is described which can also be applied in on-the-spot investigations, and a new formula for the determination of rigor mortis indices (FRR) is introduced.

  3. Proposed structure of a data paper as a scientific publication

    Directory of Open Access Journals (Sweden)

    Sandra M. Roa-Martínez

    2017-03-01

    Full Text Available This paper presents a review of the main motivations and paths for publishing datasets that are generated and managed during the research process. The Data Paper is considered as a form of scientific publication with the same recognition, acceptance and scientific rigor as conventional research articles. Therefore we propose a common structure defined by elements based mainly on dataset metadata. This will enable creators, publishers, consumers and expert peer reviewers to recognise, share, evaluate and facilitate data reuse. Doing so will facilitate information reproducibility, validation of results, and rapid new research generation.

  4. WWW: The Scientific Method

    Science.gov (United States)

    Blystone, Robert V.; Blodgett, Kevin

    2006-01-01

    The scientific method is the principal methodology by which biological knowledge is gained and disseminated. As fundamental as the scientific method may be, its historical development is poorly understood, its definition is variable, and its deployment is uneven. Scientific progress may occur without the strictures imposed by the formal…

  5. Measurements of the degree of development of rigor mortis as an indicator of stress in slaughtered pigs.

    Science.gov (United States)

    Warriss, P D; Brown, S N; Knowles, T G

    2003-12-13

    The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.

  6. Computational Simulations and the Scientific Method

    Science.gov (United States)

    Kleb, Bil; Wood, Bill

    2005-01-01

    As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.
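    As a concrete, if generic, illustration of what a published component test might look like, the snippet below checks a toy model implementation against an independently known analytic result. The model function, values and tolerances are our own placeholder assumptions, not fixtures from the paper.

      import math

      def decay_model(n0, half_life, t):
          """Toy model component: exponential decay N(t) = N0 * 2**(-t / half_life)."""
          return n0 * 2.0 ** (-t / half_life)

      def test_decay_model_against_analytic_solution():
          """Component test fixture: compare the implementation with closed-form answers."""
          n0, half_life = 1000.0, 5.0
          # After exactly one half-life, half of the initial amount should remain.
          assert math.isclose(decay_model(n0, half_life, 5.0), 500.0, rel_tol=1e-12)
          # At t = 0 the initial amount is returned unchanged.
          assert math.isclose(decay_model(n0, half_life, 0.0), n0, rel_tol=1e-12)

      if __name__ == "__main__":
          test_decay_model_against_analytic_solution()
          print("component test passed")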

  7. Measured data from the Avery Island Site C heater test

    International Nuclear Information System (INIS)

    Waldman, H.; Stickney, R.G.

    1984-11-01

    Over the past six years, a comprehensive field testing program was conducted in the Avery Island salt mine. Three single canister heater tests were included in the testing program. Specifically, electric heaters, which simulate canisters of heat-generating nuclear waste, were placed in the floor of the Avery Island salt mine, and measurements were made of the response of the salt to heating. These tests were in operation by June 1978. One of the three heater tests, Site C, operated for a period of 1858 days and was decommissioned during July and August 1983. This data report presents the temperature and displacement data gathered during the operation and decommissioning of the Site C heater test. The purpose of this data report is to transmit the data to the scientific community. Rigorous analysis and interpretation of the data are considered beyond the scope of a data report. 6 references, 21 figures, 1 table

  8. Causality as a Rigorous Notion and Quantitative Causality Analysis with Time Series

    Science.gov (United States)

    Liang, X. S.

    2017-12-01

    Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Here we show that this important and challenging question (one of the major challenges in the science of big data), which is of interest in a wide variety of disciplines, has a positive answer. In particular, for linear systems, the maximum likelihood estimator of the causality from a series X2 to another series X1, written T2→1, turns out to be concise in form: T2→1 = (C11 C12 C2,d1 - C12² C1,d1) / (C11² C22 - C11 C12²), where Cij (i, j = 1, 2) is the sample covariance between Xi and Xj, and Ci,dj the covariance between Xi and ΔXj/Δt, the difference approximation of dXj/dt using the Euler forward scheme. An immediate corollary is that causation implies correlation, but not vice versa, resolving the long-standing debate over causation versus correlation. The above formula has been validated with touchstone series purportedly generated with one-way causality that evades the classical approaches such as the Granger causality test and transfer entropy analysis. It has also been applied successfully to the investigation of many real problems. Through a simple analysis with the stock series of IBM and GE, an unusually strong one-way causality is identified from the former to the latter in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a "Giant" for the computer market. Another example presented here regards the cause-effect relation between the two climate modes, El Niño and Indian Ocean Dipole (IOD). In general, these modes are mutually causal, but the causality is asymmetric. To El Niño, the information flowing from IOD manifests itself as a propagation of uncertainty from the Indian Ocean. In the third example, an unambiguous one-way causality is found between CO2 and the global mean temperature anomaly. While it is confirmed that CO2 indeed drives the recent global warming…
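    Because the abstract gives the estimator in closed form, it can be sketched directly in code. The minimal Python implementation below is ours, not the author's; it computes T2→1 from two equally sampled series exactly as the quoted formula prescribes, and omits the normalization and significance testing that the full method also provides. The synthetic example is an assumption chosen to show the expected asymmetry under one-way coupling.

      import numpy as np

      def liang_information_flow(x1, x2, dt=1.0):
          """Closed-form estimate of the information flow T_{2->1} from series x2 to series x1,
          following the formula quoted in the abstract (linear-system assumption)."""
          x1 = np.asarray(x1, dtype=float)
          x2 = np.asarray(x2, dtype=float)
          dx1 = (x1[1:] - x1[:-1]) / dt              # Euler forward difference of X1
          x1t, x2t = x1[:-1], x2[:-1]                # align the series with the difference
          C = np.cov(np.vstack([x1t, x2t]))          # sample covariances C11, C12, C22
          C11, C12, C22 = C[0, 0], C[0, 1], C[1, 1]
          C1d1 = np.cov(x1t, dx1)[0, 1]              # C_{1,d1}
          C2d1 = np.cov(x2t, dx1)[0, 1]              # C_{2,d1}
          num = C11 * C12 * C2d1 - C12 ** 2 * C1d1
          den = C11 ** 2 * C22 - C11 * C12 ** 2
          return num / den

      # Synthetic one-way coupled pair: X2 drives X1, but not vice versa
      rng = np.random.default_rng(1)
      n = 5000
      x1, x2 = np.zeros(n), np.zeros(n)
      e1, e2 = rng.normal(size=n), rng.normal(size=n)
      for t in range(1, n):
          x2[t] = 0.9 * x2[t - 1] + e2[t]
          x1[t] = 0.6 * x1[t - 1] + 0.5 * x2[t - 1] + e1[t]
      print(liang_information_flow(x1, x2))   # X2 -> X1: clearly non-zero
      print(liang_information_flow(x2, x1))   # X1 -> X2: close to zero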

  9. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG.

    Science.gov (United States)

    Cowley, Benjamin U; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap.

  10. Scientific Verification Test of Orbitec Deployable Vegetable Production System for Salad Crop Growth on ISS- Gas Exchange System design and function

    Science.gov (United States)

    Eldemire, Ashleigh

    2007-01-01

    The ability to produce and maintain salad crops during long term missions would be a great benefit to NASA; the renewable food supply would save cargo space, weight and money. The ambient conditions of previous ground controlled crop plant experiments do not reflect the microgravity and high CO2 concentrations present during orbit. It has been established that microgravity does not considerably alter plant growth (Monje, Stutte, Chapman, 2005). To support plants in a space-craft environment, efficient and effective lighting and containment units are necessary. Three lighting systems were previously evaluated for radish growth in ambient air: fluorescent lamps in an Orbitec Biomass Production System Educational (BPSE), a combination of red, blue, and green LED's in a Deployable Vegetable Production System (Veggie), and a combination of red and blue LED's in a Veggie. When mass measurements compared the entire possible growing area vs. power consumed by the respective units, the Veggies clearly exceeded the BPSE, indicating that the LED units were a more resource efficient means of growing radishes under ambient conditions in comparison with fluorescent lighting. To evaluate the most productive light treatment system for a long term space mission, a more closely simulated ISS environment is necessary. To induce a CO2-dense atmosphere inside the Veggies and BPSE, a gas exchange system has been developed to maintain a range of 1000-1200 ppm CO2 during a 21-day light treatment experiment. This report details the design and function of the gas exchange system. The rehabilitation, troubleshooting, maintenance and testing of the gas exchange system have been my major assignments. I have also contributed to the planting, daily measurements and harvesting of the radish crops during the 21-day light treatment verification test.
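    To make the control task concrete, holding the chamber between 1000 and 1200 ppm can be done with simple hysteresis (bang-bang) logic, sketched below in Python. This is our own illustration, not the actual Orbitec/NASA gas exchange controller; read_co2_ppm() and set_co2_valve() are hypothetical stand-ins for the real gas analyzer and injection hardware.

      import random
      import time

      CO2_LOW_PPM = 1000    # lower bound of the target range stated in the report
      CO2_HIGH_PPM = 1200   # upper bound of the target range stated in the report

      def read_co2_ppm():
          """Hypothetical sensor read; replace with the real gas-analyzer interface."""
          return 1100.0 + random.uniform(-150.0, 150.0)

      def set_co2_valve(open_valve):
          """Hypothetical actuator; replace with the real CO2 injection valve driver."""
          print("CO2 valve", "OPEN" if open_valve else "CLOSED")

      def control_loop(cycles=5, period_s=1.0):
          injecting = False
          for _ in range(cycles):
              ppm = read_co2_ppm()
              if ppm < CO2_LOW_PPM:         # below range: start injecting CO2
                  injecting = True
              elif ppm > CO2_HIGH_PPM:      # above range: stop injecting
                  injecting = False
              set_co2_valve(injecting)      # within range: keep the previous state (hysteresis)
              time.sleep(period_s)

      if __name__ == "__main__":
          control_loop()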

  11. Use of the Rigor Mortis Process as a Tool for Better Understanding of Skeletal Muscle Physiology: Effect of the Ante-Mortem Stress on the Progression of Rigor Mortis in Brook Charr (Salvelinus fontinalis).

    Science.gov (United States)

    Diouf, Boucar; Rioux, Pierre

    1999-01-01

    Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…

  12. A rigorous pole representation of multilevel cross sections and its practical applications

    International Nuclear Information System (INIS)

    Hwang, R.N.

    1987-01-01

    In this article a rigorous method for representing the multilevel cross sections and its practical applications are described. It is a generalization of the rationale suggested by de Saussure and Perez for the s-wave resonances. A computer code WHOPPER has been developed to convert the Reich-Moore parameters into the pole and residue parameters in momentum space. Sample calculations have been carried out to illustrate that the proposed method preserves the rigor of the Reich-Moore cross sections exactly. An analytical method has been developed to evaluate the pertinent Doppler-broadened line shape functions. A discussion is presented on how to minimize the number of pole parameters so that the existing reactor codes can be best utilized

  13. Rigorous simulations of a helical core fiber by the use of transformation optics formalism.

    Science.gov (United States)

    Napiorkowski, Maciej; Urbanczyk, Waclaw

    2014-09-22

    We report for the first time on rigorous numerical simulations of a helical-core fiber by using a full vectorial method based on the transformation optics formalism. We modeled the dependence of circular birefringence of the fundamental mode on the helix pitch and analyzed the effect of a birefringence increase caused by the mode displacement induced by a core twist. Furthermore, we analyzed the complex field evolution versus the helix pitch in the first order modes, including polarization and intensity distribution. Finally, we show that the use of the rigorous vectorial method allows to better predict the confinement loss of the guided modes compared to approximate methods based on equivalent in-plane bending models.

  14. Estimation of the time since death--reconsidering the re-establishment of rigor mortis.

    Science.gov (United States)

    Anders, Sven; Kunz, Michaela; Gehl, Axel; Sehner, Susanne; Raupach, Tobias; Beck-Bornholdt, Hans-Peter

    2013-01-01

    In forensic medicine, there is little well-defined evidence for the phenomenon of re-establishment of rigor mortis after mechanical loosening, an observation used in establishing time since death in forensic casework and thought to occur up to 8 h post-mortem. Nevertheless, the method is widely described in textbooks on forensic medicine. We examined 314 joints (elbow and knee) of 79 deceased persons at defined time points up to 21 h post-mortem (hpm). Data were analysed using a random intercept model. Here, we show that re-establishment occurred in 38.5% of joints at 7.5 to 19 hpm. Therefore, the maximum time span for the re-establishment of rigor mortis appears to be 2.5-fold longer than thought so far. These findings have a major impact on the estimation of time since death in forensic casework.

  15. Quality of nuchal translucency measurements correlates with broader aspects of program rigor and culture of excellence.

    Science.gov (United States)

    Evans, Mark I; Krantz, David A; Hallahan, Terrence; Sherwin, John; Britt, David W

    2013-01-01

    To determine if nuchal translucency (NT) quality correlates with the extent to which clinics vary in rigor and quality control. We correlated NT performance quality (bias and precision) of 246,000 patients with two alternative measures of clinic culture - % of cases for whom nasal bone (NB) measurements were performed and % of requisitions correctly filled for race-ethnicity and weight. When requisition errors occurred in 5% (33%), the curve lowered to 0.93 MoM (p 90%, MoM was 0.99 compared to those quality exists independent of individual variation in NT quality, and two divergent indices of program rigor are associated with NT quality. Quality control must be program wide, and to effect continued improvement in the quality of NT results across time, the cultures of clinics must become a target for intervention. Copyright © 2013 S. Karger AG, Basel.

  16. A Framework for Rigorously Identifying Research Gaps in Qualitative Literature Reviews

    DEFF Research Database (Denmark)

    Müller-Bloch, Christoph; Kranz, Johann

    2015-01-01

    Identifying research gaps is a fundamental goal of literature reviewing. While it is widely acknowledged that literature reviews should identify research gaps, there are no methodological guidelines for how to identify research gaps in qualitative literature reviews ensuring rigor and replicability....... Our study addresses this gap and proposes a framework that should help scholars in this endeavor without stifling creativity. To develop the framework we thoroughly analyze the state-of-the-art procedure of identifying research gaps in 40 recent literature reviews using a grounded theory approach....... Based on the data, we subsequently derive a framework for identifying research gaps in qualitative literature reviews and demonstrate its application with an example. Our results provide a modus operandi for identifying research gaps, thus enabling scholars to conduct literature reviews more rigorously...

  17. Derivation of basic equations for rigorous dynamic simulation of cryogenic distillation column for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Kinoshita, Masahiro; Naruse, Yuji

    1981-08-01

    The basic equations are derived for rigorous dynamic simulation of cryogenic distillation columns for hydrogen isotope separation. The model accounts for such factors as differences in latent heat of vaporization among the six isotopic species of molecular hydrogen, decay heat of tritium, heat transfer through the column wall and nonideality of the solutions. Provision is also made for simulation of columns with multiple feeds and multiple sidestreams. (author)

  18. Rigor mortis development in turkey breast muscle and the effect of electrical stunning.

    Science.gov (United States)

    Alvarado, C Z; Sams, A R

    2000-11-01

    Rigor mortis development in turkey breast muscle and the effect of electrical stunning on this process are not well characterized. Some electrical stunning procedures have been known to inhibit postmortem (PM) biochemical reactions, thereby delaying the onset of rigor mortis in broilers. Therefore, this study was designed to characterize rigor mortis development in stunned and unstunned turkeys. A total of 154 turkey toms in two trials were conventionally processed at 20 to 22 wk of age. Turkeys were either stunned with a pulsed direct current (500 Hz, 50% duty cycle) at 35 mA (40 V) in a saline bath for 12 seconds or left unstunned as controls. At 15 min and 1, 2, 4, 8, 12, and 24 h PM, pectoralis samples were collected to determine pH, R-value, L* value, sarcomere length, and shear value. In Trial 1, the samples obtained for pH, R-value, and sarcomere length were divided into surface and interior samples. There were no significant differences between the surface and interior samples among any parameters measured. Muscle pH significantly decreased over time in stunned and unstunned birds through 2 h PM. The R-values increased to 8 h PM in unstunned birds and 24 h PM in stunned birds. The L* values increased over time, with no significant differences after 1 h PM for the controls and 2 h PM for the stunned birds. Sarcomere length increased through 2 h PM in the controls and 12 h PM in the stunned fillets. Cooked meat shear values decreased through the 1 h PM deboning time in the control fillets and 2 h PM in the stunned fillets. These results suggest that stunning delayed the development of rigor mortis through 2 h PM, but had no significant effect on the measured parameters at later time points, and that deboning turkey breasts at 2 h PM or later will not significantly impair meat tenderness.

  19. Rigorous Integration of Non-Linear Ordinary Differential Equations in Chebyshev Basis

    Czech Academy of Sciences Publication Activity Database

    Dzetkulič, Tomáš

    2015-01-01

    Roč. 69, č. 1 (2015), s. 183-205 ISSN 1017-1398 R&D Projects: GA MŠk OC10048; GA ČR GD201/09/H057 Institutional research plan: CEZ:AV0Z10300504 Keywords : Initial value problem * Rigorous integration * Taylor model * Chebyshev basis Subject RIV: IN - Informatics, Computer Science Impact factor: 1.366, year: 2015

  20. A rigorous proof of the Landau-Peierls formula and much more

    DEFF Research Database (Denmark)

    Briet, Philippe; Cornean, Horia; Savoie, Baptiste

    2012-01-01

    We present a rigorous mathematical treatment of the zero-field orbital magnetic susceptibility of a non-interacting Bloch electron gas, at fixed temperature and density, for both metals and semiconductors/insulators. In particular, we obtain the Landau-Peierls formula in the low temperature and density limit as conjectured by Kjeldaas and Kohn (Phys Rev 105:806–813, 1957).

  1. Association Between Maximal Skin Dose and Breast Brachytherapy Outcome: A Proposal for More Rigorous Dosimetric Constraints

    International Nuclear Information System (INIS)

    Cuttino, Laurie W.; Heffernan, Jill; Vera, Robyn; Rosu, Mihaela; Ramakrishnan, V. Ramesh; Arthur, Douglas W.

    2011-01-01

    Purpose: Multiple investigations have used the skin distance as a surrogate for the skin dose and have shown that distances 4.05 Gy/fraction. Conclusion: The initial skin dose recommendations have been based on safe use and the avoidance of significant toxicity. The results from the present study have suggested that patients might further benefit if more rigorous constraints were applied and if the skin dose were limited to 120% of the prescription dose.

  2. Re-establishment of rigor mortis: evidence for a considerably longer post-mortem time span.

    Science.gov (United States)

    Crostack, Chiara; Sehner, Susanne; Raupach, Tobias; Anders, Sven

    2017-07-01

    Re-establishment of rigor mortis following mechanical loosening is used as part of the complex method for the forensic estimation of the time since death in human bodies and has formerly been reported to occur up to 8-12 h post-mortem (hpm). We recently described our observation of the phenomenon in up to 19 hpm in cases with in-hospital death. Due to the case selection (preceding illness, immobilisation), transfer of these results to forensic cases might be limited. We therefore examined 67 out-of-hospital cases of sudden death with known time points of death. Re-establishment of rigor mortis was positive in 52.2% of cases and was observed up to 20 hpm. In contrast to the current doctrine that a recurrence of rigor mortis is always of a lesser degree than its first manifestation in a given patient, muscular rigidity at re-establishment equalled or even exceeded the degree observed before dissolving in 21 joints. Furthermore, this is the first study to describe that the phenomenon appears to be independent of body or ambient temperature.

  3. The MIXED framework: A novel approach to evaluating mixed-methods rigor.

    Science.gov (United States)

    Eckhardt, Ann L; DeVon, Holli A

    2017-10-01

    Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.

  4. Application of the rigorous method to x-ray and neutron beam scattering on rough surfaces

    International Nuclear Information System (INIS)

    Goray, Leonid I.

    2010-01-01

    The paper presents a comprehensive numerical analysis of x-ray and neutron scattering from finite-conducting rough surfaces which is performed in the frame of the boundary integral equation method in a rigorous formulation for high ratios of characteristic dimension to wavelength. The single integral equation obtained involves boundary integrals of the single and double layer potentials. A more general treatment of the energy conservation law applicable to absorption gratings and rough mirrors is considered. In order to compute the scattering intensity of rough surfaces using the forward electromagnetic solver, Monte Carlo simulation is employed to average the deterministic diffraction grating efficiency due to individual surfaces over an ensemble of realizations. Some rules appropriate for numerical implementation of the theory at small wavelength-to-period ratios are presented. The difference between the rigorous approach and approximations can be clearly seen in specular reflectances of Au mirrors with different roughness parameters at wavelengths where grazing incidence occurs at close to or larger than the critical angle. This difference may give rise to wrong estimates of rms roughness and correlation length if they are obtained by comparing experimental data with calculations. Besides, the rigorous approach permits taking into account any known roughness statistics and allows exact computation of diffuse scattering.

  5. Test

    DEFF Research Database (Denmark)

    Bendixen, Carsten

    2014-01-01

    Contribution providing a brief, introductory, perspective-setting and concept-clarifying account of the concept of the test in the educational sphere.

  6. Biclustering via optimal re-ordering of data matrices in systems biology: rigorous methods and comparative studies

    Directory of Open Access Journals (Sweden)

    Feng Xiao-Jiang

    2008-10-01

    Background: The analysis of large-scale data sets via clustering techniques is utilized in a number of applications. Biclustering in particular has emerged as an important problem in the analysis of gene expression data since genes may only jointly respond over a subset of conditions. Biclustering algorithms also have important applications in sample classification where, for instance, tissue samples can be classified as cancerous or normal. Many of the methods for biclustering, and clustering algorithms in general, utilize simplified models or heuristic strategies for identifying the "best" grouping of elements according to some metric and cluster definition and thus result in suboptimal clusters. Results: In this article, we present a rigorous approach to biclustering, OREO, which is based on the Optimal RE-Ordering of the rows and columns of a data matrix so as to globally minimize the dissimilarity metric. The physical permutations of the rows and columns of the data matrix can be modeled as either a network flow problem or a traveling salesman problem. Cluster boundaries in one dimension are used to partition and re-order the other dimensions of the corresponding submatrices to generate biclusters. The performance of OREO is tested on (a) metabolite concentration data, (b) an image reconstruction matrix, (c) synthetic data with implanted biclusters, and gene expression data for (d) colon cancer, (e) breast cancer, as well as (f) yeast segregant data to validate the ability of the proposed method and compare it to existing biclustering and clustering methods. Conclusion: We demonstrate that this rigorous global optimization method for biclustering produces clusters with more insightful groupings of similar entities, such as genes or metabolites sharing common functions, than other clustering and biclustering algorithms and can reconstruct underlying fundamental patterns in the data for several distinct sets of data matrices arising ...
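
    OREO itself solves the re-ordering exactly as a network-flow or traveling salesman problem. Purely to illustrate the underlying idea (placing similar rows next to each other so that the total adjacent-row dissimilarity is small), the sketch below uses a greedy nearest-neighbour heuristic; the greedy ordering is an assumption made for brevity and is not the rigorous global optimization used in the paper.

import numpy as np

def greedy_row_order(data):
    """Greedy nearest-neighbour ordering of rows so that adjacent rows are similar.
    A heuristic stand-in for OREO's exact network-flow/TSP formulation."""
    n = data.shape[0]
    # pairwise Euclidean distances between rows
    d = np.linalg.norm(data[:, None, :] - data[None, :, :], axis=2)
    unvisited = set(range(n))
    order = [0]
    unvisited.remove(0)
    while unvisited:
        last = order[-1]
        nxt = min(unvisited, key=lambda j: d[last, j])
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# usage: re-order both dimensions by applying the function to the matrix and its transpose
X = np.random.rand(8, 5)
row_order = greedy_row_order(X)
col_order = greedy_row_order(X.T)
X_reordered = X[np.ix_(row_order, col_order)]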

  7. A Draft Conceptual Framework of Relevant Theories to Inform Future Rigorous Research on Student Service-Learning Outcomes

    Science.gov (United States)

    Whitley, Meredith A.

    2014-01-01

    While the quality and quantity of research on service-learning has increased considerably over the past 20 years, researchers as well as governmental and funding agencies have called for more rigor in service-learning research. One key variable in improving rigor is using relevant existing theories to improve the research. The purpose of this…

  8. Feedback for relatedness and competence : Can feedback in blended learning contribute to optimal rigor, basic needs, and motivation?

    NARCIS (Netherlands)

    Bombaerts, G.; Nickel, P.J.

    2017-01-01

    We inquire how peer and tutor feedback influences students' optimal rigor, basic needs and motivation. We analyze questionnaires from two courses in two subsequent years. We conclude that feedback in blended learning can contribute to rigor and basic needs, but it is not clear from our data what

  9. Optimal correction and design parameter search by modern methods of rigorous global optimization

    International Nuclear Information System (INIS)

    Makino, K.; Berz, M.

    2011-01-01

    Frequently the design of schemes for correction of aberrations or the determination of possible operating ranges for beamlines and cells in synchrotrons exhibit multitudes of possibilities for their correction, usually appearing in disconnected regions of parameter space which cannot be directly qualified by analytical means. In such cases, frequently an abundance of optimization runs are carried out, each of which determines a local minimum depending on the specific chosen initial conditions. Practical solutions are then obtained through an often extended interplay of experienced manual adjustment of certain suitable parameters and local searches by varying other parameters. However, in a formal sense this problem can be viewed as a global optimization problem, i.e. the determination of all solutions within a certain range of parameters that lead to a specific optimum. For example, it may be of interest to find all possible settings of multiple quadrupoles that can achieve imaging; or to find ahead of time all possible settings that achieve a particular tune; or to find all possible manners to adjust nonlinear parameters to achieve correction of high order aberrations. These tasks can easily be phrased in terms of such an optimization problem; but while mathematically this formulation is often straightforward, it has been common belief that it is of limited practical value since the resulting optimization problem cannot usually be solved. However, recent significant advances in modern methods of rigorous global optimization make these methods feasible for optics design for the first time. The key ideas of the method lie in an interplay of rigorous local underestimators of the objective functions, and by using the underestimators to rigorously iteratively eliminate regions that lie above already known upper bounds of the minima, in what is commonly known as a branch-and-bound approach. Recent enhancements of the Differential Algebraic methods used in particle
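
    The branch-and-bound idea described above (bound each candidate region from below, then discard any region whose lower bound already exceeds a known upper bound on the minimum) can be sketched in a few lines. The sketch below substitutes a simple Lipschitz-based lower bound in one dimension for the paper's rigorous Differential Algebraic underestimators; the Lipschitz constant is an assumed input that must genuinely bound |f'| on the search interval.

import heapq

def branch_and_bound(f, lo, hi, lipschitz, tol=1e-6):
    """Minimal 1-D branch-and-bound: lower-bound each box by f(mid) - L*w/2
    (valid whenever L bounds |f'| on [lo, hi]) and discard boxes whose lower
    bound already exceeds the best upper bound found so far."""
    best_x, best_f = lo, f(lo)                       # incumbent upper bound
    heap = [(f((lo + hi) / 2) - lipschitz * (hi - lo) / 2, lo, hi)]
    while heap:
        lb, a, b = heapq.heappop(heap)
        if lb > best_f - tol:                        # cannot improve by more than tol
            continue
        m = (a + b) / 2
        if f(m) < best_f:                            # update incumbent
            best_x, best_f = m, f(m)
        if b - a > tol:                              # split and push children
            for c, d in ((a, m), (m, b)):
                mid = (c + d) / 2
                heapq.heappush(heap, (f(mid) - lipschitz * (d - c) / 2, c, d))
    return best_x, best_f

# example: global minimum of a wiggly function on [-2, 2]; L = 40 bounds |f'| there
print(branch_and_bound(lambda x: (x * x - 1) ** 2 + 0.3 * x, -2.0, 2.0, lipschitz=40.0))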

  10. Material Control and Accounting (MC and A) System Upgrades and Performance Testing at the Russian Federal Nuclear Center-All-Russian Scientific Research Institute of Experimental Physics (RFNC-VNIIEF)

    International Nuclear Information System (INIS)

    Bushmelev, Vadim; Viktorov, Vladimir; Zhikharev, Stanislav; Yuferev, Vladimir; Singh, Surinder Paul; Kuzminski, Jozef; Hogan, Kevin; McKisson, Jacquelin

    2008-01-01

    The All-Russian Scientific Research Institute of Experimental Physics (VNIIEF), founded in 1946 at the historic village of Sarov, in Nizhniy Novgorod Oblast, is the largest nuclear research center in the Rosatom complex. In the framework of international collaboration, the United States (US) Department of Energy/National Nuclear Security Agency, in cooperation with US national laboratories, on the one hand, Rosatom and VNIIEF on the other hand, have focused their cooperative efforts to upgrade the existing material protection control and accountability system to prevent unauthorized access to the nuclear material. In this paper we will discuss the present status of material control and accounting (MC and A) system upgrades and the preliminary results from a pilot program on the MC and A system performance testing that was recently conducted at one technical area.

  11. Scientific Integrity and Consensus in the Intergovernmental Panel on Climate Change Assessment Process

    Science.gov (United States)

    Barrett, K.

    2017-12-01

    Scientific integrity is the hallmark of any assessment and is a paramount consideration in the Intergovernmental Panel on Climate Change (IPCC) assessment process. Procedures are in place for rigorous scientific review and to quantify confidence levels and uncertainty in the communication of key findings. However, the IPCC is unique in that its reports are formally accepted by governments through consensus agreement. This presentation will present the unique requirements of the IPCC intergovernmental assessment and discuss the advantages and challenges of its approach.

  12. Age and Scientific Performance.

    Science.gov (United States)

    Cole, Stephen

    1979-01-01

    The long-standing belief that age is negatively associated with scientific productivity and creativity is shown to be based upon incorrect analysis of data. Studies reported in this article suggest that the relationship between age and scientific performance is influenced by the operation of the reward system. (Author)

  13. Scientific Notation Watercolor

    Science.gov (United States)

    Linford, Kyle; Oltman, Kathleen; Daisey, Peggy

    2016-01-01

    (Purpose) The purpose of this paper is to describe visual literacy, an adapted version of Visual Thinking Strategy (VTS), and an art-integrated middle school mathematics lesson about scientific notation. The intent of this lesson was to provide students with a real life use of scientific notation and exponents, and to motivate them to apply their…

  14. Rediscovering the scientific ethos

    DEFF Research Database (Denmark)

    Djørup, Stine

    The doctoral dissertation discusses some of the moral standards of good scientific practice that are underexposed in the literature. In particular, attempts are made to correct the conceptual confusion surrounding the norm of 'disinterestedness' in science ('uhildethed'), and the norm of scientific...

  15. Testing strategies in mutagenicity and genetic toxicology: an appraisal of the guidelines of the European Scientific Committee for Cosmetics and Non-Food Products for the evaluation of hair dyes.

    Science.gov (United States)

    Kirkland, D J; Henderson, L; Marzin, D; Müller, L; Parry, J M; Speit, G; Tweats, D J; Williams, G M

    2005-12-30

    The European Scientific Committee on Cosmetics and Non-Food Products (SCCNFP) guideline for testing of hair dyes for genotoxic/mutagenic/carcinogenic potential has been reviewed. The battery of six in vitro tests recommended therein differs substantially from the batteries of two or three in vitro tests recommended in other guidelines. Our evaluation of the chemical types used in hair dyes and comparison with other guidelines for testing a wide range of chemical substances, lead to the conclusion that potential genotoxic activity may effectively be determined by the application of a limited number of well-validated test systems that are capable of detecting induced gene mutations and structural and numerical chromosomal changes. We conclude that highly effective screening for genotoxicity of hair dyes can be achieved by the use of three assays, namely the bacterial gene mutation assay, the mammalian cell gene mutation assay (mouse lymphoma tk assay preferred) and the in vitro micronucleus assay. These need to be combined with metabolic activation systems optimised for the individual chemical types. Recent published evidence [D. Kirkland, M. Aardema, L. Henderson, L. Müller, Evaluation of the ability of a battery of three in vitro genotoxicity tests to discriminate rodent carcinogens and non-carcinogens. I. Sensitivity, specificity and relative predictivity, Mutat. Res. 584 (2005) 1-256] suggests that our recommended three tests will detect all known genotoxic carcinogens, and that increasing the number of in vitro assays further would merely reduce specificity (increase false positives). Of course there may be occasions when standard tests need to be modified to take account of special situations such as a specific pathway of biotransformation, but this should be considered as part of routine testing. It is clear that individual dyes and any other novel ingredients should be tested in this three-test battery. However, new products are formed on the scalp by

  16. Unmet Need: Improving mHealth Evaluation Rigor to Build the Evidence Base.

    Science.gov (United States)

    Mookherji, Sangeeta; Mehl, Garrett; Kaonga, Nadi; Mechael, Patricia

    2015-01-01

    mHealth (the use of mobile technologies for health) is a growing element of health system activity globally, but evaluation of those activities remains quite scant, which is an important knowledge gap for advancing mHealth activities. In 2010, the World Health Organization and Columbia University implemented a small-scale survey to generate preliminary data on evaluation activities used by mHealth initiatives. The authors describe self-reported data from 69 projects in 29 countries. The majority (74%) reported some sort of evaluation activity, primarily nonexperimental in design (62%). The authors developed a 6-point scale of evaluation rigor comprising information on use of comparison groups, sample size calculation, data collection timing, and randomization. The mean score was low (2.4); half (47%) were conducting evaluations with a minimum threshold (4+) of rigor, indicating use of a comparison group, while less than 20% had randomized the mHealth intervention. The authors were unable to assess whether the rigor score was appropriate for the type of mHealth activity being evaluated. What was clear was that although most data came from mHealth pilot projects aiming for scale-up, few had designed evaluations that would support crucial decisions on whether to scale up and how. Whether the mHealth activity is a strategy to improve health or a tool for achieving intermediate outcomes that should lead to better health, mHealth evaluations must be improved to generate robust evidence for cost-effectiveness assessment and to allow for accurate identification of the contribution of mHealth initiatives to health systems strengthening and the impact on actual health outcomes.

  17. Effect of muscle restraint on sheep meat tenderness with rigor mortis at 18°C.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Wells, Robyn W

    2002-02-01

    The effect on shear force of skeletal restraint and removing muscles from lamb m. longissimus thoracis et lumborum (LT) immediately after slaughter and electrical stimulation was undertaken at a rigor temperature of 18°C (n=15). The temperature of 18°C was achieved through chilling of electrically stimulated sheep carcasses in air at 12°C, air flow 1-1.5 ms(-2). In other groups, the muscle was removed at 2.5 h post-mortem and either wrapped or left non-wrapped before being placed back on the carcass to follow carcass cooling regimes. Following rigor mortis, the meat was aged for 0, 16, 40 and 65 h at 15°C and frozen. For the non-stimulated samples, the meat was aged for 0, 12, 36 and 60 h before being frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values obtained from a 1 × 1 cm cross-section. Commencement of ageing was considered to take place at rigor mortis and this was taken as zero aged meat. There were no significant differences in the rate of tenderisation and initial shear force for all treatments. The 23% cook loss was similar for all wrapped and non-wrapped situations and the values decreased slightly with longer ageing durations. Wrapping was shown to mimic meat left intact on the carcass, as it prevented significant prerigor shortening. Such techniques allows muscles to be removed and placed in a controlled temperature environment to enable precise studies of ageing processes.

  18. Biomedical text mining for research rigor and integrity: tasks, challenges, directions.

    Science.gov (United States)

    Kilicoglu, Halil

    2017-06-13

    An estimated quarter of a trillion US dollars is invested in the biomedical research enterprise annually. There is growing alarm that a significant portion of this investment is wasted because of problems in reproducibility of research findings and in the rigor and integrity of research conduct and reporting. Recent years have seen a flurry of activities focusing on standardization and guideline development to enhance the reproducibility and rigor of biomedical research. Research activity is primarily communicated via textual artifacts, ranging from grant applications to journal publications. These artifacts can be both the source and the manifestation of practices leading to research waste. For example, an article may describe a poorly designed experiment, or the authors may reach conclusions not supported by the evidence presented. In this article, we pose the question of whether biomedical text mining techniques can assist the stakeholders in the biomedical research enterprise in doing their part toward enhancing research integrity and rigor. In particular, we identify four key areas in which text mining techniques can make a significant contribution: plagiarism/fraud detection, ensuring adherence to reporting guidelines, managing information overload and accurate citation/enhanced bibliometrics. We review the existing methods and tools for specific tasks, if they exist, or discuss relevant research that can provide guidance for future work. With the exponential increase in biomedical research output and the ability of text mining approaches to perform automatic tasks at large scale, we propose that such approaches can support tools that promote responsible research practices, providing significant benefits for the biomedical research enterprise. Published by Oxford University Press 2017. This work is written by a US Government employee and is in the public domain in the US.

  19. A Computing Environment to Support Repeatable Scientific Big Data Experimentation of World-Wide Scientific Literature

    Energy Technology Data Exchange (ETDEWEB)

    Schlicher, Bob G [ORNL; Kulesz, James J [ORNL; Abercrombie, Robert K [ORNL; Kruse, Kara L [ORNL

    2015-01-01

    A principal tenet of the scientific method is that experiments must be repeatable and rely on ceteris paribus (i.e., all other things being equal). As a scientific community involved in data sciences, we must investigate ways to establish an environment where experiments can be repeated. We can no longer allude to where the data come from; we must add rigor to the data collection and management process from which our analysis is conducted. This paper describes a computing environment to support repeatable scientific big data experimentation on world-wide scientific literature, and recommends a system that is housed at the Oak Ridge National Laboratory in order to provide value to investigators from government agencies, academic institutions, and industry entities. The described computing environment also adheres to the recently instituted digital data management plan mandated by multiple US government agencies, which involves all stages of the digital data life cycle including capture, analysis, sharing, and preservation. It particularly focuses on the sharing and preservation of digital research data. The details of this computing environment are explained within the context of cloud services by the three-layer classification of Software as a Service, Platform as a Service, and Infrastructure as a Service.

  20. "On Clocks and Clouds:" Confirming and Interpreting Climate Models as Scientific Hypotheses (Invited)

    Science.gov (United States)

    Donner, L.

    2009-12-01

    The certainty of climate change projected under various scenarios of emissions using general circulation models is an issue of vast societal importance. Unlike numerical weather prediction, a problem to which general circulation models are also applied, projected climate changes usually lie outside of the range of external forcings for which the models generating these changes have been directly evaluated. This presentation views climate models as complex scientific hypotheses and thereby frames these models within a well-defined process of both advancing scientific knowledge and recognizing its limitations. Karl Popper's Logik der Forschung (The Logic of Scientific Discovery, 1934) and 1965 essay “On Clocks and Clouds” capture well the methodologies and challenges associated with constructing climate models. Indeed, the process of a problem situation generating tentative theories, refined by error elimination, characterizes aptly the routine of general circulation model development. Limitations on certainty arise from the distinction Popper perceived in types of natural processes, which he exemplified by clocks, capable of exact measurement, and clouds, subject only to statistical approximation. Remarkably, the representation of clouds in general circulation models remains the key uncertainty in understanding atmospheric aspects of climate change. The asymmetry of hypothesis falsification by negation and much vaguer development of confidence in hypotheses consistent with some of their implications is an important practical challenge to confirming climate models. The presentation will discuss the ways in which predictions made by climate models for observable aspects of the present and past climate can be regarded as falsifiable hypotheses. The presentation will also include reasons why “passing” these tests does not provide complete confidence in predictions about the future by climate models. Finally, I will suggest that a “reductionist” view, in

  1. Rigorous spin-spin correlation function of Ising model on a special kind of Sierpinski Carpets

    International Nuclear Information System (INIS)

    Yang, Z.R.

    1993-10-01

    We have exactly calculated the rigorous spin-spin correlation function of the Ising model on a special kind of Sierpinski Carpets (SC's) by means of graph expansion and a combinatorial approach and investigated the asymptotic behaviour in the limit of long distance. The results show there is no long-range correlation between spins at any finite temperature, which indicates the absence of a phase transition and thus finally confirms the conclusion produced by the renormalization group method and other physical arguments. (author). 7 refs, 6 figs

  2. An efficient and rigorous thermodynamic library and optimal-control of a cryogenic air separation unit

    DEFF Research Database (Denmark)

    Gaspar, Jozsef; Ritschel, Tobias Kasper Skovborg; Jørgensen, John Bagterp

    2017-01-01

    ...non-linear model-based control to achieve optimal techno-economic performance. Accordingly, this work presents a computationally efficient and novel approach for solving a tray-by-tray equilibrium model and its implementation for open-loop optimal control of a cryogenic distillation column. Here, the optimisation...... objective is to reduce the cost of compression in a volatile electricity market while meeting the production requirements, i.e. product flow rate and purity. This model is implemented in Matlab and uses the ThermoLib rigorous thermodynamic library. The present work represents a first step towards plant...

  3. A study into first-year engineering education success using a rigorous mixed methods approach

    DEFF Research Database (Denmark)

    van den Bogaard, M.E.D.; de Graaff, Erik; Verbraek, Alexander

    2015-01-01

    The aim of this paper is to combine qualitative and quantitative research methods into rigorous research into student success. Research methods have weaknesses that can be overcome by clever combinations. In this paper we use a situated study into student success as an example of how methods...... using statistical techniques. The main elements of the model were student behaviour and student disposition, which were influenced by the students’ perceptions of the education environment. The outcomes of the qualitative studies were useful in interpreting the outcomes of the structural equation...

  4. Supersymmetry and the Parisi-Sourlas dimensional reduction: A rigorous proof

    International Nuclear Information System (INIS)

    Klein, A.; Landau, L.J.; Perez, J.F.

    1984-01-01

    Functional integrals that are formally related to the average correlation functions of a classical field theory in the presence of random external sources are given a rigorous meaning. Their dimensional reduction to the Schwinger functions of the corresponding quantum field theory in two fewer dimensions is proven. This is done by reexpressing those functional integrals as expectations of a supersymmetric field theory. The Parisi-Sourlas dimensional reduction of a supersymmetric field theory to a usual quantum field theory in two fewer dimensions is proven. (orig.)

  5. A Rigorous Treatment of Energy Extraction from a Rotating Black Hole

    Science.gov (United States)

    Finster, F.; Kamran, N.; Smoller, J.; Yau, S.-T.

    2009-05-01

    The Cauchy problem is considered for the scalar wave equation in the Kerr geometry. We prove that by choosing a suitable wave packet as initial data, one can extract energy from the black hole, thereby putting superradiance, the wave analogue of the Penrose process, into a rigorous mathematical framework. We quantify the maximal energy gain. We also compute the infinitesimal change of mass and angular momentum of the black hole, in agreement with Christodoulou's result for the Penrose process. The main mathematical tool is our previously derived integral representation of the wave propagator.

  6. Pre-rigor temperature and the relationship between lamb tenderisation, free water production, bound water and dry matter.

    Science.gov (United States)

    Devine, Carrick; Wells, Robyn; Lowe, Tim; Waller, John

    2014-01-01

    The M. longissimus from lambs electrically stimulated at 15 min post-mortem were removed after grading, wrapped in polythene film and held at 4 (n=6), 7 (n=6), 15 (n=6, n=8) and 35°C (n=6) until rigor mortis, then aged at 15°C for 0, 4, 24 and 72 h post-rigor. Centrifuged free water increased exponentially, and bound water, dry matter and shear force decreased exponentially over time. Decreases in shear force and increases in free water were closely related (r² = 0.52) and were unaffected by pre-rigor temperatures. © 2013.
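
    The exponential time-courses reported here (shear force and bound water falling, free water rising) are the kind of trend typically summarised by fitting a first-order decay toward a plateau. A minimal curve-fitting sketch is shown below; the times and shear values are invented placeholders for illustration only, not the study's measurements.

import numpy as np
from scipy.optimize import curve_fit

def exp_decay(t, f_inf, f0, k):
    """First-order exponential decay from f0 toward a plateau f_inf."""
    return f_inf + (f0 - f_inf) * np.exp(-k * t)

# hypothetical ageing times (h post-rigor) and shear values, for illustration only
t = np.array([0.0, 4.0, 24.0, 72.0])
shear = np.array([95.0, 80.0, 55.0, 45.0])

params, _ = curve_fit(exp_decay, t, shear, p0=(40.0, 95.0, 0.05))
print("plateau, initial value, rate constant:", params)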

  7. Lactate Test

    Science.gov (United States)

  8. Applying the scientific method to small catchment studies: A review of the Panola Mountain experience

    Science.gov (United States)

    Hooper, R.P.

    2001-01-01

    A hallmark of the scientific method is its iterative application to a problem to increase and refine the understanding of the underlying processes controlling it. A successful iterative application of the scientific method to catchment science (including the fields of hillslope hydrology and biogeochemistry) has been hindered by two factors. First, the scale at which controlled experiments can be performed is much smaller than the scale of the phenomenon of interest. Second, computer simulation models generally have not been used as hypothesis-testing tools as rigorously as they might have been. Model evaluation often has gone only so far as evaluation of goodness of fit, rather than a full structural analysis, which is more useful when treating the model as a hypothesis. An iterative application of a simple mixing model to the Panola Mountain Research Watershed is reviewed to illustrate the increase in understanding gained by this approach and to discern general principles that may be applicable to other studies. The lessons learned include the need for an explicitly stated conceptual model of the catchment, the definition of objective measures of its applicability, and a clear linkage between the scale of observations and the scale of predictions. Published in 2001 by John Wiley & Sons. Ltd.
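
    The "simple mixing model" is not spelled out in the abstract; in catchment studies it usually takes the form of an end-member mixing relation, written here in its generic two-component form purely as an assumed illustration:

    C_s = f_1 C_1 + (1 - f_1) C_2, \qquad f_1 = \frac{C_s - C_2}{C_1 - C_2},

    where C_s is the tracer concentration in streamwater, C_1 and C_2 are the end-member concentrations, and f_1 is the fractional contribution of the first end member. Comparing predicted fractions against independent observations is one way such a model can be treated as a falsifiable hypothesis within the iterative loop described above.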

  9. Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound

    Science.gov (United States)

    Shiraishi, Naoto; Tajima, Hiroyasu

    2017-08-01

    A long-standing open problem, whether a heat engine with finite power can achieve the Carnot efficiency, is investigated. We rigorously prove a general trade-off inequality on thermodynamic efficiency and time interval of a cyclic process with quantum heat engines. In a first step, employing the Lieb-Robinson bound we establish an inequality on the change in a local observable caused by an operation far from the support of the local observable. This inequality provides a rigorous characterization of the following intuitive picture: most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this description, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving with non-Markovian dynamics, which almost all previous studies on this topic fail to address.

  10. Rigorous RG Algorithms and Area Laws for Low Energy Eigenstates in 1D

    Science.gov (United States)

    Arad, Itai; Landau, Zeph; Vazirani, Umesh; Vidick, Thomas

    2017-11-01

    One of the central challenges in the study of quantum many-body systems is the complexity of simulating them on a classical computer. A recent advance (Landau et al. in Nat Phys, 2015) gave a polynomial time algorithm to compute a succinct classical description for unique ground states of gapped 1D quantum systems. Despite this progress many questions remained unsolved, including whether there exist efficient algorithms when the ground space is degenerate (and of polynomial dimension in the system size), or for the polynomially many lowest energy states, or even whether such states admit succinct classical descriptions or area laws. In this paper we give a new algorithm, based on a rigorously justified RG type transformation, for finding low energy states for 1D Hamiltonians acting on a chain of n particles. In the process we resolve some of the aforementioned open questions, including giving a polynomial time algorithm for poly(n) degenerate ground spaces and an n^{O(log n)} algorithm for the poly(n) lowest energy states (under a mild density condition). For these classes of systems the existence of a succinct classical description and area laws were not rigorously proved before this work. The algorithms are natural and efficient, and for the case of finding unique ground states for frustration-free Hamiltonians the running time is Õ(n·M(n)), where M(n) is the time required to multiply two n × n matrices.

  11. "Snow White" Coating Protects SpaceX Dragon's Trunk Against Rigors of Space

    Science.gov (United States)

    McMahan, Tracy

    2013-01-01

    He described it as "snow white." But NASA astronaut Don Pettit was not referring to the popular children's fairy tale. Rather, he was talking about the white coating of the Space Exploration Technologies Corp. (SpaceX) Dragon spacecraft that reflected the International Space Station's light. As it approached the station for the first time in May 2012, the Dragon's trunk might have been described as the "fairest of them all," for its pristine coating, allowing Pettit to clearly see to maneuver the robotic arm to grab the Dragon for a successful nighttime berthing. This protective thermal control coating, developed by Alion Science and Technology Corp., based in McLean, Va., made its bright appearance again with the March 1 launch of SpaceX's second commercial resupply mission. Named Z-93C55, the coating was applied to the cargo portion of the Dragon to protect it from the rigors of space. "For decades, Alion has produced coatings to protect against the rigors of space," said Michael Kenny, senior chemist with Alion. "As space missions evolved, there was a growing need to dissipate electrical charges that build up on the exteriors of spacecraft, or there could be damage to the spacecraft's electronics. Alion's research led us to develop materials that would meet this goal while also providing thermal controls. The outcome of this research was Alion's proprietary Z-93C55 coating."

  12. Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II

    Energy Technology Data Exchange (ETDEWEB)

    George J. Koperna Jr.; Vello A. Kuuskraa; David E. Riestenberg; Aiysha Sultana; Tyler Van Leeuwen

    2009-06-01

    This report serves as the final technical report and user's manual for the 'Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II' SBIR project. Advanced Resources International has developed a screening tool by which users can technically screen, assess the storage capacity and quantify the costs of CO2 storage in four types of CO2 storage reservoirs. These include CO2-enhanced oil recovery reservoirs, depleted oil and gas fields (non-enhanced oil recovery candidates), deep coal seams that are amenable to CO2-enhanced methane recovery, and saline reservoirs. The screening function assesses whether the reservoir could likely serve as a safe, long-term CO2 storage reservoir. The storage capacity assessment uses rigorous reservoir simulation models to determine the timing, ultimate storage capacity, and potential for enhanced hydrocarbon recovery. Finally, the economic assessment function determines both the field-level and pipeline (transportation) costs for CO2 sequestration in a given reservoir. The screening tool has been peer reviewed at an Electric Power Research Institute (EPRI) technical meeting in March 2009. A number of useful observations and recommendations emerged from the Workshop on the costs of CO2 transport and storage that could be readily incorporated into a commercial version of the Screening Tool in a Phase III SBIR.
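
    The tool's capacity assessment is simulation-based; for orientation only, the commonly used first-pass volumetric estimate of storage capacity in a saline reservoir (a standard screening formula, not necessarily the tool's internal model) is

    M_{CO_2} = A\, h\, \phi\, \rho_{CO_2}\, E,

    where A is the reservoir area, h the net thickness, φ the porosity, ρ_CO2 the CO2 density at reservoir pressure and temperature, and E a storage-efficiency factor, typically a few percent.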

  13. A Generalized Method for the Comparable and Rigorous Calculation of the Polytropic Efficiencies of Turbocompressors

    Science.gov (United States)

    Dimitrakopoulos, Panagiotis

    2018-03-01

    The calculation of polytropic efficiencies is a very important task, especially during the development of new compression units, like compressor impellers, stages and stage groups. Such calculations are also crucial for the determination of the performance of a whole compressor. As processors and computational capacities have improved substantially in recent years, the need has emerged for a new, rigorous, robust, accurate and at the same time standardized method for computing polytropic efficiencies, especially one based on the thermodynamics of real gases. The proposed method is based on the rigorous definition of the polytropic efficiency. The input consists of pressure and temperature values at the end points of the compression path (suction and discharge), for a given working fluid. The average relative error for the studied cases was 0.536 %. Thus, this high-accuracy method is proposed for efficiency calculations related to turbocompressors and their compression units, especially when they are operating at high power levels, for example in jet engines and high-power plants.
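
    The paper's method works with real-gas thermodynamics along the suction-to-discharge path. As a much simpler point of reference, the ideal-gas, constant-heat-capacity approximation of polytropic efficiency from the same four inputs is sketched below; the isentropic exponent kappa is an assumed property of the working fluid, and this shortcut is not the rigorous method of the paper.

import math

def polytropic_efficiency_ideal(p_suction, t_suction, p_discharge, t_discharge, kappa=1.4):
    """Ideal-gas, constant-cp approximation of polytropic efficiency from
    suction/discharge pressures (Pa) and temperatures (K)."""
    return ((kappa - 1.0) / kappa) * math.log(p_discharge / p_suction) \
           / math.log(t_discharge / t_suction)

# example: air compressed from 1 bar / 293 K to 4 bar / 480 K
print(polytropic_efficiency_ideal(1e5, 293.0, 4e5, 480.0))

    For real gases at elevated pressures this ideal-gas shortcut can deviate appreciably, which is exactly the motivation given for a rigorous real-gas treatment.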

  14. Differential algebras with remainder and rigorous proofs of long-term stability

    International Nuclear Information System (INIS)

    Berz, Martin

    1997-01-01

    It is shown how in addition to determining Taylor maps of general optical systems, it is possible to obtain rigorous interval bounds for the remainder term of the n-th order Taylor expansion. To this end, the three elementary operations of addition, multiplication, and differentiation in the Differential Algebraic approach are augmented by suitable interval operations in such a way that a remainder bound of the sum, product, and derivative is obtained from the Taylor polynomial and remainder bound of the operands. The method can be used to obtain bounds for the accuracy with which a Taylor map represents the true map of the particle optical system. In a more general sense, it is also useful for a variety of other numerical problems, including rigorous global optimization of highly complex functions. Combined with methods to obtain pseudo-invariants of repetitive motion and extensions of the Lyapunov- and Nekhoroshev stability theory, the latter can be used to guarantee stability for storage rings and other weakly nonlinear systems
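
    A Taylor model pairs a truncated polynomial with an interval that rigorously bounds everything the polynomial leaves out. The toy one-variable sketch below shows how remainder intervals propagate through addition and multiplication; it ignores floating-point round-off (which a production implementation such as COSY INFINITY controls by outward rounding) and uses a crude sum-of-absolute-coefficients bound, so it illustrates only the bookkeeping, not the cited implementation.

from dataclasses import dataclass

N = 3                       # truncation order
# the variable is assumed centred and scaled so that its domain is [-1, 1]

def iadd(a, b):             # interval addition
    return (a[0] + b[0], a[1] + b[1])

def imul(a, b):             # interval multiplication
    p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(p), max(p))

def poly_bound(c):
    """Crude enclosure of sum_k c[k] x^k over x in [-1, 1]."""
    b = sum(abs(v) for v in c)
    return (-b, b)

@dataclass
class TaylorModel:
    coeffs: list            # c[0] + c[1] x + ... + c[N] x^N
    rem: tuple              # interval bounding the truncation error

    def __add__(self, other):
        c = [a + b for a, b in zip(self.coeffs, other.coeffs)]
        return TaylorModel(c, iadd(self.rem, other.rem))

    def __mul__(self, other):
        prod = [0.0] * (2 * N + 1)
        for i, a in enumerate(self.coeffs):
            for j, b in enumerate(other.coeffs):
                prod[i + j] += a * b
        kept, spill = prod[:N + 1], prod[N + 1:]
        # remainder: spilled high-order terms plus all cross terms with the remainders
        rem = poly_bound(spill)
        rem = iadd(rem, imul(poly_bound(self.coeffs), other.rem))
        rem = iadd(rem, imul(poly_bound(other.coeffs), self.rem))
        rem = iadd(rem, imul(self.rem, other.rem))
        return TaylorModel(kept, rem)

# x itself as a Taylor model with zero remainder
x = TaylorModel([0.0, 1.0, 0.0, 0.0], (0.0, 0.0))
tm = (x * x + x) * x        # encloses x^3 + x^2 exactly up to order 3
print(tm.coeffs, tm.rem)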

  15. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    Science.gov (United States)

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org ). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  16. Rigorous Combination of GNSS and VLBI: How it Improves Earth Orientation and Reference Frames

    Science.gov (United States)

    Lambert, S. B.; Richard, J. Y.; Bizouard, C.; Becker, O.

    2017-12-01

    Current reference series (C04) of the International Earth Rotation and Reference Systems Service (IERS) are produced by a weighted combination of Earth orientation parameters (EOP) time series built up by combination centers of each technique (VLBI, GNSS, laser ranging, DORIS). In the future, we plan to derive EOP from a rigorous combination of the normal equation systems of the four techniques. We present here the results of a rigorous combination of VLBI and GNSS pre-reduced, constraint-free normal equations with the DYNAMO geodetic analysis software package developed and maintained by the French GRGS (Groupe de Recherche en Géodésie Spatiale). The normal equations used are those produced separately by the IVS and IGS combination centers, to which we apply our own minimal constraints. We address the usefulness of such a method with respect to the classical, a posteriori combination method, and we show whether EOP determinations are improved. In particular, we implement external validations of the EOP series based on comparison with geophysical excitation and examination of the covariance matrices. Finally, we address the potential of the technique for the next generation celestial reference frames, which are currently determined by VLBI only.
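
    The essence of a combination at the normal-equation level, as opposed to combining the techniques' individual EOP series a posteriori, can be written compactly. With N_k and b_k the constraint-free normal matrix and right-hand side contributed by technique k, optional technique weights w_k, and minimal-constraint terms (C, c), the combined estimate follows from a single solution

    \Big( \sum_k w_k N_k + C \Big)\, \hat{x} = \sum_k w_k b_k + c,

    so that the common parameters (EOP, station coordinates) are adjusted consistently across techniques rather than averaged afterwards. This is the generic form of the approach; the specific weighting and constraint handling in DYNAMO is not detailed in the abstract.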

  17. Rigorous high-precision enclosures of fixed points and their invariant manifolds

    Science.gov (United States)

    Wittig, Alexander N.

    The well established concept of Taylor Models is introduced, which offer highly accurate C0 enclosures of functional dependencies, combining high-order polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly non-linear dynamical systems. A method is proposed to extend the existing implementation of Taylor Models in COSY INFINITY from double precision coefficients to arbitrary precision coefficients. Great care is taken to maintain the highest efficiency possible by adaptively adjusting the precision of higher order coefficients in the polynomial expansion. High precision operations are based on clever combinations of elementary floating point operations yielding exact values for round-off errors. An experimental high precision interval data type is developed and implemented. Algorithms for the verified computation of intrinsic functions based on the High Precision Interval datatype are developed and described in detail. The application of these operations in the implementation of High Precision Taylor Models is discussed. An application of Taylor Model methods to the verification of fixed points is presented by verifying the existence of a period 15 fixed point in a near standard Henon map. Verification is performed using different verified methods such as double precision Taylor Models, High Precision intervals and High Precision Taylor Models. Results and performance of each method are compared. An automated rigorous fixed point finder is implemented, allowing the fully automated search for all fixed points of a function within a given domain. It returns a list of verified enclosures of each fixed point, optionally verifying uniqueness within these enclosures. An application of the fixed point finder to the rigorous analysis of beam transfer maps in accelerator physics is presented. Previous work done by

  18. Virtual reconstruction of the roman villa in La Quintilla (Lorca based on the existing physical evidence and related scientific comparisons as a reference

    Directory of Open Access Journals (Sweden)

    Sebastián F. Ramallos Asensio

    2013-11-01

    The virtual reconstruction of archaeological sites using computer graphics is a very important tool for the verification or refutation of hypotheses in scientific research. It is also an excellent way to disseminate heritage through realistic images produced with scientific rigor.

  19. Scientific meeting abstracts

    International Nuclear Information System (INIS)

    1999-01-01

    The document is a collection of the scientific meeting abstracts in the fields of nuclear physics, medical sciences, chemistry, agriculture, environment, engineering, and different aspects of energy, and presents research done in these fields in 1999.

  20. Identifying Strategic Scientific Opportunities

    Science.gov (United States)

    As NCI's central scientific strategy office, CRS collaborates with the institute's divisions, offices, and centers to identify research opportunities to advance NCI's vision for the future of cancer research.

  1. Visualization in scientific computing

    National Research Council Canada - National Science Library

    Nielson, Gregory M; Shriver, Bruce D; Rosenblum, Lawrence J

    1990-01-01

    The purpose of this text is to provide a reference source to scientists, engineers, and students who are new to scientific visualization or who are interested in expanding their knowledge in this subject...

  2. The Scientific Enterprise

    Indian Academy of Sciences (India)

    Srimath

    The phrase pre-modern scientific may be used to describe certain attitudes and ..... But unfortunately, in the general atmosphere of poor education and collective fears .... present day science and technology that old time beliefs and traditional ...

  3. WITHER SCIENTIFIC AND TECHNOLOGICAL

    African Journals Online (AJOL)

    No library or information service and especially in a developing .... Good public relations, consultancy services including bilateral and ... project proposal for the creation of a scientific and technological information ... For example, in 1995 the ...

  4. Shaping a Scientific Self

    DEFF Research Database (Denmark)

    Andrade-Molina, Melissa; Valero, Paola

    ...us to understand how a truth is reproduced, circulating among diverse fields of human knowledge. It will also show why we accept and reproduce a particular discourse. Finally, we state Euclidean geometry as a truth that circulates in scientific discourse and performs a scientific self. We unfold...... the importance of having students follow the path of what schools perceive a real scientist to be, not to become a scientist, but to become a logical thinker, a problem solver, a productive citizen who uses reason.

  5. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    The paper systematizes several theoretical viewpoints on the skill of scientific information processing. It decomposes the processing skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction and document analysis, were used to build up a theoretical framework. Interviews and surveys of professionals in training, together with a case study, were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  6. Open scientific communication urged

    Science.gov (United States)

    Richman, Barbara T.

    In a report released last week the National Academy of Sciences' Panel on Scientific Communication and National Security concluded that the ‘limited and uncertain benefits’ of controls on the dissemination of scientific and technological research are ‘outweighed by the importance of scientific progress, which open communication accelerates, to the overall welfare of the nation.’ The 18-member panel, chaired by Dale R. Corson, president emeritus of Cornell University, was created last spring (Eos, April 20, 1982, p. 241) to examine the delicate balance between open dissemination of scientific and technical information and the U.S. government's desire to protect scientific and technological achievements from being translated into military advantages for our political adversaries.The panel dealt almost exclusively with the relationship between the United States and the Soviet Union but noted that there are ‘clear problems in scientific communication and national security involving Third World countries.’ Further study of this matter is necessary.

  7. Cost evaluation of cellulase enzyme for industrial-scale cellulosic ethanol production based on rigorous Aspen Plus modeling.

    Science.gov (United States)

    Liu, Gang; Zhang, Jian; Bao, Jie

    2016-01-01

    Reducing the cost of cellulase enzyme usage has been the central effort in the commercialization of fuel ethanol production from lignocellulose biomass. Therefore, establishing an accurate method for evaluating cellulase enzyme cost is crucially important to support the healthy development of the future biorefinery industry. Currently, cellulase cost evaluation methods are complicated, and various controversial or even conflicting results have been presented. To give a reliable evaluation on this important topic, a rigorous analysis based on Aspen Plus flowsheet simulation of a commercial-scale ethanol plant was proposed in this study. The minimum ethanol selling price (MESP) was used as the indicator to show the impacts of varying enzyme supply modes, enzyme prices, process parameters, as well as enzyme loading on the enzyme cost. The results reveal that the enzyme cost drives the cellulosic ethanol price below the minimum profit point when the enzyme is purchased from the current industrial enzyme market. Innovative modes of cellulase production, such as on-site enzyme production, should be explored and tested at industrial scale to yield an economically sound enzyme supply for future cellulosic ethanol production.
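
    The sensitivity of the MESP to enzyme supply can be illustrated by the simplest piece of the cost accounting: the enzyme cost carried by each litre of ethanol. The sketch below uses a deliberately simple per-litre calculation with invented placeholder figures; the enzyme loading, price and ethanol yield are not taken from the Aspen Plus study.

def enzyme_cost_per_liter(enzyme_loading_kg_per_t, enzyme_price_usd_per_kg,
                          ethanol_yield_l_per_t):
    """Enzyme cost contribution per litre of ethanol; inputs are per tonne of dry biomass.
    All numbers used below are illustrative placeholders, not values from the study."""
    return enzyme_loading_kg_per_t * enzyme_price_usd_per_kg / ethanol_yield_l_per_t

# e.g. 10 kg enzyme protein per tonne at 5 USD/kg and 280 L ethanol per tonne
print(round(enzyme_cost_per_liter(10.0, 5.0, 280.0), 3), "USD per litre")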

  8. Semi-physical Simulation of the Airborne InSAR based on Rigorous Geometric Model and Real Navigation Data

    Science.gov (United States)

    Changyong, Dou; Huadong, Guo; Chunming, Han; yuquan, Liu; Xijuan, Yue; Yinghui, Zhao

    2014-03-01

    Raw signal simulation is a useful tool for the system design, mission planning, processing algorithm testing, and inversion algorithm design of Synthetic Aperture Radar (SAR). Due to the wide and highly frequent variation of the aircraft's trajectory and attitude, and the low accuracy of the Position and Orientation System (POS) recording data, it is difficult to quantitatively study the sensitivity of the key parameters of the airborne Interferometric SAR (InSAR) system, i.e., the baseline length and inclination, the absolute phase and the orientation of the antennas, resulting in challenges for its applications. Furthermore, the imprecise estimation of the installation offset between the Global Positioning System (GPS), Inertial Measurement Unit (IMU) and the InSAR antennas compounds the issue. An airborne InSAR simulation based on the rigorous geometric model and real navigation data is proposed in this paper, providing a way for quantitatively studying the key parameters and for evaluating the effect of the parameters on the applications of airborne InSAR, such as photogrammetric mapping, high-resolution Digital Elevation Model (DEM) generation, and surface deformation measurement by differential InSAR technology. The simulation can also provide reference for the optimal design of the InSAR system and the improvement of InSAR data processing technologies such as motion compensation, imaging, image co-registration, and application parameter retrieval.
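
    The sensitivity of the interferometric phase to the baseline parameters that the simulation is meant to study follows, to first order, the standard InSAR relation, quoted here as background rather than as the paper's rigorous geometric model:

    \Delta\phi \approx \frac{2\pi p\, B_\perp}{\lambda\, R \sin\theta}\, \Delta h, \qquad p \in \{1, 2\},

    where B_⊥ is the perpendicular baseline, λ the wavelength, R the slant range, θ the look angle, Δh the topographic height, and p depends on whether one or both antennas transmit. Errors in B_⊥ or in the baseline inclination therefore map directly into height errors, which is why a simulation driven by real navigation data is useful for quantifying them.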

  10. Dosimetric effects of edema in permanent prostate seed implants: a rigorous solution

    International Nuclear Information System (INIS)

    Chen Zhe; Yue Ning; Wang Xiaohong; Roberts, Kenneth B.; Peschel, Richard; Nath, Ravinder

    2000-01-01

    Purpose: To derive a rigorous analytic solution to the dosimetric effects of prostate edema so that its impact on the conventional pre-implant and post-implant dosimetry can be studied for any given radioactive isotope and edema characteristics. Methods and Materials: The edema characteristics observed by Waterman et al. (Int. J. Radiat. Oncol. Biol. Phys. 41:1069-1077; 1998) were used to model the time evolution of the prostate and the seed locations. The total dose to any part of prostate tissue from a seed implant was calculated analytically by parameterizing the dose fall-off from a radioactive seed as a single inverse power function of distance, with proper account of the edema-induced time evolution. The dosimetric impact of prostate edema was determined by comparing the dose calculated with full consideration of prostate edema to that calculated with the conventional dosimetry approach where the seed locations and the target volume are assumed to be stationary. Results: A rigorous analytic solution for the relative dosimetric effects of prostate edema was obtained. This solution proved explicitly that the relative dosimetric effects of edema, as found in the previous numerical studies by Yue et al. (Int. J. Radiat. Oncol. Biol. Phys. 43, 447-454, 1999), are independent of the size and the shape of the implant target volume and are independent of the number and the locations of the seeds implanted. It also showed that the magnitude of the relative dosimetric effects is independent of the location of the dose evaluation point within the edematous target volume. This implies that the relative dosimetric effects of prostate edema are universal with respect to a given isotope and edema characteristic. A set of master tables for the relative dosimetric effects of edema were obtained for a wide range of edema characteristics for both 125I and 103Pd prostate seed implants. Conclusions: A rigorous analytic solution of the relative dosimetric effects of prostate edema has been...
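
    The ingredients named in the abstract can be written schematically. With the dose rate from a seed parameterized as a single inverse power of distance and the seed-to-point distance expanding and then relaxing with the edema (isotropic expansion is assumed here purely for illustration, with the exponential edema-resolution form following the Waterman characterization cited above; this is a sketch of the structure, not the paper's derivation), the accumulated dose takes the form

    D = \int_0^{\infty} \dot{D}_1\, e^{-\lambda_s t}\, \big[r(t)\big]^{-n}\, dt, \qquad r(t) = r_0\,\big[\,1 + \Delta\, e^{-\lambda_e t}\,\big]^{1/3},

    where λ_s is the decay constant of the implanted isotope, λ_e the edema resolution rate, Δ the relative edema magnitude, and n the effective inverse-power exponent of the dose fall-off. Because the edema enters only through a dimensionless factor multiplying r_0, the ratio of the edema-aware dose to the conventional static-geometry dose is the same at every evaluation point, which is consistent with the universality result stated in the abstract.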

  11. Scientific-creative thinking and academic achievement

    Directory of Open Access Journals (Sweden)

    Rosario Bermejo

    2014-07-01

    Full Text Available The aim of this work is to study the relationship between the scientific-creative thinking construct and academic performance in a sample of adolescents; the reliability of the scientific-creative thinking instrument is also tested. The sample was composed of 98 students (aged between 12 and 16 years old) attending a secondary school in the Murcia Region (Spain). The instruments used were: (a) the Scientific-Creative Thinking Test designed by Hu and Adey (2002), adapted to the Spanish culture by the High Abilities research team at Murcia University; the test is composed of 7 tasks based on the Scientific Creative Structure Model and assesses the dimensions of fluency, flexibility, and originality; (b) the General and Factorial Intelligence Test (IGF/5r; Yuste, 2002), which assesses general intelligence as well as logical, verbal, numerical, and spatial reasoning; (c) students' academic achievement by domain (scientific-technological, social-linguistic, and artistic). The results showed positive and statistically significant correlations between the scientific-creative tasks and academic achievement in the different domains.

  12. Study of the quality characteristics in cold-smoked salmon (Salmo salar) originating from pre- or post-rigor raw material.

    Science.gov (United States)

    Birkeland, S; Akse, L

    2010-01-01

    Improved slaughtering procedures in the salmon industry have delayed the onset of rigor mortis and thus opened the possibility of pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at the time of processing on quality traits (color, texture, sensory, and microbiological) in injection-salted, cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P<0.05) difference from post-rigor processed fillets; post-rigor fillets (1477 ± 38 g) had significantly higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets also differed significantly (P<0.05) from post-rigor fillets (37.8 ± 0.8) and showed significantly lower values than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when suitable injection-salting protocols and smoking techniques are used. © 2010 Institute of Food Technologists®

  13. Paulo Leminski: a study of rigor and looseness in his poetry

    OpenAIRE

    Dhynarte de Borba e Albuquerque

    2005-01-01

    This work examines the trajectory of Paulo Leminski's poetry, seeking to establish the terms of its humor, its metalinguistic inquiry, and its lyric self, a poetry that also displays traces of the marginal poetry of the 1970s. Leminski was an author who pursued Concretist rigor through the procedures of more or less relaxed everyday speech. The poetic effort of the Curitiba-born Leminski is a "line that never ends": he wrote poems, novels, advertising pieces, song lyrics, and translations. In t...

  14. Rigorous decoupling between edge states in frustrated spin chains and ladders

    Science.gov (United States)

    Chepiga, Natalia; Mila, Frédéric

    2018-05-01

    We investigate the occurrence of exact zero modes in one-dimensional quantum magnets of finite length that possess edge states. Building on conclusions first reached in the context of the spin-1/2 XY chain in a field and then for the spin-1 J1-J2 Heisenberg model, we show that the development of incommensurate correlations in the bulk invariably leads to oscillations in the sign of the coupling between edge states, and hence to exact zero energy modes at the crossing points where the coupling between the edge states rigorously vanishes. This is true regardless of the origin of the frustration (e.g., next-nearest-neighbor coupling or biquadratic coupling for the spin-1 chain), of the value of the bulk spin (we report on spin-1/2, spin-1, and spin-2 examples), and of the value of the edge-state emergent spin (spin-1/2 or spin-1).

  15. Guidelines for conducting rigorous health care psychosocial cross-cultural/language qualitative research.

    Science.gov (United States)

    Arriaza, Pablo; Nedjat-Haiem, Frances; Lee, Hee Yun; Martin, Shadi S

    2015-01-01

    The purpose of this article is to synthesize and chronicle the authors' experiences as four bilingual and bicultural researchers, each experienced in conducting cross-cultural/cross-language qualitative research. Through narrative descriptions of experiences with Latinos, Iranians, and Hmong refugees, the authors discuss their rewards, challenges, and methods of enhancing rigor, trustworthiness, and transparency when conducting cross-cultural/cross-language research. The authors discuss and explore how to effectively manage cross-cultural qualitative data, how to effectively use interpreters and translators, how to identify best methods of transcribing data, and the role of creating strong community relationships. The authors provide guidelines for health care professionals to consider when engaging in cross-cultural qualitative research.

  16. Release of major ions during rigor mortis development in kid Longissimus dorsi muscle.

    Science.gov (United States)

    Feidt, C; Brun-Bellut, J

    1999-01-01

    Ionic strength plays an important role in post mortem muscle changes; its increase is due to ion release during the development of rigor mortis. Twelve alpine kids were used to study the effects of chilling and meat pH on ion release. Free ions were measured in the Longissimus dorsi muscle by capillary electrophoresis after water extraction. All free ion concentrations increased after death, but there were differences between ions. In contrast to the ultimate pH value, temperature was not a factor affecting ion release. Three release mechanisms are believed to coexist: passive binding to proteins, which stops as pH decreases; active segregation, which stops as ATP disappears; and the production of metabolites by anaerobic glycolysis.

  17. Rigorous approach to the comparison between experiment and theory in Casimir force measurements

    International Nuclear Information System (INIS)

    Klimchitskaya, G L; Chen, F; Decca, R S; Fischbach, E; Krause, D E; Lopez, D; Mohideen, U; Mostepanenko, V M

    2006-01-01

    In most experiments on the Casimir force the comparison between measurement data and theory was done using the concept of the root-mean-square deviation, a procedure that has been criticized in the literature. Here we propose a special statistical analysis which should be performed separately for the experimental data and for the results of the theoretical computations. In so doing, the random, systematic and total experimental errors are found as functions of separation, taking into account the distribution laws for each error at 95% confidence. Independently, all theoretical errors are combined to obtain the total theoretical error at the same confidence. Finally, the confidence interval for the differences between theoretical and experimental values is obtained as a function of separation. This rigorous approach is applied to two recent experiments on the Casimir effect
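
    The following Python sketch illustrates the general shape of such a comparison, with experimental and theoretical errors combined in quadrature for brevity (the paper prescribes a more careful treatment of the individual distribution laws at 95% confidence); all numbers are invented for illustration.

        # Hedged sketch: confidence interval for (theory - experiment) vs. separation.
        import numpy as np

        sep      = np.array([200.0, 300.0, 400.0, 500.0])  # separation [nm] (assumed)
        F_exp    = np.array([-12.0, -4.1, -1.9, -1.0])     # measured force [pN] (assumed)
        err_rand = np.array([0.35, 0.30, 0.28, 0.27])      # random error at 95% conf. (assumed)
        err_syst = np.array([0.20, 0.18, 0.17, 0.16])      # systematic error at 95% conf. (assumed)
        F_theo   = np.array([-11.8, -4.0, -2.0, -1.05])    # computed force [pN] (assumed)
        err_theo = np.array([0.25, 0.20, 0.18, 0.17])      # total theoretical error (assumed)

        err_exp  = np.sqrt(err_rand**2 + err_syst**2)      # total experimental error
        err_diff = np.sqrt(err_exp**2 + err_theo**2)       # error of the difference
        diff     = F_theo - F_exp

        for z, d, e in zip(sep, diff, err_diff):
            verdict = "consistent at 95%" if abs(d) <= e else "discrepant"
            print(f"z = {z:5.0f} nm: theory - experiment = {d:+5.2f} +/- {e:4.2f} pN  ({verdict})")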

  18. Direct integration of the S-matrix applied to rigorous diffraction

    International Nuclear Information System (INIS)

    Iff, W; Lindlein, N; Tishchenko, A V

    2014-01-01

    A novel Fourier method for rigorous diffraction computation at periodic structures is presented. The procedure is based on a differential equation for the S-matrix, which allows direct integration of the S-matrix blocks. This results in a new method in Fourier space, which can be considered a numerically stable and well-parallelizable alternative to the conventional differential method based on T-matrix integration and subsequent conversion of the T-matrices to S-matrix blocks. Integration of the novel differential equation in an implicit manner is expounded. The applicability of the new method is shown on the basis of 1D periodic structures; it is clear, however, that the new technique can also be applied to arbitrary 2D periodic or periodized structures. The complexity of the new method is O(N³), similar to that of the conventional differential method, with N being the number of diffraction orders. (fast track communication)

  19. Rigorous description of holograms of particles illuminated by an astigmatic elliptical Gaussian beam

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, Y J; Ren, K F; Coetmellec, S; Lebrun, D, E-mail: fang.ren@coria.f [UMR 6614/CORIA, CNRS and Universite et INSA de Rouen Avenue de l' Universite BP 12, 76801 Saint Etienne du Rouvray (France)

    2009-02-01

    Digital holography is a non-intrusive optical metrology technique well adapted to measuring the size and velocity field of particles in a fluid spray. The simplified model of an opaque disk is often used in the treatment of the diagrams, and therefore the refraction and the third-dimension diffraction of the particle are not taken into account. In this paper we present a rigorous description of the holographic diagrams and evaluate the effects of refraction and third-dimension diffraction by comparison with the opaque-disk model. The effects are found to be important when the real part of the refractive index is near unity or when the imaginary part is non-zero but small.

  20. A new method for deriving rigorous results on ππ scattering

    International Nuclear Information System (INIS)

    Caprini, I.; Dita, P.

    1979-06-01

    We develop a new approach to the problem of constraining the ππ scattering amplitudes by means of the axiomatically proved properties of unitarity, analyticity and crossing symmetry. The method is based on the solution of an extremal problem on a convex set of analytic functions and provides a global description of the domain of values taken by any finite number of partial waves at an arbitrary set of unphysical energies, compatible with unitarity, the bounds at complex energies derived from generalized dispersion relations, and the crossing integral relations. From this domain we obtain new absolute bounds for the amplitudes as well as rigorous correlations between the values of various partial waves. (author)

  1. Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation

    Science.gov (United States)

    Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe

    2018-04-01

    In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.
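
    As a toy illustration of the radii-polynomial argument mentioned above, reduced from a weighted ℓ1 space of Fourier coefficients to a single scalar equation, the sketch below bounds the defect, the contraction rate, and a Lipschitz constant for a Newton-like operator and then looks for radii at which the radii polynomial is negative (a genuine computer-assisted proof would use interval arithmetic rather than floating point).

        # Toy radii-polynomial check for F(x) = x^2 - 2 around a numerical root x_bar.
        # Existence (and uniqueness) of a true zero within radius r of x_bar follows
        # whenever p(r) = Z2*r**2 - (1 - Z1)*r + Y0 < 0.
        import numpy as np

        F  = lambda x: x**2 - 2.0
        dF = lambda x: 2.0 * x

        x_bar = 1.414                  # approximate root of F
        A     = 1.0 / dF(x_bar)        # approximate inverse of the derivative

        Y0 = abs(A * F(x_bar))         # defect bound  |A F(x_bar)|
        Z1 = abs(1.0 - A * dF(x_bar))  # bound on |1 - A F'(x_bar)|  (zero here)
        Z2 = abs(A) * 2.0              # Lipschitz bound: |A (F'(x) - F'(x_bar))| <= Z2 * r

        r  = np.linspace(1e-8, 1.0, 200_000)
        p  = Z2 * r**2 - (1.0 - Z1) * r + Y0
        ok = r[p < 0.0]
        if ok.size:
            print(f"radii polynomial negative for r in [{ok[0]:.2e}, {ok[-1]:.2e}];")
            print(f"a true zero of F therefore lies within {ok[0]:.2e} of x_bar = {x_bar}")
        else:
            print("no admissible radius found; the proof fails for this x_bar")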

  2. Study design elements for rigorous quasi-experimental comparative effectiveness research.

    Science.gov (United States)

    Maciejewski, Matthew L; Curtis, Lesley H; Dowd, Bryan

    2013-03-01

    Quasi-experiments are likely to be the workhorse study design used to generate evidence about the comparative effectiveness of alternative treatments, because of their feasibility, timeliness, affordability and external validity compared with randomized trials. In this review, we outline potential sources of discordance in results between quasi-experiments and experiments, review study design choices that can improve the internal validity of quasi-experiments, and outline innovative data linkage strategies that may be particularly useful in quasi-experimental comparative effectiveness research. There is an urgent need to resolve the debate about the evidentiary value of quasi-experiments since equal consideration of rigorous quasi-experiments will broaden the base of evidence that can be brought to bear in clinical decision-making and governmental policy-making.

  3. Increasing rigor in NMR-based metabolomics through validated and open source tools.

    Science.gov (United States)

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2017-02-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.

  4. A rigorous phenomenological analysis of the ππ scattering lengths

    International Nuclear Information System (INIS)

    Caprini, I.; Dita, P.; Sararu, M.

    1979-11-01

    The constraining power of the present experimental data, combined with the general theoretical knowledge about ππ scattering, upon the scattering lengths of this process, is investigated by means of a rigorous functional method. We take as input the experimental phase shifts and make no hypotheses about the high energy behaviour of the amplitudes, using only absolute bounds derived from axiomatic field theory and exact consequences of crossing symmetry. In the simplest application of the method, involving only the π⁰π⁰ S-wave, we explored numerically a number of values proposed by various authors for the scattering lengths a₀ and a₂ and found that no one appears to be especially favoured. (author)

  5. New tools for Content Innovation and data sharing: Enhancing reproducibility and rigor in biomechanics research.

    Science.gov (United States)

    Guilak, Farshid

    2017-03-21

    We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Early rigorous control interventions can largely reduce dengue outbreak magnitude: experience from Chaozhou, China.

    Science.gov (United States)

    Liu, Tao; Zhu, Guanghu; He, Jianfeng; Song, Tie; Zhang, Meng; Lin, Hualiang; Xiao, Jianpeng; Zeng, Weilin; Li, Xing; Li, Zhihao; Xie, Runsheng; Zhong, Haojie; Wu, Xiaocheng; Hu, Wenbiao; Zhang, Yonghui; Ma, Wenjun

    2017-08-02

    Dengue fever is a severe public health challenge in south China. A dengue outbreak was reported in Chaozhou city, China, in 2015, and intensified interventions were implemented by the government to control the epidemic. However, it remains unknown to what degree the intensified control measures reduced the size of the epidemic, and when such measures should be initiated to reduce the risk of large dengue outbreaks developing. We selected Xiangqiao district as the study setting because the majority of the indigenous cases (90.6%) in Chaozhou city were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI) data were reported by the local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed) model to assess the effectiveness of the control interventions and to evaluate the effect of intervention timing on the dengue epidemic. A total of 1250 indigenous dengue cases was reported from Xiangqiao district. The SEIR modeling using BI as an indicator of the actual control interventions yielded a total of 1255 dengue cases, close to the reported number (n = 1250). The size and duration of the outbreak were highly sensitive to the intensity and timing of interventions: the more rigorous and the earlier the control interventions were implemented, the more effective they were. Even when the interventions were initiated several weeks after the onset of the dengue outbreak, they were still shown to greatly affect the prevalence and duration of the outbreak. This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.
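
    A minimal sketch of the kind of compartmental experiment described above (a single-population SEIR model with a step reduction in transmission at the intervention date, not the authors' BI-driven, vector-coupled model); every parameter value is an assumption chosen for illustration.

        # Hedged SEIR sketch: effect of intervention timing on final outbreak size.
        import numpy as np
        from scipy.integrate import solve_ivp

        N     = 500_000          # district population (assumed)
        beta0 = 0.60             # baseline transmission rate [1/day] (assumed)
        sigma = 1.0 / 6.0        # 1 / incubation period [1/day] (assumed)
        gamma = 1.0 / 5.0        # 1 / infectious period [1/day] (assumed)
        control_efficacy = 0.70  # fractional reduction of beta after intervention (assumed)

        def seir(t, y, t_intervene):
            S, E, I, R = y
            beta = beta0 * (1.0 - control_efficacy) if t >= t_intervene else beta0
            dS = -beta * S * I / N
            dE =  beta * S * I / N - sigma * E
            dI =  sigma * E - gamma * I
            dR =  gamma * I
            return [dS, dE, dI, dR]

        y0 = [N - 10, 0, 10, 0]                    # seed the outbreak with 10 infectious cases
        for t_intervene in (20, 40, 60, 1e9):      # intervention start day (1e9 ~ never)
            sol = solve_ivp(seir, (0, 200), y0, args=(t_intervene,), max_step=0.5)
            total_cases = sol.y[2][-1] + sol.y[3][-1]   # I + R at the end ~ outbreak size
            label = "no intervention" if t_intervene > 1e6 else f"day {int(t_intervene)}"
            print(f"intervention at {label:>15}: final outbreak size ~ {total_cases:,.0f}")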

  7. Rigorous Multicomponent Reactive Separations Modelling: Complete Consideration of Reaction-Diffusion Phenomena

    International Nuclear Information System (INIS)

    Ahmadi, A.; Meyer, M.; Rouzineau, D.; Prevost, M.; Alix, P.; Laloue, N.

    2010-01-01

    This paper gives the first step in the development of a rigorous multicomponent reactive separation model. Such a model is essential for the further optimization of acid-gas removal plants (CO2 capture, gas treating, etc.) in terms of size and energy consumption, since chemical solvents are conventionally used. Firstly, the two main modelling approaches are presented: the equilibrium-based and the rate-based approaches. Secondly, an extended rate-based model with a rigorous modelling methodology for diffusion-reaction phenomena is proposed. The film theory and the generalized Maxwell-Stefan equations are used in order to characterize multicomponent interactions. The complete chain of chemical reactions is taken into account. The reactions can be kinetically controlled or at chemical equilibrium, and they are considered for both the liquid film and the liquid bulk. Thirdly, the method of numerical resolution is described. Coupling the generalized Maxwell-Stefan equations with chemical equilibrium equations leads to a highly non-linear Differential-Algebraic Equation system of index 3 (DAE index 3). The set of equations is discretized with finite differences, as its integration by the Gear method is complex. The resulting algebraic system is solved by the Newton-Raphson method. Finally, the present model and the associated methods of numerical resolution are validated for the example of the esterification of methanol. This archetypal non-electrolytic system permits an interesting analysis of the impact of reaction on mass transfer, especially near the phase interface. The numerical resolution of the model by the Newton-Raphson method gives good results in terms of calculation time and convergence. The simulations show that the impact on mass transfer of reactions at chemical equilibrium and of kinetically controlled reactions with fast kinetics is relatively similar. Moreover, Fick's law is less well adapted for multicomponent mixtures where some anomalies such as counter
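
    The Newton-Raphson step named above, shown on a deliberately tiny algebraic system rather than the discretized DAE model (the example equations are arbitrary and serve only to illustrate the iteration):

        # Hedged sketch of a Newton-Raphson solve for a small nonlinear system.
        import numpy as np

        def F(x):
            x1, x2 = x
            return np.array([x1**2 + x2**2 - 4.0,      # residual 1
                             np.exp(x1) + x2 - 1.0])   # residual 2

        def J(x):
            x1, x2 = x
            return np.array([[2.0 * x1,   2.0 * x2],
                             [np.exp(x1), 1.0     ]])  # analytic Jacobian

        x = np.array([-1.0, 1.0])                      # initial guess (assumed)
        for it in range(50):
            dx = np.linalg.solve(J(x), -F(x))          # Newton step
            x += dx
            if np.linalg.norm(dx) < 1e-12:
                break
        print(f"converged in {it + 1} iterations to x = {x}")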

  8. Rigor mortis development at elevated temperatures induces pale exudative turkey meat characteristics.

    Science.gov (United States)

    McKee, S R; Sams, A R

    1998-01-01

    Development of rigor mortis at elevated post-mortem temperatures may contribute to turkey meat characteristics similar to those found in pale, soft, exudative pork. To evaluate this effect, 36 Nicholas tom turkeys were processed at 19 wk of age and placed in water at 40, 20, and 0 C immediately after evisceration. Pectoralis muscle samples were taken at 15 min, 30 min, 1 h, 2 h, and 4 h post-mortem and analyzed for R-value (an indirect measure of adenosine triphosphate), glycogen, pH, color, and sarcomere length. At 4 h, the remaining intact Pectoralis muscle was harvested, aged on ice for 23 h, and analyzed for drip loss, cook loss, shear values, and sarcomere length. By 15 min post-mortem, the 40 C treatment had higher R-values, which persisted through 4 h. By 1 h, the pH and glycogen levels in the 40 C treatment were lower than in the 0 C treatment; however, they did not differ from those of the 20 C treatment. Increased L* values indicated that color became more pale by 2 h post-mortem in the 40 C treatment compared with the 20 and 0 C treatments. Drip loss, cook loss, and shear values were increased, whereas sarcomere lengths were decreased, as a result of the 40 C treatment. These findings suggest that elevated post-mortem temperatures during processing accelerated rigor mortis and produced biochemical changes in the muscle that resulted in pale, exudative meat characteristics in turkey.

  9. Early rigorous control interventions can largely reduce dengue outbreak magnitude: experience from Chaozhou, China

    Directory of Open Access Journals (Sweden)

    Tao Liu

    2017-08-01

    Full Text Available Abstract Background Dengue fever is a severe public health challenge in south China. A dengue outbreak was reported in Chaozhou city, China in 2015. Intensified interventions were implemented by the government to control the epidemic. However, it remains unknown to what degree the intensified control measures reduced the size of the epidemic, and when such measures should be initiated to reduce the risk of large dengue outbreaks developing. Methods We selected Xiangqiao district as the study setting because the majority of the indigenous cases (90.6%) in Chaozhou city were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI) data were reported by the local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed) model to assess the effectiveness of the control interventions and to evaluate the effect of intervention timing on the dengue epidemic. Results A total of 1250 indigenous dengue cases was reported from Xiangqiao district. The SEIR modeling using BI as an indicator of the actual control interventions yielded a total of 1255 dengue cases, close to the reported number (n = 1250). The size and duration of the outbreak were highly sensitive to the intensity and timing of interventions: the more rigorous and the earlier the control interventions were implemented, the more effective they were. Even when the interventions were initiated several weeks after the onset of the dengue outbreak, they were still shown to greatly affect the prevalence and duration of the outbreak. Conclusions This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.

  10. Standards and Methodological Rigor in Pulmonary Arterial Hypertension Preclinical and Translational Research.

    Science.gov (United States)

    Provencher, Steeve; Archer, Stephen L; Ramirez, F Daniel; Hibbert, Benjamin; Paulin, Roxane; Boucherat, Olivier; Lacasse, Yves; Bonnet, Sébastien

    2018-03-30

    Despite advances in our understanding of the pathophysiology and the management of pulmonary arterial hypertension (PAH), significant therapeutic gaps remain for this devastating disease. Yet, few innovative therapies beyond the traditional pathways of endothelial dysfunction have reached clinical trial phases in PAH. Although there are inherent limitations of the currently available models of PAH, the leaky pipeline of innovative therapies relates, in part, to flawed preclinical research methodology, including lack of rigour in trial design, incomplete invasive hemodynamic assessment, and lack of careful translational studies that replicate randomized controlled trials in humans with attention to adverse effects and benefits. Rigorous methodology should include the use of prespecified eligibility criteria, sample sizes that permit valid statistical analysis, randomization, blinded assessment of standardized outcomes, and transparent reporting of results. Better design and implementation of preclinical studies can minimize inherent flaws in the models of PAH, reduce the risk of bias, and enhance external validity and our ability to distinguish truly promising therapies from many false-positive or overstated leads. Ideally, preclinical studies should use advanced imaging, study several preclinical pulmonary hypertension models, or correlate rodent and human findings and consider the fate of the right ventricle, which is the major determinant of prognosis in human PAH. Although these principles are widely endorsed, empirical evidence suggests that such rigor is often lacking in pulmonary hypertension preclinical research. The present article discusses the pitfalls in the design of preclinical pulmonary hypertension trials and discusses opportunities to create preclinical trials with improved predictive value in guiding early-phase drug development in patients with PAH, which will need support not only from researchers, peer reviewers, and editors but also from

  11. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical
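
    A minimal sketch of a probabilistic, hierarchical sampling design in the spirit described above (not the IMBCR design itself): strata are divided into grid cells, sample cells are drawn at random within each stratum, and point-count stations are placed within each selected cell; stratum sizes and sample sizes are invented.

        # Hedged sketch of a two-stage probabilistic sampling design.
        import numpy as np

        rng = np.random.default_rng(17)

        strata = {                     # stratum -> number of 1-km2 grid cells (assumed)
            "grassland": 4000,
            "sagebrush": 2500,
            "riparian":   500,
        }
        cells_per_stratum = 25         # primary sampling units per stratum (assumed)
        points_per_cell   = 16         # point-count stations per selected cell (assumed)

        design = {}
        for stratum, n_cells in strata.items():
            cells  = rng.choice(n_cells, size=cells_per_stratum, replace=False)
            # uniform-random station coordinates (x, y in km) inside each 1-km cell
            points = rng.uniform(0.0, 1.0, size=(cells_per_stratum, points_per_cell, 2))
            design[stratum] = {"cells": cells, "points": points}
            print(f"{stratum:>10}: {cells_per_stratum} of {n_cells} cells sampled "
                  f"(cell inclusion probability {cells_per_stratum / n_cells:.3f})")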

  12. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Directory of Open Access Journals (Sweden)

    David C Pavlacky

    Full Text Available Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous

  13. Undergraduate Medical Academic Performance is Improved by Scientific Training

    Science.gov (United States)

    Zhang, Lili; Zhang, Wei; Wu, Chong; Liu, Zhongming; Cai, Yunfei; Cao, Xingguo; He, Yushan; Liu, Guoxiang; Miao, Hongming

    2017-01-01

    The effect of scientific training on course learning in undergraduates is still controversial. In this study, we investigated the academic performance of undergraduate students with and without scientific training. The results show that scientific training improves students' test scores in general medical courses, such as biochemistry and…

  14. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework

  15. Scientific collaboratories in higher education

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.; Li, Bin

    2003-01-01

    Scientific collaboratories hold the promise of providing students access to specialized scientific instruments, data and experts, enabling learning opportunities perhaps otherwise not available. However, evaluation of scientific collaboratories in higher education has lagged behind...

  16. Making better scientific figures

    Science.gov (United States)

    Hawkins, Ed; McNeall, Doug

    2016-04-01

    In the words of the UK government chief scientific adviser "Science is not finished until it's communicated" (Walport 2013). The tools to produce good visual communication have never been so easily accessible to scientists as at the present. Correspondingly, it has never been easier to produce and disseminate poor graphics. In this presentation, we highlight some good practice and offer some practical advice in preparing scientific figures for presentation to peers or to the public. We identify common mistakes in visualisation, including some made by the authors, and offer some good reasons not to trust defaults in graphics software. In particular, we discuss the use of colour scales and share our experiences in running a social media campaign (http://tiny.cc/endrainbow) to replace the "rainbow" (also "jet", or "spectral") colour scale as the default in (climate) scientific visualisation.
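
    A small matplotlib example in the spirit of the advice above: the same synthetic field rendered with the perceptually uniform 'viridis' map and with the 'jet' rainbow map that the campaign argues against.

        # Hedged sketch: compare a perceptually uniform colormap with a rainbow one.
        import numpy as np
        import matplotlib.pyplot as plt

        x, y  = np.meshgrid(np.linspace(-3, 3, 300), np.linspace(-3, 3, 300))
        field = np.exp(-(x**2 + y**2) / 2.0) + 0.3 * np.sin(3.0 * x)  # synthetic data

        fig, axes = plt.subplots(1, 2, figsize=(9, 4))
        for ax, cmap in zip(axes, ("viridis", "jet")):
            im = ax.pcolormesh(x, y, field, cmap=cmap, shading="auto")
            ax.set_title(f"cmap = '{cmap}'")
            fig.colorbar(im, ax=ax)
        fig.suptitle("Perceptually uniform vs. rainbow colour scale")
        plt.show()                       # or fig.savefig("colormaps.png")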

  17. Plagiarism in scientific publishing.

    Science.gov (United States)

    Masic, Izet

    2012-12-01

    Scientific publishing is the ultimate product of a scientist's work. The number of publications and their citations are measures of a scientist's success, while unpublished research is invisible to the scientific community and, as such, nonexistent. Researchers rely on their predecessors, and the extent to which one scientist's work is used as a source by other authors is the verification of its contribution to the growth of human knowledge. If an author has published an article in a scientific journal, the article cannot be published in any other journal with only a few minor adjustments, or without quoting the parts of the first article that are used in the second. Copyright infringement occurs when the author of a new article, with or without mentioning the original author, uses substantial portions of previously published articles, including tables and figures. Scientific institutions and universities should, in accordance with the principles of Good Scientific Practice (GSP) and Good Laboratory Practices (GLP), have a center for monitoring, security, promotion, and development of research quality. Establishing rules of good scientific practice and ensuring compliance with them are the obligations of every research institution, university, and individual researcher, regardless of the area of science investigated. In this way, internal quality control ensures that a research institution such as a university assumes responsibility for creating an environment that promotes standards of excellence, intellectual honesty, and legality. Although truth should be the aim of scientific research, it is not the guiding principle for all scientists. The best way to reach the truth in a study and to avoid methodological and ethical mistakes is to consistently apply scientific methods and ethical standards in research. Although variously defined, plagiarism is basically intended to deceive the reader about one's own scientific contribution. There is no general regulation of control of

  18. PLAGIARISM IN SCIENTIFIC PUBLISHING

    Science.gov (United States)

    Masic, Izet

    2012-01-01

    Scientific publishing is the ultimate product of a scientist's work. The number of publications and their citations are measures of a scientist's success, while unpublished research is invisible to the scientific community and, as such, nonexistent. Researchers rely on their predecessors, and the extent to which one scientist's work is used as a source by other authors is the verification of its contribution to the growth of human knowledge. If an author has published an article in a scientific journal, the article cannot be published in any other journal with only a few minor adjustments, or without quoting the parts of the first article that are used in the second. Copyright infringement occurs when the author of a new article, with or without mentioning the original author, uses substantial portions of previously published articles, including tables and figures. Scientific institutions and universities should, in accordance with the principles of Good Scientific Practice (GSP) and Good Laboratory Practices (GLP), have a center for monitoring, security, promotion, and development of research quality. Establishing rules of good scientific practice and ensuring compliance with them are the obligations of every research institution, university, and individual researcher, regardless of the area of science investigated. In this way, internal quality control ensures that a research institution such as a university assumes responsibility for creating an environment that promotes standards of excellence, intellectual honesty, and legality. Although truth should be the aim of scientific research, it is not the guiding principle for all scientists. The best way to reach the truth in a study and to avoid methodological and ethical mistakes is to consistently apply scientific methods and ethical standards in research. Although variously defined, plagiarism is basically intended to deceive the reader about one's own scientific contribution. There is no general regulation of control of

  19. NASA's Scientific Visualization Studio

    Science.gov (United States)

    Mitchell, Horace G.

    2003-01-01

    Since 1988, the Scientific Visualization Studio (SVS) at NASA Goddard Space Flight Center has produced scientific visualizations of NASA's scientific research and remote sensing data for public outreach. These visualizations take the form of images, animations, and end-to-end systems and have been used in many venues: from the network news to science programs such as NOVA, from museum exhibits at the Smithsonian to White House briefings. This presentation will give an overview of the major activities and accomplishments of the SVS, and some of the most interesting projects and systems developed at the SVS will be described. Particular emphasis will be given to the practices and procedures by which the SVS creates visualizations, from the hardware and software used to the structures and collaborations by which products are designed, developed, and delivered to customers. The web-based archival and delivery system for SVS visualizations at svs.gsfc.nasa.gov will also be described.

  20. Effects of Pre and Post-Rigor Marinade Injection on Some Quality Parameters of Longissimus Dorsi Muscles

    Science.gov (United States)

    Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem

    2018-01-01

    Abstract This study was conducted to evaluate the effects of pre- and post-rigor marinade injection on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared: 2% NaCl, 2% NaCl + 0.5 M lactic acid, and 2% NaCl + 0.5 M sodium lactate. Marinade uptake, pH, free water, cooking loss, drip loss, and color properties were analyzed. Injection time had a significant effect on marinade uptake: regardless of marinade formulation, marinade uptake of pre-rigor samples was higher than that of post-rigor samples. Injection of sodium lactate increased sample pH, whereas lactic acid injection decreased it. Marinade treatment and storage period had a significant effect on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time had a different effect on free-water content. Storage period and marinade application had a significant effect on drip loss, and drip loss in all samples increased during storage. On all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fading in both pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, with no differences among the other samples; at day 6, no significant differences were found in the CIE b* values of any samples. PMID:29805282

  1. Recording Scientific Knowledge

    International Nuclear Information System (INIS)

    Bowker, Geof

    2006-01-01

    The way we record knowledge, and the web of technical, formal, and social practices that surrounds it, inevitably affects the knowledge that we record. The ways we hold knowledge about the past - in handwritten manuscripts, in printed books, in file folders, in databases - shape the kind of stories we tell about that past. In this talk, I look at how over the past two hundred years, information technology has affected the nature and production of scientific knowledge. Further, I explore ways in which the emergent new cyberinfrastructure is changing our relationship to scientific practice.

  2. Usability in Scientific Databases

    Directory of Open Access Journals (Sweden)

    Ana-Maria Suduc

    2012-07-01

    Full Text Available Usability, most often defined as the ease of use and acceptability of a system, affects users' performance and their job satisfaction when working with a machine. Therefore, usability is a very important aspect which must be considered in the process of system development. The paper presents numerical data on the history of scientific research into the usability of information systems, as reflected in three important scientific databases, Science Direct, ACM Digital Library, and IEEE Xplore Digital Library, for different queries related to this field.

  3. The human repeated insult patch test in the 21st century: a commentary.

    Science.gov (United States)

    Basketter, David A

    2009-01-01

    The human repeated insult patch test (HRIPT) is over half a century old, but is still used in several countries as a confirmatory test in the safety evaluation of skin sensitizers. This is despite the criticism it receives from an ethical perspective and regarding the scientific validity of such testing. In this commentary, the HRIPT is reviewed, with emphasis on ethical aspects and where the test can, and cannot, contribute in a scientifically meaningful manner to safety evaluation. It is concluded that where there is a specific rationale for testing, for example, to substantiate a no-effect level for a sensitizing chemical or to ensure that matrix effects are not making an unexpected contribution to sensitizing potency, then rigorous independent review may confirm that an HRIPT is ethical and scientifically justifiable. The possibility that sensitization may be induced in volunteers dictates that HRIPTs should be conducted rarely and in cases where the benefits overwhelmingly outweigh the risk. However, for the very large majority of HRIPTs conducted concerning the risk of skin sensitization, there is neither scientific justification nor any other merit.

  4. How do we determine the impact of e-cigarettes on cigarette smoking cessation or reduction? Review and recommendations for answering the research question with scientific rigor.

    Science.gov (United States)

    Villanti, Andrea C; Feirman, Shari P; Niaura, Raymond S; Pearson, Jennifer L; Glasser, Allison M; Collins, Lauren K; Abrams, David B

    2018-03-01

    To propose a hierarchy of methodological criteria to consider when determining whether a study provides sufficient information to answer the question of whether e-cigarettes can facilitate cigarette smoking cessation or reduction. A PubMed search to 1 February 2017 was conducted of all studies related to e-cigarettes and smoking cessation or reduction. Australia, Europe, Iran, Korea, New Zealand and the United States. 91 articles. Coders organized studies according to six proposed methodological criteria: (1) examines outcome of interest (cigarette abstinence or reduction), (2) assesses e-cigarette use for cessation as exposure of interest, (3) employs appropriate control/comparison groups, (4) ensures that measurement of exposure precedes the outcome, (5) evaluates dose and duration of the exposure and (6) evaluates the type and quality of the e-cigarette used. Twenty-four papers did not examine the outcomes of interest. Forty did not assess the specific reason for e-cigarette use as an exposure of interest. Twenty papers did not employ prospective study designs with appropriate comparison groups. The few observational studies meeting some of the criteria (duration, type, use for cessation) triangulated with findings from three randomized trials to suggest that e-cigarettes can help adult smokers quit or reduce cigarette smoking. Only a small proportion of studies seeking to address the effect of e-cigarettes on smoking cessation or reduction meet a set of proposed quality standards. Those that do are consistent with randomized controlled trial evidence in suggesting that e-cigarettes can help with smoking cessation or reduction. © 2017 Society for the Study of Addiction.

  5. Scientific annual report 1972

    International Nuclear Information System (INIS)

    This is a report on scientific research at DESY in 1972. Activities in the fields of electron-nucleon scattering, photoproduction, and synchrotron radiation receive special mention. Work on the double storage ring and on the extension to the synchrotron is also reported. (WL/LN) [de

  6. Funding scientific open access

    International Nuclear Information System (INIS)

    Canessa, E.; Fonda, C.; Zennaro, M.

    2006-11-01

    In order to reduce the knowledge divide, more Open Access Journals (OAJ) that exercise peer review or editorial quality control are needed in all languages and scholarly subject areas. To finance the necessary costs, the paper discusses why and how to sell targeted advertisements by associating ads with given scientific keywords. (author)

  7. Scientific Report 2007

    International Nuclear Information System (INIS)

    2009-09-01

    This annual scientific report gives an concise overview of research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2007. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research

  8. Report of scientific results

    International Nuclear Information System (INIS)

    1978-01-01

    The findings of R+D activities of the HMI radiation chemistry department in the fields of pulsed radiolysis, reaction kinematics, insulators and plastics are presented as well as the scientific publications and lectures of HMI staff and visitors including theoretical contributions, theses and dissertations, and conference papers. (HK) [de

  9. Scientific Report 2001

    International Nuclear Information System (INIS)

    2002-04-01

    The annual scientific report gives an overview of the R and D activities at the Belgian Nuclear Research Centre SCK-CEN in 2001. The report discusses progress and main achievements in four principal areas: Radiation Protection, Radioactive Waste and Clean-up, Reactor Safety and the BR2 Reactor

  10. Scientific Report 2005

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-04-15

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2005. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research.

  11. Dorky Poll Scientific Fears

    CERN Multimedia

    2008-01-01

    The questions posed in yesterday's posts about hopes for 2008 were half of what we were asked by the Powers That Be. The other half: What scientific development do you fear you'll be blogging or reading about in 2008?

  12. Scientific Report 2004

    International Nuclear Information System (INIS)

    2005-04-01

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2004. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research

  13. Scientific Report 2004

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-04-01

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2004. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research.

  14. Is risk analysis scientific?

    Science.gov (United States)

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.

  15. Scientific Medical Journal

    African Journals Online (AJOL)

    Scientific Medical Journal: an official journal of Egyptian Medical Education provides a forum for the dissemination of knowledge and the exchange of ideas, information, and experience among workers, investigators, and clinicians in all disciplines of medicine, with emphasis on treatment and prevention.

  16. Scientific Report 2001

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-04-01

    The annual scientific report gives an overview of the R and D activities at the Belgian Nuclear Research Centre SCK-CEN in 2001. The report discusses progress and main achievements in four principal areas: Radiation Protection, Radioactive Waste and Clean-up, Reactor Safety and the BR2 Reactor.

  17. Assessing Scientific Performance.

    Science.gov (United States)

    Weiner, John M.; And Others

    1984-01-01

    A method for assessing scientific performance based on relationships displayed numerically in published documents is proposed and illustrated using published documents in pediatric oncology for the period 1979-1982. Contributions of a major clinical investigations group, the Childrens Cancer Study Group, are analyzed. Twenty-nine references are…

  18. Scientific Report 2006

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2006. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research.

  19. Scientific Report 2006

    International Nuclear Information System (INIS)

    2007-09-01

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2006. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research

  20. Scientific Report 2003

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-01-01

    The annual scientific report gives an overview of the R and D activities at the Belgian Nuclear Research Centre SCK-CEN in 2003. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge, and fusion research.

  1. 3 CFR - Scientific Integrity

    Science.gov (United States)

    2010-01-01

    ... information in policymaking. The selection of scientists and technology professionals for positions in the... Administration on a wide range of issues, including improvement of public health, protection of the environment... technological findings and conclusions. If scientific and technological information is developed and used by the...

  2. Scientific annual report 1973

    International Nuclear Information System (INIS)

    A report is given on the scientific research at DESY in 1973, which included the first storage of electrons in the double storage ring DORIS. The two large spectrometers PLUTO and DASP are also mentioned, and experiments relating to elementary particles and synchrotron radiation, as well as improvements to the equipment, are described. (WL/AK) [de

  3. Scientific Report 2005

    International Nuclear Information System (INIS)

    2006-04-01

    The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2005. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research

  4. Scientific Report 2003

    International Nuclear Information System (INIS)

    2004-01-01

    The annual scientific report gives an overview of the R and D activities at the Belgian Nuclear Research Centre SCK-CEN in 2003. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge, and fusion research

  5. Mario Bunge's Scientific Realism

    Science.gov (United States)

    Cordero, Alberto

    2012-01-01

    This paper presents and comments on Mario Bunge's scientific realism. After a brief introduction in Sect. 1, Sect. 2 outlines Bunge's conception of realism. Focusing on the case of quantum mechanics, Sect. 3 explores how his approach plays out for problematic theories. Section 4 comments on Bunge's project against the background of the current…

  6. Toward executable scientific publications

    NARCIS (Netherlands)

    Strijkers, R.J.; Cushing, R.; Vasyunin, D.; Laat, C. de; Belloum, A.S.Z.; Meijer, R.J.

    2011-01-01

    Reproducibility of experiments is considered as one of the main principles of the scientific method. Recent developments in data and computation intensive science, i.e. e-Science, and state of the art in Cloud computing provide the necessary components to preserve data sets and re-run code and

  7. 2003 Scientific Technological Report

    International Nuclear Information System (INIS)

    Prado Cuba, A.; Gayoso Caballero, C.; Robles Nique, A.; Olivera Lescano, P.

    2004-08-01

    This annual scientific-technological report provides an overview of research and development activities at Peruvian Institute of Nuclear Energy (IPEN) during the period from 1 january to 31 december, 2003. This report includes 54 papers divided in 9 subject matters: physics and nuclear chemistry, nuclear engineering, materials science, radiochemistry, industrial applications, medical applications, environmental applications, protection and radiological safety, and management aspects

  8. Scientific Tourism in Armenia

    Science.gov (United States)

    Tashchyan, Davit

    2016-12-01

    Scientific tourism is a relatively new direction in the world, yet it has already managed to gain great popularity. It arose in the 1980s, but its ideological basis goes back to the earliest periods of human history. In Armenia it is a completely new phenomenon and still not well understood by many people. At the global level, scientific tourism has several definitions: for example, Mrs. Pichelerova, a member of the scientific tourist centre of Zlovlen, explains that "the essence of scientific tourism is based on the provision of the educational, cultural and entertainment needs of a group of people who are interested in the same thing", which in our opinion is a very comprehensive and discreet definition. We also have our own views on this type of tourism. Our philosophy is that, while keeping to these general principles, we put the emphasis on strengthening the ties between science and the individual. Our main emphasis is on scientific-experimental tourism, but this does not mean that we do not take steps toward other forms of tourism. Studying the global experience and combining it with our resources, we are trying to build a new interdisciplinary field, which will bring together a number of different professionals as well as individuals, and as a result will produce new knowledge. It is in this way that an astronomer will become an archaeologist, an archaeologist will become an astrophysicist, etc. Speaking of interdisciplinary sciences, it is worth mentioning that in recent years their role has been considered ever more important at the global level. In these terms, tourism is an excellent platform for the creation of interdisciplinary sciences and, therefore, for the preparation of corresponding scholars. Scientific tourism is also very important for the revelation, appreciation and promotion of the country's historical-cultural heritage and scientific potential. Let us not forget either that tourism in all its…

  9. Layout optimization of DRAM cells using rigorous simulation model for NTD

    Science.gov (United States)

    Jeon, Jinhyuck; Kim, Shinyoung; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Kuechler, Bernd; Zimmermann, Rainer; Muelders, Thomas; Klostermann, Ulrich; Schmoeller, Thomas; Do, Mun-hoe; Choi, Jung-Hoe

    2014-03-01

    …scanning electron microscope (SEM) measurements. The high resist impact and the difficulty of model data acquisition demand a simulation model that is capable of extrapolating reliably beyond its calibration dataset. We use rigorous simulation models to provide that predictive performance. We have discussed the need for a rigorous mask optimization process for DRAM contact cell layouts, yielding mask layouts that are optimal in process performance, mask manufacturability and accuracy. In this paper, we have shown the step-by-step process from analytical illumination source derivation and NTD- and application-tailored model calibration to layout optimization such as OPC and SRAF placement. Finally, the work has been verified with simulation and experimental results on wafer.

  10. Turning Scientific Presentations into Stories

    Science.gov (United States)

    Aruffo, Christopher

    2015-01-01

    To increase students' confidence in giving scientific presentations, students were shown how to present scientific findings as a narrative story. Students who were preparing to give a scientific talk attended a workshop in which they were encouraged to experience the similarities between telling a personal anecdote and presenting scientific data.…

  11. A CUMULATIVE MIGRATION METHOD FOR COMPUTING RIGOROUS TRANSPORT CROSS SECTIONS AND DIFFUSION COEFFICIENTS FOR LWR LATTICES WITH MONTE CARLO

    Energy Technology Data Exchange (ETDEWEB)

    Zhaoyuan Liu; Kord Smith; Benoit Forget; Javier Ortensi

    2016-05-01

    A new method for computing homogenized assembly neutron transport cross sections and diffusion coefficients that is both rigorous and computationally efficient is proposed in this paper. In the limit of a homogeneous hydrogen slab, the new method is equivalent to the long-used, and only-recently-published CASMO transport method. The rigorous method is used to demonstrate the sources of inaccuracy in the commonly applied “out-scatter” transport correction. It is also demonstrated that the newly developed method is directly applicable to lattice calculations performed by Monte Carlo and is capable of computing rigorous homogenized transport cross sections for arbitrarily heterogeneous lattices. Comparisons of several common transport cross section approximations are presented for a simple problem of infinite medium hydrogen. The new method has also been applied in computing 2-group diffusion data for an actual PWR lattice from the BEAVRS benchmark.
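
    A minimal illustration of the physics the method exploits, stated here only in the familiar one-group picture (the multigroup, cumulative definitions actually used by CMM are those given in the paper and are not reproduced here):

```latex
% One-group sketch, assuming the standard definition of the migration area.
% r = crow-flight distance from a neutron's birth site to its absorption site.
M^2 = \frac{\langle r^2 \rangle}{6}, \qquad
M^2 = \frac{D}{\Sigma_a}
\;\;\Longrightarrow\;\;
D = \frac{\Sigma_a \, \langle r^2 \rangle}{6}
% i.e. the diffusion coefficient can be recovered from Monte Carlo tallies
% of mean-squared birth-to-absorption distances.
```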

  12. From scientifically based research to evidence based learning

    Directory of Open Access Journals (Sweden)

    Rosa Cera

    2016-02-01

    Full Text Available This essay is a reflection on the peculiarities of scientifically based research and on the distinctive elements of the EBL (evidence based learning) methodology used in the study on the “Relationship between Metacognition, Self-efficacy and Self-regulation in Learning”. The EBL method, based on the standardization of data, shows how the students’ learning experience can be considered as a set of “data” and can be used to explain how and when research results can be considered generalizable and transferable to other learning situations. The reflections presented in this study have also allowed us to illustrate the impact that its results have had at the micro and macro levels of reality. They helped to fill in the gaps concerning the learning/teaching processes, contributed to the enrichment of the scientific literature on this subject and made it possible to establish standards through rigorous techniques such as systematic reviews and meta-analysis.

  13. Bridging Ayurveda with evidence-based scientific approaches in medicine.

    Science.gov (United States)

    Patwardhan, Bhushan

    2014-01-01

    This article reviews contemporary approaches for bridging Ayurveda with evidence-based medicine. In doing so, the author presents a pragmatic assessment of quality, methodology and extent of scientific research in Ayurvedic medicine. The article discusses the meaning of evidence and indicates the need to adopt epistemologically sensitive methods and rigorous experimentation using modern science. The author critically analyzes the status of Ayurvedic medicine based on personal observations, peer interactions and published research. This review article concludes that traditional knowledge systems like Ayurveda and modern scientific evidence-based medicine should be integrated. The author advocates that Ayurvedic researchers should develop strategic collaborations with innovative initiatives like 'Horizon 2020' involving predictive, preventive and personalized medicine (PPPM).

  14. Phase I Final Scientific Report

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Xijia [National Energy Technology Lab. (NETL), Albany, OR (United States); Fetvedt, Jeremy [National Energy Technology Lab. (NETL), Albany, OR (United States); Dimmig, Walker [National Energy Technology Lab. (NETL), Albany, OR (United States)

    2017-10-15

    This Final Scientific Report addresses the accomplishments achieved during Phase I of DE-FE0023985, Coal Syngas Combustor Development for Supercritical CO2 Power Cycles. The primary objective of the project was to develop a coal syngas-fueled combustor design for use with high-pressure, high-temperature, oxy-fuel, supercritical CO2 power cycles, with particular focus given to the conditions required by the Allam Cycle. The primary goals, from the Statement of Project Objectives, were to develop: (1) a conceptual design of a syngas-fueled combustor-turbine block for a 300 MWe high-pressure, oxy-fuel, sCO2 power plant; (2) the preliminary design of a 5 MWt test combustor; and (3) the definition of a combustor test program. Accomplishments for each of these goals are discussed in this report.

  15. Rigorous classification and carbon accounting principles for low and Zero Carbon Cities

    International Nuclear Information System (INIS)

    Kennedy, Scott; Sgouridis, Sgouris

    2011-01-01

    A large number of communities, new developments, and regions aim to lower their carbon footprint and aspire to become 'zero carbon' or 'Carbon Neutral.' Yet there are neither clear definitions for the scope of emissions that such a label would address on an urban scale, nor is there a process for qualifying the carbon reduction claims. This paper addresses the question of how to define a Zero Carbon, Low Carbon, or Carbon Neutral urban development by proposing hierarchical emissions categories with three levels: internal emissions based on the geographical boundary, external emissions directly caused by core municipal activities, and internal or external emissions due to non-core activities. Each level implies a different carbon management strategy (eliminating, balancing, and minimizing, respectively) needed to meet a Net Zero Carbon designation. The trade-offs, implications, and difficulties of implementing carbon debt accounting based upon these definitions are further analyzed. - Highlights: → A gap exists in comprehensive and standardized accounting methods for urban carbon emissions. → We propose a comprehensive and rigorous City Framework for Carbon Accounting (CiFCA). → CiFCA classifies emissions hierarchically with corresponding carbon management strategies. → Adoption of CiFCA allows for meaningful comparisons of claimed performance of eco-cities.
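
    Purely as an illustration of the hierarchy described above, the sketch below encodes the three emission levels and their associated management strategies; all class and field names are invented here for illustration and are not taken from the CiFCA framework itself.

```python
# Illustrative sketch only: hypothetical encoding of the three-level emissions
# hierarchy and the eliminate/balance/minimize strategies described in the abstract.
from dataclasses import dataclass
from enum import Enum


class EmissionLevel(Enum):
    INTERNAL = "internal, within the geographical boundary"
    EXTERNAL_CORE = "external, directly caused by core municipal activities"
    NON_CORE = "internal or external, due to non-core activities"


STRATEGY = {
    EmissionLevel.INTERNAL: "eliminate",
    EmissionLevel.EXTERNAL_CORE: "balance",
    EmissionLevel.NON_CORE: "minimize",
}


@dataclass
class EmissionEntry:
    source: str
    level: EmissionLevel
    tonnes_co2e: float  # net of any offsets already applied


def totals_by_level(entries):
    """Sum net emissions per hierarchy level (toy bookkeeping, not CiFCA rules)."""
    out = {level: 0.0 for level in EmissionLevel}
    for e in entries:
        out[e.level] += e.tonnes_co2e
    return out
```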

  16. Revisiting the constant growth angle: Estimation and verification via rigorous thermal modeling

    Science.gov (United States)

    Virozub, Alexander; Rasin, Igal G.; Brandon, Simon

    2008-12-01

    Methods for estimating growth angle (θ_gr) values based on the a posteriori analysis of directionally solidified material (e.g. drops) often involve assumptions of negligible gravitational effects as well as a planar solid/liquid interface during solidification. We relax both of these assumptions when using experimental drop shapes from the literature to estimate the relevant growth angles at the initial stages of solidification. Assuming the growth angle to be constant, we use these values as input into a rigorous heat transfer and solidification model of the growth process. This model, which is shown to reproduce the experimental shape of a solidified sessile water drop using the literature value of θ_gr = 0°, yields excellent agreement with experimental profiles using our estimated values for silicon (θ_gr = 10°) and germanium (θ_gr = 14.3°) solidifying on an isotropic crystalline surface. The effect of gravity on the solidified drop shape is found to be significant in the case of germanium, suggesting that gravity should either be included in the analysis or that care should be taken that the relevant Bond number is truly small enough in each measurement. The planar solidification interface assumption is found to be unjustified. Although this issue is important when simulating the inflection point in the profile of the solidified water drop, there are indications that solidified drop shapes (at least in the case of silicon) may be fairly insensitive to the shape of this interface.
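
    For reference, the dimensionless group invoked in the last remark is the Bond (Eötvös) number; the definition below is the standard textbook one, quoted here only as a reminder rather than taken from the paper:

```latex
% Standard definition (assumption: conventional notation).
\mathrm{Bo} = \frac{\rho \, g \, L^{2}}{\sigma}
% \rho: liquid density, g: gravitational acceleration,
% L: characteristic drop size (e.g. contact radius), \sigma: surface tension.
% Gravity can safely be neglected in the drop-shape analysis only when Bo << 1.
```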

  17. A Rigorous Investigation on the Ground State of the Penson-Kolb Model

    Science.gov (United States)

    Yang, Kai-Hua; Tian, Guang-Shan; Han, Ru-Qi

    2003-05-01

    By using either numerical calculations or analytical methods, such as the bosonization technique, the ground state of the Penson-Kolb model has been previously studied by several groups. Some physicists argued that, as far as the existence of superconductivity in this model is concerned, it is canonically equivalent to the negative-U Hubbard model. However, others did not agree. In the present paper, we investigate this model by an independent and rigorous approach. We show that the ground state of the Penson-Kolb model is nondegenerate and has a nonvanishing overlap with the ground state of the negative-U Hubbard model. Furthermore, we also show that the ground states of both models have the same good quantum numbers and may have superconducting long-range order at the same momentum q = 0. Our results support the equivalence between these models. The project was partially supported by the Special Funds for Major State Basic Research Projects (G20000365) and the National Natural Science Foundation of China under Grant No. 10174002.
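
    For orientation only, the two Hamiltonians being compared are usually written as below; these are the standard textbook forms (sign and normalization conventions vary across the literature) and are not quoted from the paper itself:

```latex
% Penson-Kolb model: hopping plus pair-hopping interaction J.
H_{\mathrm{PK}} = -t \sum_{\langle ij \rangle, \sigma}
  \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
  - J \sum_{\langle ij \rangle}
  \left( c^{\dagger}_{i\uparrow} c^{\dagger}_{i\downarrow}
         c_{j\downarrow} c_{j\uparrow} + \mathrm{h.c.} \right)

% Negative-U Hubbard model: hopping plus on-site attraction.
H_{\mathrm{Hub}} = -t \sum_{\langle ij \rangle, \sigma}
  \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
  - |U| \sum_{i} n_{i\uparrow} n_{i\downarrow}
```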

  18. Complexities and Controversies in Himalayan Research: A Call for Collaboration and Rigor for Better Data

    Directory of Open Access Journals (Sweden)

    Surendra P. Singh

    2015-11-01

    Full Text Available The Himalaya range encompasses enormous variation in elevation, precipitation, biodiversity, and patterns of human livelihoods. These mountains modify the regional climate in complex ways; the ecosystem services they provide influence the lives of almost 1 billion people in 8 countries. However, our understanding of these ecosystems remains rudimentary. The 2007 Intergovernmental Panel on Climate Change report that erroneously predicted a date for widespread glacier loss exposed how little was known of Himalayan glaciers. Recent research shows how variably glaciers respond to climate change in different Himalayan regions. Alarmist theories are not new. In the 1980s, the Theory of Himalayan Degradation warned of complete forest loss and devastation of downstream areas, an eventuality that never occurred. More recently, the debate on hydroelectric construction appears driven by passions rather than science. Poor data, hasty conclusions, and bad science plague Himalayan research. Rigorous sampling, involvement of civil society in data collection, and long-term collaborative research involving institutions from across the Himalaya are essential to improve knowledge of this region.

  19. Rigorous Mathematical Thinking Approach to Enhance Students’ Mathematical Creative and Critical Thinking Abilities

    Science.gov (United States)

    Hidayat, D.; Nurlaelah, E.; Dahlan, J. A.

    2017-09-01

    Mathematical creative thinking and critical thinking are two abilities that need to be developed in the learning of mathematics. Therefore, efforts need to be made in the design of learning that is capable of developing both capabilities. The purpose of this research is to examine the mathematical creative and critical thinking abilities of students who received the rigorous mathematical thinking (RMT) approach and students who received an expository approach. This research was a quasi-experiment with a control-group pretest-posttest design. The population comprised all grade-11 students in one of the senior high schools in Bandung. The results showed that the achievement of mathematical creative and critical thinking abilities of students who received RMT was better than that of students who received the expository approach. The use of psychological tools and mediation, with the criteria of intentionality, reciprocity, and mediation of meaning in RMT, helps students develop the conditions for critical and creative processes. This achievement contributes to the development of integrated learning designs for students’ critical and creative thinking processes.

  20. A Rigorous Theory of Many-Body Prethermalization for Periodically Driven and Closed Quantum Systems

    Science.gov (United States)

    Abanin, Dmitry; De Roeck, Wojciech; Ho, Wen Wei; Huveneers, François

    2017-09-01

    Prethermalization refers to the transient phenomenon where a system thermalizes according to a Hamiltonian that is not the generator of its evolution. We provide here a rigorous framework for quantum spin systems where prethermalization is exhibited for very long times. First, we consider quantum spin systems under periodic driving at high frequency $\nu$. We prove that up to a quasi-exponential time $\tau_* \sim e^{c\,\nu/\log^{3}\nu}$, the system barely absorbs energy. Instead, there is an effective local Hamiltonian $\widehat{D}$ that governs the time evolution up to $\tau_*$, and hence this effective Hamiltonian is a conserved quantity up to $\tau_*$. Next, we consider systems without driving, but with a separation of energy scales in the Hamiltonian. A prime example is the Fermi-Hubbard model where the interaction $U$ is much larger than the hopping $J$. Also here we prove the emergence of an effective conserved quantity, different from the Hamiltonian, up to a time $\tau_*$ that is (almost) exponential in $U/J$.

  1. Coupling of Rigor Mortis and Intestinal Necrosis during C. elegans Organismal Death

    Directory of Open Access Journals (Sweden)

    Evgeniy R. Galimov

    2018-03-01

    Full Text Available Organismal death is a process of systemic collapse whose mechanisms are less well understood than those of cell death. We previously reported that death in C. elegans is accompanied by a calcium-propagated wave of intestinal necrosis, marked by a wave of blue autofluorescence (death fluorescence. Here, we describe another feature of organismal death, a wave of body wall muscle contraction, or death contraction (DC. This phenomenon is accompanied by a wave of intramuscular Ca2+ release and, subsequently, of intestinal necrosis. Correlation of directions of the DC and intestinal necrosis waves implies coupling of these death processes. Long-lived insulin/IGF-1-signaling mutants show reduced DC and delayed intestinal necrosis, suggesting possible resistance to organismal death. DC resembles mammalian rigor mortis, a postmortem necrosis-related process in which Ca2+ influx promotes muscle hyper-contraction. In contrast to mammals, DC is an early rather than a late event in C. elegans organismal death.

  2. Inosine-5'-monophosphate is a candidate agent to resolve rigor mortis of skeletal muscle.

    Science.gov (United States)

    Matsuishi, Masanori; Tsuji, Mariko; Yamaguchi, Megumi; Kitamura, Natsumi; Tanaka, Sachi; Nakamura, Yukinobu; Okitani, Akihiro

    2016-11-01

    The object of the present study was to reveal the action of inosine-5'-monophosphate (IMP) toward myofibrils in postmortem muscles. IMP solubilized isolated actomyosin within a narrow range of KCl concentration, 0.19-0.20 mol/L, because of the dissociation of actomyosin into actin and myosin, but it did not solubilize the proteins in myofibrils with 0.2 mol/L KCl. However, IMP could solubilize both proteins in myofibrils with 0.2 mol/L KCl in the presence of 1 mmol/L pyrophosphate or 1.0-3.3 mmol/L adenosine-5'-diphosphate (ADP). Thus, we presumed that pyrophosphate and ADP released thin filaments composed of actin, and thick filaments composed of myosin from restraints of myofibrils, and then both filaments were solubilized through the IMP-induced dissociation of actomyosin. Thus, we concluded that IMP is a candidate agent to resolve rigor mortis because of its ability to break the association between thick and thin filaments. © 2016 Japanese Society of Animal Science.

  3. Alternative pre-rigor foreshank positioning can improve beef shoulder muscle tenderness.

    Science.gov (United States)

    Grayson, A L; Lawrence, T E

    2013-09-01

    Thirty beef carcasses were harvested and the foreshank of each side was independently positioned (cranial, natural, parallel, or caudal) 1 h post-mortem to determine the effect of foreshank angle at rigor mortis on the sarcomere length and tenderness of six beef shoulder muscles. The infraspinatus (IS), pectoralis profundus (PP), serratus ventralis (SV), supraspinatus (SS), teres major (TM) and triceps brachii (TB) were excised 48 h post-mortem for Warner-Bratzler shear force (WBSF) and sarcomere length evaluations. All muscles except the SS had altered (P<0.05) sarcomere lengths between positions; the cranial position resulted in the longest sarcomeres for the SV and TB muscles whilst the natural position had longer sarcomeres for the PP and TM muscles. The SV from the cranial position had lower (P<0.05) shear than the caudal position and TB from the natural position had lower (P<0.05) shear than the parallel or caudal positions. Sarcomere length was moderately correlated (r=-0.63; P<0.01) to shear force. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Coupling of Rigor Mortis and Intestinal Necrosis during C. elegans Organismal Death.

    Science.gov (United States)

    Galimov, Evgeniy R; Pryor, Rosina E; Poole, Sarah E; Benedetto, Alexandre; Pincus, Zachary; Gems, David

    2018-03-06

    Organismal death is a process of systemic collapse whose mechanisms are less well understood than those of cell death. We previously reported that death in C. elegans is accompanied by a calcium-propagated wave of intestinal necrosis, marked by a wave of blue autofluorescence (death fluorescence). Here, we describe another feature of organismal death, a wave of body wall muscle contraction, or death contraction (DC). This phenomenon is accompanied by a wave of intramuscular Ca2+ release and, subsequently, of intestinal necrosis. Correlation of directions of the DC and intestinal necrosis waves implies coupling of these death processes. Long-lived insulin/IGF-1-signaling mutants show reduced DC and delayed intestinal necrosis, suggesting possible resistance to organismal death. DC resembles mammalian rigor mortis, a postmortem necrosis-related process in which Ca2+ influx promotes muscle hyper-contraction. In contrast to mammals, DC is an early rather than a late event in C. elegans organismal death. VIDEO ABSTRACT. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  5. Rigor mortis and the epileptology of Charles Bland Radcliffe (1822-1889).

    Science.gov (United States)

    Eadie, M J

    2007-03-01

    Charles Bland Radcliffe (1822-1889) was one of the physicians who made major contributions to the literature on epilepsy in the mid-19th century, when the modern understanding of the disorder was beginning to emerge, particularly in England. His experimental work was concerned with the electrical properties of frog muscle and nerve. Early in his career he related his experimental findings to the phenomenon of rigor mortis and concluded that, contrary to the general belief of the time, muscle contraction depended on the cessation of nerve input, and muscle relaxation on its presence. He adhered to this counter-intuitive interpretation throughout his life and, based on it, produced an epileptology that was very different from those of his contemporaries and successors. His interpretations were ultimately without any direct influence on the advance of knowledge. However, his idea that withdrawal of an inhibitory process released previously suppressed muscular contractile powers, when applied to the brain rather than the periphery of the nervous system, permitted Hughlings Jackson to explain certain psychological phenomena that accompany or follow some epileptic events. As well, Radcliffe was one of the chief early advocates for potassium bromide, the first effective anticonvulsant.

  6. Improving students’ mathematical critical thinking through rigorous teaching and learning model with informal argument

    Science.gov (United States)

    Hamid, H.

    2018-01-01

    The purpose of this study is to analyze the improvement of students’ mathematical critical thinking (CT) ability in a Real Analysis course using the Rigorous Teaching and Learning (RTL) model with informal argument. In addition, this research also attempted to understand students’ CT with respect to their initial mathematical ability (IMA). This study was conducted at a private university in the academic year 2015/2016. The study employed the quasi-experimental method with a pretest-posttest control group design. The participants of the study were 83 students, of whom 43 were in the experimental group and 40 in the control group. The findings showed that students in the experimental group outperformed students in the control group on mathematical CT ability at each IMA level (high, medium, low) in learning Real Analysis. In addition, for medium IMA, the improvement in the mathematical CT ability of students exposed to the RTL model with informal argument was greater than that of students exposed to conventional instruction (CI). There was no interaction effect between learning model (RTL vs. CI) and IMA level (high, medium, low) on the improvement of mathematical CT ability. Finally, across IMA levels there was a significant improvement in the achievement of all indicators of mathematical CT ability for students exposed to the RTL model with informal argument compared with students exposed to CI.

  7. Control group design: enhancing rigor in research of mind-body therapies for depression.

    Science.gov (United States)

    Kinser, Patricia Anne; Robins, Jo Lynne

    2013-01-01

    Although a growing body of research suggests that mind-body therapies may be appropriate to integrate into the treatment of depression, studies consistently lack methodological sophistication particularly in the area of control groups. In order to better understand the relationship between control group selection and methodological rigor, we provide a brief review of the literature on control group design in yoga and tai chi studies for depression, and we discuss challenges we have faced in the design of control groups for our recent clinical trials of these mind-body complementary therapies for women with depression. To address the multiple challenges of research about mind-body therapies, we suggest that researchers should consider 4 key questions: whether the study design matches the research question; whether the control group addresses performance, expectation, and detection bias; whether the control group is ethical, feasible, and attractive; and whether the control group is designed to adequately control for nonspecific intervention effects. Based on these questions, we provide specific recommendations about control group design with the goal of minimizing bias and maximizing validity in future research.

  8. Methodological Challenges in Sustainability Science: A Call for Method Plurality, Procedural Rigor and Longitudinal Research

    Directory of Open Access Journals (Sweden)

    Henrik von Wehrden

    2017-02-01

    Full Text Available Sustainability science encompasses a unique field that is defined through its purpose, the problem it addresses, and its solution-oriented agenda. However, this orientation creates significant methodological challenges. In this discussion paper, we conceptualize sustainability problems as wicked problems to tease out the key challenges that sustainability science is facing if scientists intend to deliver on its solution-oriented agenda. Building on the available literature, we discuss three aspects that demand increased attention for advancing sustainability science: (1) methods with higher diversity and complementarity are needed to increase the chance of deriving solutions to the unique aspects of wicked problems; for instance, mixed methods approaches are potentially better suited to allow for an approximation of solutions, since they cover wider arrays of knowledge; (2) methodologies capable of dealing with wicked problems demand strict procedural and ethical guidelines, in order to ensure their integration potential; for example, learning from solution implementation in different contexts requires increased comparability between research approaches while carefully addressing issues of legitimacy and credibility; and (3) approaches are needed that allow for longitudinal research, since wicked problems are continuous and solutions can only be diagnosed in retrospect; for example, complex dynamics of wicked problems play out across temporal patterns that are not necessarily aligned with the common timeframe of participatory sustainability research. Taken together, we call for plurality in methodologies, emphasizing procedural rigor and the necessity of continuous research to effectively address wicked problems as well as methodological challenges in sustainability science.

  9. A TRADITIONAL FALSE PROBLEM: THE RIGORISM OF KANTIAN MORAL AND POLITICAL PHILOSOPHY. THE CASE OF VERACITY

    Directory of Open Access Journals (Sweden)

    MIHAI NOVAC

    2012-05-01

    Full Text Available According to many of its traditional critics, the main weakness of Kantian moral-political philosophy resides in its impossibility of admitting exceptions. In nuce, all these critical positions have converged, despite their reciprocal heterogeneity, in the so-called accusation of moral rigorism (unjustly, I would say) directed against Kant’s moral and political perspective. As such, I will seek to defend Kant against this type of criticism by showing that any perspective attempting to evaluate Kant’s ethics on the grounds of its capacity or incapacity to admit exceptions is a priori doomed to lack of sense, in its two logical alternatives, i.e. either as nonsense (predicating about empty notions) or as tautology (formulating ad hoc definitions and criteria with respect to Kant’s system and then claiming that it does not hold with respect to them). Essentially, I will try to show that Kantian ethics can organically immunize itself epistemologically against any such so-called antirigorist criticism.

  10. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.

    1992-01-01

    This paper reports that the intent of this study was to rigorously look at all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2500) to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in time and manpower required to complete the study were also realized.
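
    As a toy illustration of the kind of expected-value roll-back such a decision-tree analysis performs (the alternatives, probabilities and values below are invented for illustration, not taken from the Rangely study):

```python
# Illustrative sketch only: expected-monetary-value (EMV) roll-back over a toy
# decision tree. All scenario names, probabilities and NPV figures are invented.

def emv(outcomes):
    """Expected value of a chance node given (probability, value) pairs."""
    return sum(p * v for p, v in outcomes)

# Hypothetical decision: expand CO2 injection now vs. defer one year ($MM NPV).
alternatives = {
    "expand_now":     emv([(0.3, 120.0), (0.5, 60.0), (0.2, -40.0)]),
    "defer_one_year": emv([(0.3,  90.0), (0.5, 55.0), (0.2, -10.0)]),
}

best = max(alternatives, key=alternatives.get)
print("EMV by alternative:", alternatives)
print("Choose:", best)
```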

  11. Rigorous numerical study of strong microwave photon-magnon coupling in all-dielectric magnetic multilayers

    Energy Technology Data Exchange (ETDEWEB)

    Maksymov, Ivan S., E-mail: ivan.maksymov@uwa.edu.au [School of Physics M013, The University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia); ARC Centre of Excellence for Nanoscale BioPhotonics, School of Applied Sciences, RMIT University, Melbourne, VIC 3001 (Australia); Hutomo, Jessica; Nam, Donghee; Kostylev, Mikhail [School of Physics M013, The University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia)

    2015-05-21

    We demonstrate theoretically a ∼350-fold local enhancement of the intensity of the in-plane microwave magnetic field in multilayered structures made from a magneto-insulating yttrium iron garnet (YIG) layer sandwiched between two non-magnetic layers with a high dielectric constant matching that of YIG. The enhancement is predicted for the excitation regime when the microwave magnetic field is induced inside the multilayer by the transducer of a stripline Broadband Ferromagnetic Resonance (BFMR) setup. By means of a rigorous numerical solution of the Landau-Lifshitz-Gilbert equation consistently with Maxwell's equations, we investigate the magnetisation dynamics in the multilayer. We reveal a strong photon-magnon coupling, which manifests itself as anti-crossing of the ferromagnetic resonance magnon mode supported by the YIG layer and the electromagnetic resonance mode supported by the whole multilayered structure. The frequency of the magnon mode depends on the external static magnetic field, which in our case is applied tangentially to the multilayer in the direction perpendicular to the microwave magnetic field induced by the stripline of the BFMR setup. The frequency of the electromagnetic mode is independent of the static magnetic field. Consequently, the predicted photon-magnon coupling is sensitive to the applied magnetic field and thus can be used in magnetically tuneable metamaterials based on simultaneously negative permittivity and permeability achievable thanks to the YIG layer. We also suggest that the predicted photon-magnon coupling may find applications in microwave quantum information systems.
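
    For context, the equation of motion referred to is usually written in the Gilbert form shown below; this is the standard expression (sign conventions differ between authors) rather than anything specific to this paper:

```latex
% Landau-Lifshitz-Gilbert equation, standard Gilbert form.
\frac{\partial \mathbf{M}}{\partial t}
  = -\gamma \, \mathbf{M} \times \mathbf{H}_{\mathrm{eff}}
    + \frac{\alpha}{M_s} \, \mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t}
% \gamma: gyromagnetic ratio, \alpha: Gilbert damping, M_s: saturation magnetisation;
% in the coupled problem, H_eff includes the microwave field obtained
% self-consistently from Maxwell's equations.
```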

  12. Rigorous numerical modeling of scattering-type scanning near-field optical microscopy and spectroscopy

    Science.gov (United States)

    Chen, Xinzhong; Lo, Chiu Fan Bowen; Zheng, William; Hu, Hai; Dai, Qing; Liu, Mengkun

    2017-11-01

    Over the last decade, scattering-type scanning near-field optical microscopy and spectroscopy have been widely used in nano-photonics and material research due to their fine spatial resolution and broad spectral range. A number of simplified analytical models have been proposed to quantitatively understand the tip-scattered near-field signal. However, a rigorous interpretation of the experimental results is still lacking at this stage. Numerical modeling, on the other hand, is mostly done by simulating the local electric field slightly above the sample surface, which only qualitatively represents the near-field signal rendered by the tip-sample interaction. In this work, we performed a more comprehensive numerical simulation which is based on realistic experimental parameters and signal extraction procedures. By directly comparing to the experiments as well as other simulation efforts, our methods offer a more accurate quantitative description of the near-field signal, paving the way for future studies of complex systems at the nanoscale.
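
    One signal-extraction step commonly mimicked in such simulations is higher-harmonic demodulation of the tip-scattered signal at the tapping frequency. The sketch below is a generic, self-contained illustration of that step with a toy near-field response; it is not the rigorous model or the extraction procedure used by the authors.

```python
# Illustrative sketch only: lock-in style demodulation of a simulated scattering
# signal at harmonics of the tip tapping frequency. The near-field response used
# here is a toy placeholder, not a rigorous tip-sample model.
import numpy as np


def demodulate(signal, t, omega, n):
    """Amplitude of the n-th harmonic of `signal` at tapping frequency `omega`."""
    ref = np.exp(-1j * n * omega * t)
    return 2.0 * np.abs(np.trapz(signal * ref, t) / (t[-1] - t[0]))


omega = 2 * np.pi * 250e3                 # ~250 kHz tapping frequency
t = np.linspace(0.0, 20 / 250e3, 20001)   # 20 full tapping periods
A, z0 = 50e-9, 5e-9                       # tapping amplitude, minimum tip-sample gap
z = z0 + A * (1 + np.cos(omega * t))      # sinusoidal tip height

s = 1.0 / (z / 1e-9 + 1.0) ** 2           # toy nonlinear near-field response

for n in (1, 2, 3):
    print(f"S_{n} = {demodulate(s, t, omega, n):.3e}")
```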

  13. Rigorous constraints on the matrix elements of the energy–momentum tensor

    Directory of Open Access Journals (Sweden)

    Peter Lowdon

    2017-11-01

    Full Text Available The structure of the matrix elements of the energy–momentum tensor plays an important role in determining the properties of the form factors A(q²), B(q²) and C(q²) which appear in the Lorentz covariant decomposition of the matrix elements. In this paper we apply a rigorous frame-independent distributional-matching approach to the matrix elements of the Poincaré generators in order to derive constraints on these form factors as q→0. In contrast to the literature, we explicitly demonstrate that the vanishing of the anomalous gravitomagnetic moment B(0) and the condition A(0)=1 are independent of one another, and that these constraints are not related to the specific properties or conservation of the individual Poincaré generators themselves, but are in fact a consequence of the physical on-shell requirement of the states in the matrix elements and the manner in which these states transform under Poincaré transformations.
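
    Schematically, and in one common convention for spin-½ states (normalizations and symmetrization factors differ across the literature, so this is quoted only for orientation, not as the paper's own definition), the decomposition reads:

```latex
% Schematic form; P = (p + p')/2, q = p' - p, a^{\{\mu} b^{\nu\}} denotes symmetrization.
\langle p' | T^{\mu\nu}(0) | p \rangle =
\bar{u}(p') \Big[
    A(q^2) \, \gamma^{\{\mu} P^{\nu\}}
  + B(q^2) \, \frac{i \, P^{\{\mu} \sigma^{\nu\}\rho} q_{\rho}}{2M}
  + C(q^2) \, \frac{q^{\mu} q^{\nu} - g^{\mu\nu} q^2}{M}
\Big] u(p)
```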

  14. Rigorous, robust and systematic: Qualitative research and its contribution to burn care. An integrative review.

    Science.gov (United States)

    Kornhaber, Rachel Anne; de Jong, A E E; McLean, L

    2015-12-01

    Qualitative methods are progressively being implemented by researchers for exploration within healthcare. However, there has been a longstanding and wide-ranging debate concerning the relative merits of qualitative research within the health care literature. This integrative review aimed to examine the contribution of qualitative research in burns care and subsequent rehabilitation. Studies were identified using an electronic search strategy using the databases PubMed, Cumulative Index of Nursing and Allied Health Literature (CINAHL), Excerpta Medica database (EMBASE) and Scopus for peer reviewed primary research in English between 2009 and April 2014, using Whittemore and Knafl's integrative review method as a guide for analysis. From the 298 papers identified, 26 research papers met the inclusion criteria. Across all studies there was an average of 22 participants per study (range 6-53); the studies were conducted across 12 nations and focussed on burns prevention, paediatric burns, appropriate acquisition and delivery of burns care, pain and psychosocial implications of burns trauma. Careful and rigorous application of qualitative methodologies promotes and enriches the development of burns knowledge. In particular, the key elements in the qualitative methodological process and its publication are critical in disseminating credible and methodologically sound qualitative research. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.

  15. Rigorous analysis of image force barrier lowering in bounded geometries: application to semiconducting nanowires

    International Nuclear Information System (INIS)

    Calahorra, Yonatan; Mendels, Dan; Epstein, Ariel

    2014-01-01

    Bounded geometries introduce a fundamental problem in calculating the image force barrier lowering of metal-wrapped semiconductor systems. In bounded geometries, the derivation of the barrier lowering requires calculating the reference energy of the system, when the charge is at the geometry center. In the following, we formulate and rigorously solve this problem; this allows combining the image force electrostatic potential with the band diagram of the bounded geometry. The suggested approach is applied to spheres as well as cylinders. Furthermore, although the expressions governing cylindrical systems are complex and can only be evaluated numerically, we present analytical approximations for the solution, which allow easy implementation in calculated band diagrams. The results are further used to calculate the image force barrier lowering of metal-wrapped cylindrical nanowires; calculations show that although the image force potential is stronger than that of planar systems, taking the complete band-structure into account results in a weaker effect of barrier lowering. Moreover, when considering small diameter nanowires, we find that the electrostatic effects of the image force exceed the barrier region, and influence the electronic properties of the nanowire core. This study is of interest to the nanowire community, and in particular for the analysis of nanowire I−V measurements where wrapped or omega-shaped metallic contacts are used. (paper)
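
    As a point of comparison for the bounded-geometry results, the familiar planar (textbook) expression for image-force barrier lowering at a metal-semiconductor contact is recalled below; it is quoted here for orientation only:

```latex
% Standard planar Schottky image-force barrier lowering (textbook result).
\Delta\phi = \sqrt{\frac{q \, E_{\max}}{4 \pi \varepsilon_s}}
% q: elementary charge, E_max: electric field at the interface,
% \varepsilon_s: semiconductor permittivity.
```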

  16. Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows

    Science.gov (United States)

    Qi, Di; Majda, Andrew J.

    2018-04-01

    Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.

  17. Diffraction-based overlay measurement on dedicated mark using rigorous modeling method

    Science.gov (United States)

    Lu, Hailiang; Wang, Fan; Zhang, Qingyun; Chen, Yonghui; Zhou, Chang

    2012-03-01

    Diffraction Based Overlay (DBO) has been widely evaluated by numerous authors, and results show that DBO can provide better performance than Imaging Based Overlay (IBO). However, DBO has its own problems. As is well known, modeling-based DBO (mDBO) faces challenges of low measurement sensitivity and crosstalk between various structure parameters, which may result in poor accuracy and precision. Meanwhile, the main obstacle encountered by empirical DBO (eDBO) is that a few pads must be employed to gain sufficient information on overlay-induced diffraction signature variations, which consumes more wafer space and costs more measuring time. Also, eDBO may suffer from mark profile asymmetry caused by processes. In this paper, we propose an alternative DBO technology that employs a dedicated overlay mark and takes a rigorous modeling approach. This technology needs only two or three pads for each direction, which is economical and time saving. While reducing overlay measurement error induced by mark profile asymmetry, this technology is expected to be as accurate and precise as scatterometry technologies.
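
    For context, the reason eDBO needs several pads is usually explained through the linear-response approximation below, in which pads carry programmed offsets ±d and the overlay is extracted from the measured signal asymmetries; this is the commonly cited relation, not a formula taken from this paper:

```latex
% Two-pad eDBO extraction under the usual linear-response assumption.
A_{\pm} = K \,(\mathrm{OV} \pm d)
\;\;\Longrightarrow\;\;
\mathrm{OV} = d \, \frac{A_{+} + A_{-}}{A_{+} - A_{-}}
% A_\pm: measured asymmetry of the pads with programmed offsets +d and -d,
% K: unknown proportionality constant that cancels in the ratio.
```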

  18. Estimation of the convergence order of rigorous coupled-wave analysis for OCD metrology

    Science.gov (United States)

    Ma, Yuan; Liu, Shiyuan; Chen, Xiuguo; Zhang, Chuanwei

    2011-12-01

    In most cases of optical critical dimension (OCD) metrology, when applying rigorous coupled-wave analysis (RCWA) to optical modeling, a high order of Fourier harmonics is usually set up to guarantee the convergence of the final results. However, the total number of floating point operations grows dramatically as the truncation order increases. Therefore, it is critical to choose an appropriate order to obtain high computational efficiency without losing much accuracy in the meantime. In this paper, the convergence order associated with the structural and optical parameters has been estimated through simulation. The results indicate that the convergence order is linear with the period of the sample when fixing the other parameters, both for planar diffraction and conical diffraction. The illuminated wavelength also affects the convergence of a final result. With further investigations concentrated on the ratio of illuminated wavelength to period, it is discovered that the convergence order decreases with the growth of the ratio, and when the ratio is fixed, convergence order jumps slightly, especially in a specific range of wavelength. This characteristic could be applied to estimate the optimum convergence order of given samples to obtain high computational efficiency.
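
    The reported scaling (truncation order roughly linear in the period and decreasing with the wavelength-to-period ratio) suggests a simple heuristic for choosing a starting order, sketched below; the prefactor is a made-up tuning constant for illustration and is not a value from this study.

```python
# Illustrative sketch only: heuristic starting truncation order for RCWA,
# consistent with an order that scales like period / wavelength.

def suggested_truncation_order(period_nm: float, wavelength_nm: float, c: float = 10.0) -> int:
    """Heuristic initial Fourier truncation order (to be refined by convergence tests)."""
    ratio = wavelength_nm / period_nm
    return max(3, round(c / ratio))   # grows with period, shrinks with wavelength/period

for wl in (300, 500, 800):
    print(wl, "nm ->", suggested_truncation_order(period_nm=400, wavelength_nm=wl))
```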

  19. Conformational distributions and proximity relationships in the rigor complex of actin and myosin subfragment-1.

    Science.gov (United States)

    Nyitrai, M; Hild, G; Lukács, A; Bódis, E; Somogyi, B

    2000-01-28

    Cyclic conformational changes in the myosin head are considered essential for muscle contraction. We hereby show that the extension of the fluorescence resonance energy transfer method described originally by Taylor et al. (Taylor, D. L., Reidler, J., Spudich, J. A., and Stryer, L. (1981) J. Cell Biol. 89, 362-367) allows determination of the position of a labeled point outside the actin filament in supramolecular complexes and also characterization of the conformational heterogeneity of an actin-binding protein while considering donor-acceptor distance distributions. Using this method we analyzed proximity relationships between two labeled points of S1 and the actin filament in the acto-S1 rigor complex. The donor (N-[[(iodoacetyl)amino]ethyl]-5-naphthylamine-1-sulfonate) was attached to either the catalytic domain (Cys-707) or the essential light chain (Cys-177) of S1, whereas the acceptor (5-(iodoacetamido)fluorescein) was attached to the actin filament (Cys-374). In contrast to the narrow positional distribution (assumed as being Gaussian) of Cys-707 (5 +/- 3 A), the positional distribution of Cys-177 was found to be broad (102 +/- 4 A). Such a broad positional distribution of the label on the essential light chain of S1 may be important in accommodating the helically arranged acto-myosin binding relative to the filament axis.
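
    The distance information in such measurements comes from the standard Förster relation between transfer efficiency and donor-acceptor separation, recalled below as background (the distance-distribution analysis in the paper averages this relation over a distribution of distances):

```latex
% Standard Forster relation (background, not specific to this paper).
E = \frac{1}{1 + (r / R_0)^{6}}
\qquad\Longleftrightarrow\qquad
r = R_0 \left( \frac{1}{E} - 1 \right)^{1/6}
% E: transfer efficiency, r: donor-acceptor distance, R_0: Forster radius of the pair.
```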

  20. Is writing style predictive of scientific fraud?

    DEFF Research Database (Denmark)

    Braud, Chloé Elodie; Søgaard, Anders

    2017-01-01

    The problem of detecting scientific fraud using machine learning was recently introduced, with initial, positive results from a model taking into account various general indicators. The results seem to suggest that writing style is predictive of scientific fraud. We revisit these initial experiments, and show that the leave-one-out testing procedure they used likely leads to a slight over-estimate of the predictability, but also that simple models can outperform their proposed model by some margin. We go on to explore more abstract linguistic features, such as linguistic complexity...
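
    For readers unfamiliar with the procedure under discussion, the sketch below shows a leave-one-out evaluation of a simple text classifier; the documents, labels and model are toy stand-ins, not the corpus or system used in the paper.

```python
# Illustrative sketch only: leave-one-out cross-validation of a toy text classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline

docs = [
    "results were replicated independently",
    "data available upon request",
    "novel groundbreaking unprecedented findings",
    "methods described in full detail",
]
labels = [0, 1, 1, 0]   # toy labels (1 = fraudulent), invented for illustration

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, docs, labels, cv=LeaveOneOut())
print("Leave-one-out accuracy:", scores.mean())
```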

  1. Time, science and consensus: the different times involving scientific research, political decision and public opinion

    Directory of Open Access Journals (Sweden)

    José Aparecido de

    2010-01-01

    Full Text Available This essay analyses the asymmetrical relationship between the time of scientific research and the time of the different segments interested in its results, focusing mainly on the need to establish technical consensus about fields of science that require rigorous investigation and texts. In recent years, civil society sectors - mainly scientific journalism, the legislative power, and public opinion - have shown growing interest in participating in the decision-making process that regulates the routes of science. In this study, we analyzed the decision-making process of the Biosafety Law, which allows research with embryonic stem cells in Brazil. The results allow us to conclude that this asymmetrical relationship between the different times (of science, scientific disclosure, public opinion, and public power) contributes to the maturing of the dialogue on scientific policies, as well as to the establishment of a consensus concerning the routes of science, aiming at the democratization of scientific work.

  2. Scientific Programming in Fortran

    Directory of Open Access Journals (Sweden)

    W. Van Snyder

    2007-01-01

    Full Text Available The Fortran programming language was designed by John Backus and his colleagues at IBM to reduce the cost of programming scientific applications. IBM delivered the first compiler for its model 704 in 1957. IBM's competitors soon offered incompatible versions. ANSI (ASA at the time) developed a standard, largely based on IBM's Fortran IV, in 1966. Revisions of the standard were produced in 1977, 1990, 1995 and 2003. Development of a revision, scheduled for 2008, is under way. Unlike most other programming languages, Fortran is periodically revised to keep pace with developments in language and processor design, while revisions largely preserve compatibility with previous versions. Throughout, the focus on scientific programming, and especially on efficient generated programs, has been maintained.

  3. 1997 Scientific Report

    International Nuclear Information System (INIS)

    Govaerts, P.

    1998-01-01

    The 1997 Scientific Report of the Belgian Nuclear Research Centre SCK-CEN describes progress achieved in nuclear safety, radioactive waste management, radiation protection and safeguards. In the field of nuclear research, the main projects concern the behaviour of high-burnup and MOX fuel, the embrittlement of reactor pressure vessels, the irradiation-assisted stress corrosion cracking of reactor internals, and irradiation effects on materials of fusion reactors. In the field of radioactive waste management, progress in the following domains is reported: the disposal of high-level radioactive waste and spent fuel in a clay formation, the decommissioning of nuclear installations, the study of alternative waste-processing techniques. For radiation protection and safeguards, the main activities reported on are in the field of site and environmental restoration, emergency planning and response and scientific support to national and international programmes

  4. Scientific report 1999

    International Nuclear Information System (INIS)

    1999-01-01

    The aim of this report is to outline the main developments of the 'Departement des Reacteurs Nucleaires' (DRN) during the year 1999. DRN is one of the CEA Institutions. This report is divided into three main parts: the DRN scientific programs, the scientific and technical publications (with abstracts in English) and economic data on staff, budget and communication. Main results of the Department for the year 1999 are presented, giving information on the simulation of low Mach number compressible flow, experimental irradiation of multi-materials, progress in the dry route conversion process of UF6 to UO2, the neutronics, the CASCADE installation, the corium, the BWR type reactor cores technology, the reactor safety, the transmutation of americium and fuel cell flow studies, the crack propagation, the hybrid systems and the CEA sites improvement. (A.L.B.)

  5. Scientific publications in Nepal.

    Science.gov (United States)

    Magar, A

    2012-09-01

    Scientific publications have become a mainstay of communication among readers, academicians, researchers and scientists worldwide. Although their existence dates back to the 17th century in the West, Nepal has been struggling to take even a few steps towards improving its local science for the last 50 years. Since the start of the first medical journal in 1963, the challenges remain as they were decades ago regarding the role of authors, peer reviewers, editors and even publishers in Nepal. Although there has been some development in terms of the number of articles being published and the appearance of journals, there is still a long way to go. This article analyzes the past and present scenario, and future perspectives for scientific publications in Nepal.

  6. Sherlock Holmes: scientific detective.

    Science.gov (United States)

    Snyder, Laura J

    2004-09-01

    Sherlock Holmes was intended by his creator, Arthur Conan Doyle, to be a 'scientific detective'. Conan Doyle criticized his predecessor Edgar Allan Poe for giving his creation - Inspector Dupin - only the 'illusion' of scientific method. Conan Doyle believed that he had succeeded where Poe had failed; thus, he has Watson remark that Holmes has 'brought detection as near an exact science as it will ever be brought into the world.' By examining Holmes' methods, it becomes clear that Conan Doyle modelled them on certain images of science that were popular in mid- to late-19th century Britain. Contrary to a common view, it is also evident that rather than being responsible for the invention of forensic science, the creation of Holmes was influenced by the early development of it.

  7. Collaboration in scientific practice

    DEFF Research Database (Denmark)

    Wagenknecht, Susann

    2014-01-01

    This monograph investigates the collaborative creation of scientific knowledge in research groups. To do so, I combine philosophical analysis with a first-hand comparative case study of two research groups in experimental science. Qualitative data are gained through observation and interviews…, and I combine empirical insights with existing approaches to knowledge creation in philosophy of science and social epistemology. On the basis of my empirically-grounded analysis I make several conceptual contributions. I study scientific collaboration as the interaction of scientists within research… to their publication. Specifically, I suggest epistemic difference and the porosity of social structure as two conceptual leitmotifs in the study of group collaboration. With epistemic difference, I emphasize the value of socio-cognitive heterogeneity in group collaboration. With porosity, I underline the fact

  8. Scientific report 1998

    International Nuclear Information System (INIS)

    1998-01-01

    The aim of this report is to outline the main developments of the 'Departement des Reacteurs Nucleaires' (DRN) during the year 1998. DRN is one of the CEA Institutions. This report is divided into three main parts: the DRN scientific programs, the scientific and technical publications (with abstracts in English) and economic data on staff, budget and communication. Main results of the Department for the year 1998 are presented, giving information on the reactors technology and safety, the neutronics, the transmutation and the hybrid systems, the dismantling and the sites improvement, the nuclear accidents, the nuclear matter transport, the thermonuclear fusion safety, the fuel cladding materials and radioactive waste control. (A.L.B.)

  9. Scientific Resource EXplorer

    Science.gov (United States)

    Xing, Z.; Wormuth, A.; Smith, A.; Arca, J.; Lu, Y.; Sayfi, E.

    2014-12-01

    Inquisitive minds in our society are never satisfied with curated images released by a typical public affairs office. They always want to look deeper and play directly on original data. However, most scientific data products are notoriously hard to use. They are immensely large, highly distributed and diverse in format. In this presentation, we will demonstrate Resource EXplorer (REX), a novel webtop application that allows anyone to conveniently explore and visualize rich scientific data repositories, using only a standard web browser. This tool leverages the power of Webification Science (w10n-sci), a powerful enabling technology that simplifies the use of scientific data on the web platform. W10n-sci is now being deployed at an increasing number of NASA data centers, some of which are the largest digital treasure troves in our nation. With REX, these wonderful scientific resources are open for teachers and students to learn and play.

  10. Professional scientific blog

    Directory of Open Access Journals (Sweden)

    Tamás Beke

    2009-03-01

    Full Text Available The professional blog is a weblog that on the whole meets the requirements of scientific publication. In my opinion it resembles a digital notice board, where the competent specialists of a given branch of science can place their ideas, questions and possible solutions, and can raise problems. Its most important function can be the collectivization of knowledge. In this article I am going to examine the characteristics of the scientific blog as a genre. Conventional learning counts as a rather solitary activity. If students have access to the materials of each other and of the teacher, their sense of solitude diminishes, and this model is also closer to the constructivist approach that reflects the way most people think and learn. Learning does not mean passively collecting tiny pieces of knowledge; it much more resembles 'spinning a conceptual net' made up of the experiences and observations of the individual. With the spread of the Internet, more universities and colleges worldwide have tried online educational methods, but the most efficient one has not been found yet. The publication of the curriculum (the material of the lectures) and the handling of electronic mail are not sufficient; much more is needed for collaborative learning. Our scholastic scientific blog can be a suitable field for the start of a knowledge-building process based on cooperation. The Rocard report states that for the future of Europe it is crucial to develop the education of the natural sciences, and for this it is necessary to act at local, regional, national and EU level. Beyond the traditional actors (child, parent, teacher), others should also be involved in the educational processes (scientists, professionals, universities, local institutions, the actors of the economic sphere, etc.). The scholastic scientific blog answers this purpose as a collaborative knowledge-sharing forum.

  11. Scientific Technological Report 2002

    International Nuclear Information System (INIS)

    Gayoso C, C.; Cuya G, T.; Robles N, A.; Prado C, A.

    2003-07-01

    This annual scientific-technological report provides an overview of research and development activities at the Peruvian Institute of Nuclear Energy (IPEN) during the period from 1 January to 31 December 2002. The report includes 58 papers divided into 10 subject areas: physics and nuclear chemistry, nuclear engineering, materials, industrial applications, biological applications, medical applications, environmental applications, protection and radiological safety, nuclear safety, and management aspects.

  12. Evaluating a scientific collaboratory

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.; Whitton, Mary C.; Maglaughlin, Kelly L.

    2003-01-01

    of the system, and post-interviews to understand the participants' views of doing science under both conditions. We hypothesized that study participants would be less effective, report more difficulty, and be less favorably inclined to adopt the system when collaborating remotely. Contrary to expectations...... of collaborating remotely. While the data analysis produced null results, considered as a whole, the analysis leads us to conclude there is positive potential for the development and adoption of scientific collaboratory systems....

  13. National nuclear scientific program

    International Nuclear Information System (INIS)

    Plecas, I.; Matausek, M.V.; Neskovic, N.

    2001-01-01

    The national scientific program of the Vinca Institute, ''Nuclear Reactors and Radioactive Waste'', comprises research and development in the following fields: application of the energy of nuclear fission, application of neutron beams, and analyses of nuclear safety and radiation protection. In the first phase, preparatory activities, conceptual design and the design of certain processes and facilities are to be accomplished. In the second phase, realization of the projects is expected. (author)

  14. PROSCENIUM OF SCIENTIFIC MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Vasile Berlingher

    2013-09-01

    Full Text Available During the last three decades of the nineteenth century, organizations developed rapidly, and their managers began to realize that they faced managerial problems too frequently; this awareness led to a new phase in the development of scientific management. Examining the titles published in that period, it can be concluded that the management issues of interest related to payroll and payroll systems, problems exacerbated by the industrial revolution, and work efficiency. Noting that large organizations were losing the power of direct supervision, managers were looking for incentives to replace this power. One of the first practitioners of this new management system was Henry R. Towne, the president of the well-known enterprise "Yale and Towne Manufacturing Company", who applied the management methods in his company's workshops. The publishers of the magazines "Industrial Management" and "The Engineering Magazine" stated that H. R. Towne is, undisputedly, the pioneer of scientific management. He initiated the systematic application of effective management methods, and his famous article "The Engineer as Economist", presented to the "American Society of Mechanical Engineers" in 1886, was probably what inspired Frederick W. Taylor to devote his entire life and work to scientific management.

  15. The next scientific revolution.

    Science.gov (United States)

    Hey, Tony

    2010-11-01

    For decades, computer scientists have tried to teach computers to think like human experts. Until recently, most of those efforts have failed to come close to generating the creative insights and solutions that seem to come naturally to the best researchers, doctors, and engineers. But now, Tony Hey, a VP of Microsoft Research, says we're witnessing the dawn of a new generation of powerful computer tools that can "mash up" vast quantities of data from many sources, analyze them, and help produce revolutionary scientific discoveries. Hey and his colleagues call this new method of scientific exploration "machine learning." At Microsoft, a team has already used it to innovate a method of predicting with impressive accuracy whether a patient with congestive heart failure who is released from the hospital will be readmitted within 30 days. It was developed by directing a computer program to pore through hundreds of thousands of data points on 300,000 patients and "learn" the profiles of patients most likely to be rehospitalized. The economic impact of this prediction tool could be huge: If a hospital understands the likelihood that a patient will "bounce back," it can design programs to keep him stable and save thousands of dollars in health care costs. Similar efforts to uncover important correlations that could lead to scientific breakthroughs are under way in oceanography, conservation, and AIDS research. And in business, deep data exploration has the potential to unearth critical insights about customers, supply chains, advertising effectiveness, and more.
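
    As a rough, self-contained illustration of the kind of learning-from-data workflow described here (and not the actual system built at Microsoft), the sketch below fits a simple classifier to synthetic patient records and scores the risk that a new patient is readmitted within 30 days; every feature, coefficient and data point is made up.

```python
# Toy illustration of a readmission-risk model on synthetic data.
# Nothing here reproduces the system described in the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Hypothetical features: age, prior admissions, length of stay (days).
X = np.column_stack([
    rng.normal(70, 10, n),          # age
    rng.poisson(1.5, n),            # prior admissions in past year
    rng.gamma(2.0, 2.5, n),         # length of stay
])
# Synthetic ground truth: risk rises with prior admissions and stay length.
logit = -4.0 + 0.02 * X[:, 0] + 0.6 * X[:, 1] + 0.15 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
print("30-day readmission risk for one new patient:",
      model.predict_proba([[80, 3, 7.0]])[0, 1])
```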

  16. The paradox of scientific expertise

    DEFF Research Database (Denmark)

    Alrøe, Hugo Fjelsted; Noe, Egon

    2011-01-01

    Modern societies depend on a growing production of scientific knowledge, which is based on the functional differentiation of science into still more specialised scientific disciplines and subdisciplines. This is the basis for the paradox of scientific expertise: the growth of science leads to a fragmentation of scientific expertise. To resolve this paradox, the present paper investigates three hypotheses: 1) All scientific knowledge is perspectival. 2) The perspectival structure of science leads to specific forms of knowledge asymmetries. 3) Such perspectival knowledge asymmetries must be handled … in cross-disciplinary research and in the collective use of different kinds of scientific expertise, and thereby make society better able to solve complex, real-world problems.

  17. Robust Trypsin Coating on Electrospun Polymer Nanofibers in Rigorous Conditions and Its Uses for Protein Digestion

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Hye-Kyung; Kim, Byoung Chan; Jun, Seung-Hyun; Chang, Mun Seock; Lopez-Ferrer, Daniel; Smith, Richard D.; Gu, Man Bock; Lee, Sang-Won; Kim, Beom S.; Kim, Jungbae

    2010-12-15

    Efficient protein digestion in proteomic analysis requires the stabilization of proteases such as trypsin. In the present work, trypsin was stabilized in the form of an enzyme coating on electrospun polymer nanofibers (EC-TR), which crosslinks additional trypsin molecules onto covalently attached trypsin (CA-TR). EC-TR showed better stability than CA-TR under rigorous conditions, such as high temperatures of 40 °C and 50 °C, the presence of organic co-solvents, and various pH values. For example, the half-lives of CA-TR and EC-TR were 0.24 and 163.20 hours at 40 °C, respectively. The improved stability of EC-TR can be explained by the covalent linkages on the surface of the trypsin molecules, which effectively inhibit the denaturation, autolysis, and leaching of trypsin. Protein digestion was performed at 40 °C using both CA-TR and EC-TR to digest a model protein, enolase. EC-TR showed better performance and stability than CA-TR, maintaining good enolase digestion performance over repeated uses for a period of one week. Under the same conditions, CA-TR showed poor performance from the beginning and could not be used for digestion at all after a few uses. The enzyme coating approach is anticipated to be successfully employed not only for protein digestion in proteomic analysis but also in various other fields where poor enzyme stability presently hampers the practical application of enzymes.

  18. Rigorous construction and Hadamard property of the Unruh state in Schwarzschild spacetime

    International Nuclear Information System (INIS)

    Dappiaggi, Claudio; Pinamonti, Nicola

    2009-07-01

    The discovery of the radiation properties of black holes prompted the search for a natural candidate quantum ground state for a massless scalar field theory on Schwarzschild spacetime, here considered in the Eddington-Finkelstein representation. Among the several available proposals in the literature, an important physical role is played by the so-called Unruh state which is supposed to be appropriate to capture the physics of a black hole formed by spherically symmetric collapsing matter. Within this respect, we shall consider a massless Klein-Gordon field and we shall rigorously and globally construct such state, that is on the algebra of Weyl observables localised in the union of the static external region, the future event horizon and the non-static black hole region. Eventually, out of a careful use of microlocal techniques, we prove that the built state fulfils, where defined, the so-called Hadamard condition; hence, it is perturbatively stable, in other words realizing the natural candidate with which one could study purely quantum phenomena such as the role of the back reaction of Hawking's radiation. From a geometrical point of view, we shall make a profitable use of a bulk-to-boundary reconstruction technique which carefully exploits the Killing horizon structure as well as the conformal asymptotic behaviour of the underlying background. From an analytical point of view, our tools will range from Hoermander's theorem on propagation of singularities, results on the role of passive states, and a detailed use of the recently discovered peeling behaviour of the solutions of the wave equation in Schwarzschild spacetime. (orig.)

  19. Rigor mortis at the myocardium investigated by post-mortem magnetic resonance imaging.

    Science.gov (United States)

    Bonzon, Jérôme; Schön, Corinna A; Schwendener, Nicole; Zech, Wolf-Dieter; Kara, Levent; Persson, Anders; Jackowski, Christian

    2015-12-01

    Post-mortem cardiac MR exams present with different contraction appearances of the left ventricle in cardiac short axis images. It was hypothesized that the grade of post-mortem contraction may be related to the post-mortem interval (PMI) or cause of death and a phenomenon caused by internal rigor mortis that may give further insights in the circumstances of death. The cardiac contraction grade was investigated in 71 post-mortem cardiac MR exams (mean age at death 52 y, range 12-89 y; 48 males, 23 females). In cardiac short axis images the left ventricular lumen volume as well as the left ventricular myocardial volume were assessed by manual segmentation. The quotient of both (LVQ) represents the grade of myocardial contraction. LVQ was correlated to the PMI, sex, age, cardiac weight, body mass and height, cause of death and pericardial tamponade when present. In cardiac causes of death a separate correlation was investigated for acute myocardial infarction cases and arrhythmic deaths. LVQ values ranged from 1.99 (maximum dilatation) to 42.91 (maximum contraction) with a mean of 15.13. LVQ decreased slightly with increasing PMI, however without significant correlation. Pericardial tamponade positively correlated with higher LVQ values. Variables such as sex, age, body mass and height, cardiac weight and cause of death did not correlate with LVQ values. There was no difference in LVQ values for myocardial infarction without tamponade and arrhythmic deaths. Based on the observation in our investigated cases, the phenomenon of post-mortem myocardial contraction cannot be explained by the influence of the investigated variables, except for pericardial tamponade cases. Further research addressing post-mortem myocardial contraction has to focus on other, less obvious factors, which may influence the early post-mortem phase too. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. A Generic Model for Relative Adjustment Between Optical Sensors Using Rigorous Orbit Mechanics

    Directory of Open Access Journals (Sweden)

    B. Islam

    2008-06-01

    Full Text Available Classical calibration, or space resection, is a fundamental task in photogrammetry. A lack of sufficient knowledge of the interior and exterior orientation parameters leads to unreliable results in the photogrammetric process. One of the earliest approaches used in photogrammetry was the plumb-line calibration method. This method is suitable for recovering the radial and decentering lens distortion coefficients, while the remaining interior (focal length and principal point coordinates) and exterior orientation parameters have to be determined by a complementary method. As the lens distortion is very small, it is not considered among the interior orientation parameters in the present rigorous sensor model. There are several other available methods based on the photogrammetric collinearity equations, which consider the determination of the exterior orientation parameters, with no mention of the simultaneous determination of the interior orientation parameters. Normal space resection methods solve the problem using control points whose coordinates are known in both the image and object reference systems. The non-linearity of the model, the difficulty of point location in digital images, and the identification of a sufficient number of GPS-measured control points are the main drawbacks of the classical approaches. This paper addresses a mathematical model based on the fundamental assumption of collinearity of three points belonging to two Along-Track Stereo imagery sensors and an independent object point. Assuming this condition, it is possible to extract the exterior orientation (EO) parameters for a long strip and a single image together, with and without using control points. Moreover, after extracting the EO parameters, the accuracy of the satellite data products is compared when using a single control point and when using no control points.
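
    Several of the methods surveyed above build on the photogrammetric collinearity condition. For orientation, one common textbook form of the collinearity equations is sketched below; sign and rotation-matrix conventions vary between authors, and this is general background rather than the specific model proposed in the paper.

```latex
% Collinearity equations (one common convention): image point (x, y),
% principal point (x_0, y_0), focal length f, object point (X, Y, Z),
% perspective centre (X_S, Y_S, Z_S), rotation matrix R = (r_{ij}).
\[
\begin{aligned}
x - x_0 &= -f\,\frac{r_{11}(X - X_S) + r_{12}(Y - Y_S) + r_{13}(Z - Z_S)}
                    {r_{31}(X - X_S) + r_{32}(Y - Y_S) + r_{33}(Z - Z_S)},\\[4pt]
y - y_0 &= -f\,\frac{r_{21}(X - X_S) + r_{22}(Y - Y_S) + r_{23}(Z - Z_S)}
                    {r_{31}(X - X_S) + r_{32}(Y - Y_S) + r_{33}(Z - Z_S)}.
\end{aligned}
\]
```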

  1. Bounding Averages Rigorously Using Semidefinite Programming: Mean Moments of the Lorenz System

    Science.gov (United States)

    Goluskin, David

    2018-04-01

    We describe methods for proving bounds on infinite-time averages in differential dynamical systems. The methods rely on the construction of nonnegative polynomials with certain properties, similarly to the way nonlinear stability can be proved using Lyapunov functions. Nonnegativity is enforced by requiring the polynomials to be sums of squares, a condition which is then formulated as a semidefinite program (SDP) that can be solved computationally. Although such computations are subject to numerical error, we demonstrate two ways to obtain rigorous results: using interval arithmetic to control the error of an approximate SDP solution, and finding exact analytical solutions to relatively small SDPs. Previous formulations are extended to allow for bounds depending analytically on parametric variables. These methods are illustrated using the Lorenz equations, a system with three state variables (x, y, z) and three parameters (β, σ, r). Bounds are reported for infinite-time averages of all eighteen moments x^l y^m z^n up to quartic degree that are symmetric under (x, y) ↦ (−x, −y). These bounds apply to all solutions regardless of stability, including chaotic trajectories, periodic orbits, and equilibrium points. The analytical approach yields two novel bounds that are sharp: the mean of z^3 can be no larger than its value of (r−1)^3 at the nonzero equilibria, and the mean of xy^3 must be nonnegative. The interval arithmetic approach is applied at the standard chaotic parameters to bound eleven average moments that all appear to be maximized on the shortest periodic orbit. Our best upper bound on each such average exceeds its value on the maximizing orbit by less than 1%. Many bounds reported here are much tighter than would be possible without computer assistance.
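
    The bounding framework referred to here can be summarised in a generic, schematic form (not the paper's exact formulation): for a system dx/dt = f(x) with bounded trajectories and an observable Φ, any differentiable auxiliary function V certifies an upper bound U on every infinite-time average of Φ, provided a pointwise inequality holds; sum-of-squares programming is what makes that inequality computationally checkable.

```latex
% Generic auxiliary-function bound (schematic). For \dot{x} = f(x) with bounded
% trajectories, the time average of dV(x(t))/dt = f(x)\cdot\nabla V(x) vanishes, so
\[
\overline{\Phi} := \limsup_{T\to\infty}\frac{1}{T}\int_0^T \Phi\bigl(x(t)\bigr)\,dt
\;\le\; U
\quad\text{whenever}\quad
U - \Phi(x) - f(x)\cdot\nabla V(x) \ge 0 \ \text{for all } x.
\]
% Requiring the left-hand side of the inequality to be a sum of squares
% turns the search for (V, U) into a semidefinite program.
```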

  2. Analysis of specular resonance in dielectric bispheres using rigorous and geometrical-optics theories.

    Science.gov (United States)

    Miyazaki, Hideki T; Miyazaki, Hiroshi; Miyano, Kenjiro

    2003-09-01

    We have recently identified the resonant scattering from dielectric bispheres in the specular direction, which has long been known as the specular resonance, to be a type of rainbow (a caustic) and a general phenomenon for bispheres. We discuss the details of the specular resonance on the basis of systematic calculations. In addition to the rigorous theory, which precisely describes the scattering even in the resonance regime, the ray-tracing method, which gives the scattering in the geometrical-optics limit, is used. Specular resonance is explicitly defined as strong scattering in the direction of the specular reflection from the symmetrical axis of the bisphere whose intensity exceeds that of the scattering from noninteracting bispheres. Then the range of parameters for computing a particular specular resonance is specified. This resonance becomes prominent in a wide range of refractive indices (from 1.2 to 2.2) in a wide range of size parameters (from five to infinity) and for an arbitrarily polarized light incident within an angle of 40 degrees to the symmetrical axis. This particular scattering can stay evident even when the spheres are not in contact or the sizes of the spheres are different. Thus specular resonance is a common and robust phenomenon in dielectric bispheres. Furthermore, we demonstrate that various characteristic features in the scattering from bispheres can be explained successfully by using intuitive and simple representations. Most of the significant scatterings other than the specular resonance are also understandable as caustics in geometrical-optics theory. The specular resonance becomes striking at the smallest size parameter among these caustics because its optical trajectory is composed of only the refractions at the surfaces and has an exceptionally large intensity. However, some characteristics are not accounted for by geometrical optics. In particular, the oscillatory behaviors of their scattering intensity are well described by

  3. Unforgivable Sinners? Epistemological and Psychological Naturalism in Husserl’s Philosophy as a Rigorous Science

    Directory of Open Access Journals (Sweden)

    Andrea Sebastiano Staiti

    2012-01-01

    Full Text Available In this paper I present and assess Husserl's arguments against epistemological and psychological naturalism in his essay Philosophy as a Rigorous Science. I show that his critique is directed against positions that are generally more extreme than most currently debated variants of naturalism. However, Husserl has interesting thoughts to contribute to philosophy today. First, he shows that there is an important connection between naturalism in epistemology (which in his view amounts to the position that the validity of logic can be reduced to the validity of natural laws of thinking) and naturalism in psychology (which in his view amounts to the position that all psychic occurrences are merely parallel accompaniments of physiological occurrences). Second, he shows that a strong version of epistemological naturalism is self-undermining and fails to translate the cogency of logic into psychological terms. Third, and most importantly for current debates, he attacks Cartesianism as a form of psychological naturalism because of its construal of the psyche as a substance. Against this position, Husserl defends the necessity of formulating new epistemic aims for the investigation of consciousness. He contends that what is most interesting about consciousness is not its empirical fact but its transcendental function of granting cognitive access to all kinds of objects (both empirical and ideal). The study of this function requires a specific method (eidetics) that cannot be conflated with empirical methods. I conclude that Husserl's analyses offer much-needed insight into the fabric of consciousness and compelling arguments against unwarranted metaphysical speculations about the relationship between mind and body.

  4. A Development of Advanced Rigorous 2 Step System for the High Resolution Residual Dose Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Hyun; Kim, Jong Woo; Kim, Jea Hyun; Lee, Jae Yong; Shin, Chang Ho [Hanyang Univ., Seoul (Korea, Republic of); Kim, Song Hyun [Kyoto University, Sennan (Japan)

    2016-10-15

    These days, activation problems such as residual radiation are among the important issues. Activated devices and structures can emit residual radiation. Therefore, activation should be properly analyzed in order to plan the design, operation, and decontamination of nuclear facilities. For activation calculation, the Rigorous 2 Step (R2S) method is introduced with the following strategy: (1) a particle transport calculation is performed for the object geometry to obtain particle spectra and total fluxes; (2) the inventories of each cell are calculated using the flux information according to the irradiation and decay history; (3) the residual gamma distribution is evaluated with a transport code, if needed. This scheme is based on a cell-wise calculation of the geometry used. In this method, the particle spectra and total fluxes are obtained by mesh tallies for the activation calculation, which is useful for reducing the effects of flux gradients. Nevertheless, several limitations are known: first, the spectra have high relative errors when many meshes are used; second, the flux information differs from the spectrum of void regions in a mesh tally. To calculate high-resolution residual doses, several methods have been developed, such as R2Smesh and MCR2S with unstructured mesh. The R2Smesh method achieves better efficiency in obtaining neutron spectra by using fine/coarse meshes. Also, MCR2S with unstructured mesh can effectively separate the void spectrum. In this study, the AR2S system was developed to combine the features of these mesh-based R2S methods. To verify the AR2S system, a simple activation problem was evaluated and compared with the R2S method using the same division. The results are in good agreement, within 0.83%. Therefore, it is expected that the AR2S system can properly estimate activation problems.
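
    To make the two-step data flow concrete, here is a deliberately schematic sketch of an R2S-style loop. Every helper below (run_transport, compute_inventory, build_gamma_source, run_gamma_transport) is a trivial stub standing in for calls to real transport and activation codes; only the order in which data moves between the steps is meant to be illustrative.

```python
# Schematic R2S workflow: neutron transport -> per-cell activation inventory ->
# decay-gamma source -> gamma transport. The "codes" below are trivial stubs.
import random

def run_transport(geometry, mesh):
    """Stub neutron transport: fake a spectrum and total flux per mesh cell."""
    spectra = {cell: [random.random() for _ in range(5)] for cell in mesh}
    total_flux = {cell: sum(spectra[cell]) for cell in mesh}
    return spectra, total_flux

def compute_inventory(spectrum, flux, history, t_decay):
    """Stub activation solver: pretend the activity decays exponentially."""
    return {"activity": flux * history["fluence_factor"] * 0.5 ** (t_decay / 10.0)}

def build_gamma_source(cell, inventory):
    """Stub: turn a cell inventory into a (cell, source strength) pair."""
    return (cell, inventory["activity"])

def run_gamma_transport(geometry, sources):
    """Stub gamma transport: the 'dose' is just the summed source strength here."""
    return sum(strength for _, strength in sources)

def rigorous_two_step(geometry, mesh, history, decay_times):
    spectra, total_flux = run_transport(geometry, mesh)              # step 1
    dose_maps = {}
    for t_decay in decay_times:
        sources = [build_gamma_source(c, compute_inventory(spectra[c],
                   total_flux[c], history, t_decay)) for c in mesh]  # step 2
        dose_maps[t_decay] = run_gamma_transport(geometry, sources)  # step 3
    return dose_maps

print(rigorous_two_step("toy geometry", ["cell_a", "cell_b"],
                        {"fluence_factor": 1.0}, decay_times=[0.0, 10.0]))
```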

  5. Rigorous construction and Hadamard property of the Unruh state in Schwarzschild spacetime

    Energy Technology Data Exchange (ETDEWEB)

    Dappiaggi, Claudio; Pinamonti, Nicola [Hamburg Univ. (Germany). II. Inst. fuer Theoretische Physik; Moretti, Valter [Trento Univ., Povo (Italy). Dipt. di Matematica; Istituto Nazionale di Fisica Nucleare, Povo (Italy); Istituto Nazionale di Alta Matematica ' ' F. Severi' ' , GNFM, Sesto Fiorentino (Italy)

    2009-07-15

    The discovery of the radiation properties of black holes prompted the search for a natural candidate quantum ground state for a massless scalar field theory on Schwarzschild spacetime, here considered in the Eddington-Finkelstein representation. Among the several available proposals in the literature, an important physical role is played by the so-called Unruh state which is supposed to be appropriate to capture the physics of a black hole formed by spherically symmetric collapsing matter. Within this respect, we shall consider a massless Klein-Gordon field and we shall rigorously and globally construct such state, that is on the algebra of Weyl observables localised in the union of the static external region, the future event horizon and the non-static black hole region. Eventually, out of a careful use of microlocal techniques, we prove that the built state fulfils, where defined, the so-called Hadamard condition; hence, it is perturbatively stable, in other words realizing the natural candidate with which one could study purely quantum phenomena such as the role of the back reaction of Hawking's radiation. From a geometrical point of view, we shall make a profitable use of a bulk-to-boundary reconstruction technique which carefully exploits the Killing horizon structure as well as the conformal asymptotic behaviour of the underlying background. From an analytical point of view, our tools will range from Hoermander's theorem on propagation of singularities, results on the role of passive states, and a detailed use of the recently discovered peeling behaviour of the solutions of the wave equation in Schwarzschild spacetime. (orig.)

  6. Atlantic salmon skin and fillet color changes effected by perimortem handling stress, rigor mortis, and ice storage.

    Science.gov (United States)

    Erikson, U; Misimi, E

    2008-03-01

    The changes in skin and fillet color of anesthetized and exhausted Atlantic salmon were determined immediately after killing, during rigor mortis, and after ice storage for 7 d. Skin color (CIE L*, a*, b*, and related values) was determined by a Minolta Chroma Meter. Roche SalmoFan Lineal and Roche Color Card values were determined by a computer vision method and a sensory panel. Before color assessment, the stress levels of the 2 fish groups were characterized in terms of white muscle parameters (pH, rigor mortis, and core temperature). The results showed that perimortem handling stress initially significantly affected several color parameters of skin and fillets. Significant transient fillet color changes also occurred in the prerigor phase and during the development of rigor mortis. Our results suggested that fillet color was affected by postmortem glycolysis (pH drop, particularly in anesthetized fillets), then by onset and development of rigor mortis. The color change patterns during storage were different for the 2 groups of fish. The computer vision method was considered suitable for automated (online) quality control and grading of salmonid fillets according to color.
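
    For readers unfamiliar with the CIE L*, a*, b* system, the "related values" usually derived from a* and b* are chroma and hue angle; the standard colorimetric definitions (general background, not results of this study) are:

```latex
% Chroma and hue angle derived from CIELAB coordinates (standard colorimetry):
\[
C^{*}_{ab} = \sqrt{(a^{*})^{2} + (b^{*})^{2}},
\qquad
h_{ab} = \operatorname{atan2}\bigl(b^{*}, a^{*}\bigr).
\]
```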

  7. Post mortem rigor development in the Egyptian goose (Alopochen aegyptiacus) breast muscle (pectoralis): factors which may affect the tenderness.

    Science.gov (United States)

    Geldenhuys, Greta; Muller, Nina; Frylinck, Lorinda; Hoffman, Louwrens C

    2016-01-15

    Baseline research on the toughness of Egyptian goose meat is required. This study therefore investigates the post mortem pH and temperature decline (15 min-4 h 15 min post mortem) in the pectoralis muscle (breast portion) of this gamebird species. It also explores the enzyme activity of the Ca(2+)-dependent protease (calpain system) and the lysosomal cathepsins during the rigor mortis period. No differences were found for any of the variables between genders. The pH decline in the pectoralis muscle occurs quite rapidly (c = -0.806; ultimate pH ∼ 5.86) compared with other species and it is speculated that the high rigor temperature (>20 °C) may contribute to the increased toughness. No calpain I was found in Egyptian goose meat and the µ/m-calpain activity remained constant during the rigor period, while a decrease in calpastatin activity was observed. The cathepsin B, B & L and H activity increased over the rigor period. Further research into the connective tissue content and myofibrillar breakdown during aging is required in order to know if the proteolytic enzymes do in actual fact contribute to tenderisation. © 2015 Society of Chemical Industry.

  8. Is Collaborative, Community-Engaged Scholarship More Rigorous than Traditional Scholarship? On Advocacy, Bias, and Social Science Research

    Science.gov (United States)

    Warren, Mark R.; Calderón, José; Kupscznk, Luke Aubry; Squires, Gregory; Su, Celina

    2018-01-01

    Contrary to the charge that advocacy-oriented research cannot meet social science research standards because it is inherently biased, the authors of this article argue that collaborative, community-engaged scholarship (CCES) must meet high standards of rigor if it is to be useful to support equity-oriented, social justice agendas. In fact, they…

  9. Rigorous Line-Based Transformation Model Using the Generalized Point Strategy for the Rectification of High Resolution Satellite Imagery

    Directory of Open Access Journals (Sweden)

    Kun Hu

    2016-09-01

    Full Text Available High-precision geometric rectification of High Resolution Satellite Imagery (HRSI) is the basis of digital mapping and Three-Dimensional (3D) modeling. Taking advantage of line features as basic geometric control conditions instead of control points, the Line-Based Transformation Model (LBTM) provides a practical and efficient way of rectifying images. It can accurately build the mathematical relationship between image space and the corresponding object space while dramatically reducing the workload of ground control and feature recognition. Based on a generalization and analysis of existing LBTMs, a novel rigorous LBTM is proposed in this paper, which can further eliminate the geometric deformation caused by sensor inclination and terrain variation. This improved nonlinear LBTM is constructed based on a generalized point strategy and resolved by least-squares overall adjustment. Geo-positioning accuracy experiments with IKONOS, GeoEye-1 and ZiYuan-3 satellite imagery are performed to compare the rigorous LBTM with other relevant line-based and point-based transformation models. Both theoretical analysis and experimental results demonstrate that the rigorous LBTM is more accurate and reliable without adding extra ground control. The geo-positioning accuracy of satellite imagery rectified by the rigorous LBTM can reach about one pixel with eight control lines and can be further improved by optimizing the horizontal and vertical distribution of the control lines.

  10. Useful, Used, and Peer Approved: The Importance of Rigor and Accessibility in Postsecondary Research and Evaluation. WISCAPE Viewpoints

    Science.gov (United States)

    Vaade, Elizabeth; McCready, Bo

    2012-01-01

    Traditionally, researchers, policymakers, and practitioners have perceived a tension between rigor and accessibility in quantitative research and evaluation in postsecondary education. However, this study indicates that both producers and consumers of these studies value high-quality work and clear findings that can reach multiple audiences. The…

  11. Lessons learned from a rigorous peer-review process for building the Climate Literacy and Energy Awareness (CLEAN) collection of high-quality digital teaching materials

    Science.gov (United States)

    Gold, A. U.; Ledley, T. S.; McCaffrey, M. S.; Buhr, S. M.; Manduca, C. A.; Niepold, F.; Fox, S.; Howell, C. D.; Lynds, S. E.

    2010-12-01

    The topic of climate change permeates all aspects of our society: the news, household debates, scientific conferences, etc. To provide students with accurate information about climate science and energy awareness, educators require scientifically and pedagogically robust teaching materials. To address this need, the NSF-funded Climate Literacy & Energy Awareness Network (CLEAN) Pathway has assembled a new peer-reviewed digital collection as part of the National Science Digital Library (NSDL) featuring teaching materials centered on climate and energy science for grades 6 through 16. The scope and framework of the collection are defined by the Essential Principles of Climate Science (CCSP 2009) and a set of energy awareness principles developed in the project. The collection provides trustworthy teaching materials on these socially relevant topics and prepares students to become responsible decision-makers. While a peer-review process is desirable for curriculum developers as well as collection builders to ensure quality, its implementation is non-trivial. We have designed a rigorous and transparent peer-review process for the CLEAN collection, and our experiences provide general guidelines that can be used to judge the quality of digital teaching materials across disciplines. Our multi-stage review process ensures that only resources with teaching goals relevant to developing climate literacy and energy awareness are considered. Each relevant resource is reviewed by two individuals to assess i) scientific accuracy, ii) pedagogic effectiveness, and iii) usability/technical quality. A science review by an expert ensures the scientific quality and accuracy. Resources that pass all review steps are forwarded to a review panel of educators and scientists who make a final decision regarding inclusion of the materials in the CLEAN collection. Results from the first panel review show that about 20% (~100) of the resources that were initially considered for inclusion

  12. Observation and visualization: reflections on the relationship between science, visual arts, and the evolution of the scientific image.

    Science.gov (United States)

    Kolijn, Eveline

    2013-10-01

    The connections between biological sciences, art and printed images are of great interest to the author. She reflects on the historical relevance of visual representations for science. She argues that the connection between art and science seems to have diminished during the twentieth century. However, this connection is currently growing stronger again through digital media and new imaging methods. Scientific illustrations have fuelled art, while visual modeling tools have assisted scientific research. As a print media artist, she explores the relationship between art and science in her studio practice and will present this historical connection with examples related to evolution, microbiology and her own work. Art and science share a common source, which leads to scrutiny and enquiry. Science sets out to reveal and explain our reality, whereas art comments and makes connections that don't need to be tested by rigorous protocols. Art and science should each be evaluated on their own merit. Allowing room for both in the quest to understand our world will lead to an enriched experience.

  13. Marie Curie: scientific entrepreneur

    International Nuclear Information System (INIS)

    Boudia, S.

    1998-01-01

    Marie Curie is best known for her discovery of radium one hundred years ago this month, but she also worked closely with industry in developing methods to make and monitor radioactive material, as Soraya Boudia explains. One hundred years ago this month, on 28 December 1898, Pierre Curie, Marie Sklodowska-Curie and Gustave Bemont published a paper in Comptes-rendus - the journal of the French Academy of Sciences. In the paper they announced that they had discovered a new element with astonishing properties: radium. But for one of the authors, Marie Curie, the paper was more than just the result of outstanding work: it showed that a woman could succeed in what was then very much a male-dominated scientific world. Having arrived in Paris from Poland in 1891, Marie Curie became the first woman in France to obtain a PhD in physics, the first woman to win a Nobel prize and the first woman to teach at the Sorbonne. She also helped to found a new scientific discipline: the study of radioactivity. She became an icon and a role-model for other women to follow, someone who succeeded - despite many difficulties - in imposing herself on the world of science. Although Curie's life story is a familiar and well documented one, there is one side to her that is less well known: her interaction with industry. As well as training many nuclear physicists and radiochemists in her laboratory, she also became a scientific pioneer in industrial collaboration. In this article the author describes this side of Marie Curie. (UK)

  14. Scientific Investigation with the SJCSI

    Science.gov (United States)

    Berbey, E.; Delpeyroux, G.; Douay, E.; Juchereau, C.; Garavet, O.

    2012-04-01

    Scientific Investigation with the SJCSI (Saint Jean* Crime Scene Investigation) Our work, which we have been teaching for 3 years, consists of a scientific investigation. We create a case from A to Z, and our students (15 to 16 years old) then collect samples and clues from a reconstituted crime scene and have to catch the culprit thanks to laboratory tests crossing four subjects: Physics and Chemistry, Biology, Math and English. I'm a biology teacher and I work with 3 other teachers in my school. The objectives of these activities are: • Make science more attractive by putting it into the context of a crime investigation. • Use scientific techniques to find a culprit or to clear a suspect. • Acquire scientific knowledge. • Realize that the different scientific subjects complement each other in carrying out an investigation. • Use the English language and improve it. The investigation consists of doing experiments after collecting different samples and clues at the crime scene. Examples of Biology experiments: • Detecting the origin of the blood samples found at the crime scene. Students observe blood samples with a microscope and compare their characteristics to those of human blood found on the web. They discover that the blood samples found aren't human blood because the red cells have a nucleus. Using the information given in the scenario, they discover that the blood sample belongs to a suspect's parrot. Students also take a photo of their microscopic preparations and add a title and caption, and so they learn the cell's structure and the characteristics of blood cells. • In another case, students have to study the blood sample found under the victim's fingernails. They observe the blood preparation and compare it to the blood of a suspect who has a genetic disease: drepanocytosis. In this way they discover the characteristics of blood cells by comparing them to sickle cells. • DNA electrophoresis to identify DNA found, for example, on the gun. • Blood type

  15. What do we do about women athletes with testes?

    Science.gov (United States)

    Newbould, Melanie Joy

    2016-04-01

    Elite sport and the measures imposed to prevent 'men' from 'cheating' by posing as women in women's events cast interesting light on notions of sex and gender. Some women have testes, organs that produce testosterone, because they are trans women or they have an intersex state. Testosterone is recognised as a performance-enhancing substance in at least some circumstances, and therefore, women with testes may possess an advantage when competing in some sport against women without testes, though this has never been subjected to rigorous scientific testing. The International Olympic Committee and the International Association of Athletics Federation have decreed that such individuals can compete only if they undergo medical and surgical treatment, which is likely to mean gonadectomy. This might be considered to impose an unethical demand on the individual concerned and constitute an infringement of bodily autonomy for that individual. It also suggests a binary view of sex/gender that is simplistic and not scientifically accurate. I discuss this approach and consider alternative methods of approaching the problem of women with testes in athletics. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  16. Scientific (Wo)manpower?

    DEFF Research Database (Denmark)

    Amilon, Anna; Persson, Inga

    2013-01-01

    Purpose – The purpose of this paper is to investigate to what extent male and female PhDs choose academic vs non‐academic employment. Further, it analyses gender earnings differences in the academic and non‐academic labour markets. Design/methodology/approach – Rich Swedish cross‐sectional regist...... scientific human capital. Originality/value – The study is the first to investigate career‐choice and earnings of Swedish PhDs. Further, the study is the first to investigate both the academic and the non‐academic labour markets....

  17. Scientific report 1999

    International Nuclear Information System (INIS)

    2000-01-01

    This scientific report of the Fuel Cycle Direction of the CEA presents the Direction's activities and research programs in the fuel cycle domain during the year 1999. The first chapter is devoted to the front end of the fuel cycle, with the SILVA process as its main topic. The second chapter is largely devoted to the separation chemistry of the back end of the cycle. The third and fourth chapters present more applied, and sometimes more technical, studies and developments, both inside and outside the nuclear industry. (A.L.B.)

  18. Scientific report 1997

    International Nuclear Information System (INIS)

    Gosset, J.; Gueneau, C.; Doizi, D.

    1998-01-01

    This book gathers technical and scientific papers on the main work of the Direction of the Fuel Cycle (DCC) in France. The fields of study are: the front end of the nuclear fuel cycle, with theoretical studies (plasma simulation) and technological developments and instrumentation (laser diodes, carbide plasma spraying, carbon-13 enrichment); the back end of the nuclear fuel cycle, with theoretical studies (Eu 3+ ion complexation simulation, decay simulation, uranium and plutonium diffusion studies, electrolyser operating simulation), scenario studies (recycling, waste management) and experimental studies; dismantling and cleaning (soil cleaning, surface-active agents for decontamination, fault tree analysis); and analysis with expert systems and mass spectrometry. (A.L.B.)

  19. SCIENTIFIC BASIS OF DENTISTRY

    Directory of Open Access Journals (Sweden)

    Yegane GÜVEN

    2017-10-01

    Full Text Available Technological and scientific innovations have increased exponentially over the past years in the dentistry profession. In this article, these developments are evaluated both in terms of clinical practice and of their place in the educational program. The effect of the biologic and digital revolutions on dental education and daily clinical practice is also reviewed. Biomimetics, personalized dental medicine, regenerative dentistry, nanotechnology, high-end simulations providing virtual reality, genomic information, and stem cell studies will gain more importance in the coming years, moving dentistry to a different dimension.

  20. Practical scientific computing

    CERN Document Server

    Muhammad, A

    2011-01-01

    Scientific computing is about developing mathematical models, numerical methods and computer implementations to study and solve real problems in science, engineering, business and even social sciences. Mathematical modelling requires deep understanding of classical numerical methods. This essential guide provides the reader with sufficient foundations in these areas to venture into more advanced texts. The first section of the book presents numEclipse, an open source tool for numerical computing based on the notion of MATLAB®. numEclipse is implemented as a plug-in for Eclipse, a leading integrated development environment.

  1. Scientific activities 1979

    International Nuclear Information System (INIS)

    1981-01-01

    The scientific activities and achievements of the Nuclear Research Center Democritus for the year 1979 are presented in the form of a list of 78 projects giving the title, objectives, commencement year, person responsible for each project, the activities carried out and the pertaining lists of publications. The 15 chapters of this work cover the activities of the main Divisions of the Democritus NRC: Electronics, Biology, Physics, Chemistry, Health Physics, Reactor, Radioisotopes, Environmental Radioactivity, Soil Science, Computer Center, Uranium Exploration, Medical Service, Technological Applications and Training. (T.A.)

  2. Energy and scientific communication

    Science.gov (United States)

    De Sanctis, E.

    2013-06-01

    Energy communication is a paradigmatic case of scientific communication. It is particularly important today, when the world is confronted with a number of immediate, urgent problems. Science communication has become a real duty and a big challenge for scientists. It serves to create and foster a climate of reciprocal knowledge and trust between science and society, and to establish a good level of interest and enthusiasm for research. For an effective communication it is important to establish an open dialogue with the audience, and a close collaboration among scientists and science communicators. An international collaboration in energy communication is appropriate to better support international and interdisciplinary research and projects.

  3. Scientific visualization and radiology

    International Nuclear Information System (INIS)

    Lawrance, D.P.; Hoyer, C.E.; Wrestler, F.A.; Kuhn, M.J.; Moore, W.D.; Anderson, D.R.

    1989-01-01

    Scientific visualization is the visual presentation of numerical data. The National Center for Supercomputing Applications (NCSA) has developed methods for visualizing computer-based simulations of digital imaging data. The applicability of these various tools for unique and potentially medically beneficial display of MR images is investigated. Raw data are obtained from MR images of the brain, neck, spine, and brachial plexus obtained on a 1.5-T imager with multiple pulse sequences. A supercomputer and other mainframe resources run a variety of graphics and imaging programs using these data. An interdisciplinary team of imaging scientists, computer graphics programmers, and physicians works together to obtain useful information

  4. Interoperable Data Sharing for Diverse Scientific Disciplines

    Science.gov (United States)

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO level archive and metadata registry reference models. This framework provides multi-level governance, evolves independently of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.

  5. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    ) weighted least-squares regression. 3) Initialization of the estimation by use of linear algebra, providing a first guess. 4) Sequential parameter estimation and simultaneous GC parameter estimation using four different minimization algorithms. 5) Thorough uncertainty analysis: a) based on an asymptotic approximation of the parameter … covariance matrix, b) based on the bootstrap method, providing 95%-confidence intervals of the parameters and the predicted property. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict the lower flammability limit (LFL) of refrigerants … their credibility and robustness in wider industrial and scientific applications.
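
    As a generic illustration of the bootstrap-based uncertainty analysis mentioned in step 5b (not the authors' group-contribution model or data), the sketch below fits a plain linear least-squares model to synthetic data, resamples the residuals, and reports 95%-confidence intervals for the fitted parameters.

```python
# Minimal residual-bootstrap sketch for parameter confidence intervals.
# Synthetic data and a plain linear least-squares fit stand in for the
# group-contribution model; only the bootstrap procedure is the point here.
import numpy as np

rng = np.random.default_rng(1)
n = 60
x = np.linspace(0.0, 5.0, n)
A = np.column_stack([np.ones(n), x])           # design matrix: intercept + slope
y = 2.0 + 0.7 * x + rng.normal(0.0, 0.3, n)    # synthetic "measurements"

theta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ theta_hat

boot = []
for _ in range(2000):
    y_star = A @ theta_hat + rng.choice(residuals, size=n, replace=True)
    theta_star, *_ = np.linalg.lstsq(A, y_star, rcond=None)
    boot.append(theta_star)
boot = np.array(boot)

lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
for name, est, l, h in zip(["intercept", "slope"], theta_hat, lo, hi):
    print(f"{name}: {est:.3f}  95% CI [{l:.3f}, {h:.3f}]")
```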

  6. The Scientific Case against Astrology.

    Science.gov (United States)

    Kelly, Ivan

    1980-01-01

    Discussed is the lack of a scientific foundation and scientific evidence favoring astrology. Included are several research studies conducted to examine astrological tenets which yield generally negative results. (Author/DS)

  7. Expectations for a scientific collaboratory

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    2003-01-01

    In the past decade, a number of scientific collaboratories have emerged, yet adoption of scientific collaboratories remains limited. Meeting expectations is one factor that influences adoption of innovations, including scientific collaboratories. This paper investigates expectations scientists have with respect to scientific collaboratories. Interviews were conducted with 17 scientists who work in a variety of settings and have a range of experience conducting and managing scientific research. Results indicate that scientists expect a collaboratory to: support their strategic plans; facilitate management of the scientific process; have a positive or neutral impact on scientific outcomes; provide advantages and disadvantages for scientific task execution; and provide personal conveniences when collaborating across distances. These results both confirm existing knowledge and raise new issues for the design...

  8. Scientific inference learning from data

    CERN Document Server

    Vaughan, Simon

    2013-01-01

    Providing the knowledge and practical experience to begin analysing scientific data, this book is ideal for physical sciences students wishing to improve their data handling skills. The book focuses on explaining and developing the practice and understanding of basic statistical analysis, concentrating on a few core ideas, such as the visual display of information, modelling using the likelihood function, and simulating random data. Key concepts are developed through a combination of graphical explanations, worked examples, example computer code and case studies using real data. Students will develop an understanding of the ideas behind statistical methods and gain experience in applying them in practice. Further resources are available at www.cambridge.org/9781107607590, including data files for the case studies so students can practise analysing data, and exercises to test students' understanding.

  9. Metadata in Scientific Dialects

    Science.gov (United States)

    Habermann, T.

    2011-12-01

    Discussions of standards in the scientific community have been compared to religious wars for many years. The only things scientists agree on in these battles are either "standards are not useful" or "everyone can benefit from using my standard". Instead of achieving the goal of facilitating interoperable communities, in many cases the standards have served to build yet another barrier between communities. Some important progress towards diminishing these obstacles has been made in the data layer with the merger of the NetCDF and HDF scientific data formats. The universal adoption of XML as the standard for representing metadata and the recent adoption of ISO metadata standards by many groups around the world suggests that similar convergence is underway in the metadata layer. At the same time, scientists and tools will likely need support for native tongues for some time. I will describe an approach that combines re-usable metadata "components" and restful web services that provide those components in many dialects. This approach uses advanced XML concepts of referencing and linking to construct complete records that include reusable components and builds on the ISO Standards as the "unabridged dictionary" that encompasses the content of many other dialects.

  10. Budapest scientific a guidebook

    CERN Document Server

    Hargittai, István

    2015-01-01

    This guidebook introduces the reader—the scientific tourist and others—to the visible memorabilia of science and scientists in Budapest—statues, busts, plaques, buildings, and other artefacts. According to the Hungarian–American Nobel laureate Albert Szent-Györgyi, this metropolis at the crossroads of Europe has a special atmosphere of respect for science. It has been the venue of numerous scientific achievements and the cradle, literally, of many individuals who in Hungary, and even more beyond its borders became world-renowned contributors to science and culture. Six of the eight chapters of the book cover the Hungarian Nobel laureates, the Hungarian Academy of Sciences, the university, the medical school, agricultural sciences, and technology and engineering. One chapter is about selected gimnáziums from which seven Nobel laureates (Szent-Györgyi, de Hevesy, Wigner, Gabor, Harsanyi, Olah, and Kertész) and the five “Martians of Science” (von Kármán, Szilard, Wigner, von Neumann, and Teller...

  11. Compendium of Scientific Linacs

    Energy Technology Data Exchange (ETDEWEB)

    Clendenin, James E

    2003-05-16

    The International Committee supported the proposal of the Chairman of the XVIII International Linac Conference to issue a new Compendium of linear accelerators. The last one was published in 1976. The Local Organizing Committee of Linac96 decided to set up a sub-committee for this purpose. Contrary to the catalogues of the High Energy Accelerators which compile accelerators with energies above 1 GeV, we have not defined a specific limit in energy. Microtrons and cyclotrons are not in this compendium. Also data from thousands of medical and industrial linacs has not been collected. Therefore, only scientific linacs are listed in the present compendium. Each linac found in this research and involved in a physics context was considered. It could be used, for example, either as an injector for high energy accelerators, or in nuclear physics, materials physics, free electron lasers or synchrotron light machines. Linear accelerators are developed in three continents only: America, Asia, and Europe. This geographical distribution is kept as a basis. The compendium contains the parameters and status of scientific linacs. Most of these linacs are operational. However, many facilities under construction or design studies are also included. A special mention has been made at the end for the studies of future linear colliders.

  12. Verified scientific findings

    International Nuclear Information System (INIS)

    Bullinger, M.G.

    1982-01-01

    In this essay, the author attempts to enlighten the reader as to the meaning of the term ''verified scientific findings'' in section 13, sub-section 1, sentence 2 of the new Chemicals Control Law. The examples given are the generally accepted rules of technology (that is, sections 7a and 18b of the WHG (law on water economy), section 3, sub-section 1 of the machine and engine protection laws), the status of technology (section 3, sub-section 6 of the BImSchG (Federal law on prevention of air-borne pollution)), and the status of science (section 5, sub-section 2 of the AMG (drug legislation)). The ''status of science and technology'' as defined in sections 4 ff. of the Atomic Energy Law (AtomG) and in sections 3, 4, 12(2) of the First Radiation Protection Ordinance (1. StrlSchVO) is also discussed. The author defines this, in his opinion ''dynamic'', term as the generally recognized results of scientific research and the corresponding possibilities of their practical utilization in technology. (orig.) [de]

  13. Drilling for scientific purpose

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Shoichi

    1987-09-01

    Drilling for scientific purposes is a process of conducting geophysical exploration deep underground and drilling to collect crust samples directly. This is because earth science has advanced to a good understanding of the top of the crust and has shifted its main interest to the lower layers of the crust in land regions. The on-land drilling plan in Japan has just started, and the planned drilling spots are areas around the Minami River, the Hidaka Mts., Mesozoic and Cenozoic granites in the Outer Zone, the extension of the Japan Sea, Ogasawara Is., Minami-Tori Is., and active volcanoes. The paper also outlines the present situation of on-land drilling in the world, focusing on the super-deep well SG-3 on the Kola Peninsula, USSR, the Satori super-deep well SG-1 in the Azerbaidzhan S.S.R., USSR, Sweden's wells, Cyprus' wells, the Bayern (Bavaria) deep-well plan in West Germany, and the Salton Sea Scientific Drilling Program in the U.S. At its end, the paper explains the present situation and future themes of Japanese drilling techniques and points out the necessity of developing equipment and techniques. (14 figs, 5 tabs, 26 refs)

  14. The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation.

    Science.gov (United States)

    Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A

    2015-01-08

    science. Despite numerous constructs having greater than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.

  15. Rigorous study of the gap equation for an inhomogeneous superconducting state near T/sub c/

    International Nuclear Information System (INIS)

    Hu, C.

    1975-01-01

    A rigorous analytic study of the self-consistent gap equation (symbolically Δ = F_T[Δ]) for an inhomogeneous superconducting state is presented in the Bogoliubov formulation. The gap function Δ(r) is taken to simulate a planar normal-superconducting phase boundary: Δ(r) = Δ_∞ tanh(αΔ_∞ z/v_F) Θ(z), where Δ_∞(T) is the equilibrium gap, v_F is the Fermi velocity, and Θ(z) is a unit step function. First a special space integral of the gap equation, proportional to ∫_{0+}^{∞} (F_T − Δ)(dΔ/dz) dz, is evaluated essentially exactly, except for a nonperturbative WKBJ approximation used in solving the Bogoliubov-de Gennes equations. It is then expanded near the transition temperature T_c in powers of Δ_∞ ∝ (1 − T/T_c)^(1/2), demonstrating an exact cancellation of a subseries of ''anomalous-order'' terms. The leading surviving term is found to agree in order, but not in magnitude, with the Ginzburg-Landau-Gor'kov (GLG) approximation. The discrepancy is found to be linked to the slope discontinuity in our chosen Δ. A contour-integral technique in a complex-energy plane is then devised to evaluate the local value of F_T − Δ exactly. Our result reveals that near T_c this method can reproduce the GLG result essentially everywhere, except within about a BCS coherence length ξ(T) of a singularity in Δ, where F_T − Δ can have a singular contribution with an ''anomalous'' local magnitude, not expected from the GLG approach. This anomalous term precisely accounts for the discrepancy found in the special integral of the gap equation as mentioned above, and likely explains the ultimate origin of the anomalous terms found in the free energy of an isolated vortex line by Cleary
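
    The key relations quoted in this abstract can be restated compactly in LaTeX for readability (this is only a transcription of the expressions above, with the same notation; nothing beyond the abstract is assumed):

```latex
% Self-consistent gap equation (schematic form)
\Delta = F_T[\Delta]

% Trial gap profile simulating a planar normal-superconducting boundary
\Delta(\mathbf{r}) = \Delta_\infty \tanh\!\left(\frac{\alpha\,\Delta_\infty\, z}{v_F}\right)\Theta(z)

% Special space integral of the gap equation and the small expansion parameter near T_c
\int_{0^+}^{\infty}\bigl(F_T - \Delta\bigr)\,\frac{d\Delta}{dz}\,dz,
\qquad \Delta_\infty \propto \bigl(1 - T/T_c\bigr)^{1/2}
```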

  16. Ar-Ar_Redux: rigorous error propagation of 40Ar/39Ar data, including covariances

    Science.gov (United States)

    Vermeesch, P.

    2015-12-01

    Rigorous data reduction and error propagation algorithms are needed to realise Earthtime's objective to improve the interlaboratory accuracy of 40Ar/39Ar dating to better than 1% and thereby facilitate the comparison and combination of the K-Ar and U-Pb chronometers. Ar-Ar_Redux is a new data reduction protocol and software program for 40Ar/39Ar geochronology which takes into account two previously underappreciated aspects of the method. 1. 40Ar/39Ar measurements are compositional data. In its simplest form, the 40Ar/39Ar age equation can be written as: t = log(1 + J[40Ar/39Ar − 298.56 × 36Ar/39Ar])/λ = log(1 + JR)/λ, where λ is the 40K decay constant and J is the irradiation parameter. The age t does not depend on the absolute abundances of the three argon isotopes but only on their relative ratios. Thus, the 36Ar, 39Ar and 40Ar abundances can be normalised to unity and plotted on a ternary diagram or 'simplex'. Argon isotopic data are therefore subject to the peculiar mathematics of 'compositional data', sensu Aitchison (1986, The Statistical Analysis of Compositional Data, Chapman & Hall). 2. Correlated errors are pervasive throughout the 40Ar/39Ar method. Current data reduction protocols for 40Ar/39Ar geochronology propagate the age uncertainty as follows: σ²(t) = [J²σ²(R) + R²σ²(J)] / [λ²(1 + RJ)²], which implies zero covariance between R and J. In reality, however, significant error correlations are found in every step of the 40Ar/39Ar data acquisition and processing, in both single and multi collector instruments, during blank, interference and decay corrections, age calculation etc. Ar-Ar_Redux revisits every aspect of the 40Ar/39Ar method by casting the raw mass spectrometer data into a contingency table of logratios, which automatically keeps track of all covariances in a compositional context. Application of the method to real data reveals strong correlations (r² of up to 0.9) between age measurements within a single irradiation batch. Properly taking
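
    A minimal numeric sketch of the point made in item 2, assuming illustrative values (this is not Ar-Ar_Redux code; the variable names, numbers and the 0.6 correlation are invented for the example): first-order propagation of t = log(1 + JR)/λ with and without an R-J covariance term.

```python
import numpy as np

# Illustrative sketch (not Ar-Ar_Redux): first-order error propagation of the
# age equation t = ln(1 + J*R)/lam, with and without the R-J covariance term.
# All numerical values are invented for demonstration purposes.
lam = 5.543e-10          # 40K total decay constant (1/yr), a commonly used value
J, sJ = 0.01, 0.00005    # irradiation parameter and its 1-sigma uncertainty
R, sR = 50.0, 0.25       # 40Ar*/39Ar ratio and its 1-sigma uncertainty
cov_RJ = 0.6 * sR * sJ   # assumed correlation of 0.6 between R and J

t = np.log(1.0 + J * R) / lam               # age in years

# Partial derivatives of t with respect to R and J
dt_dR = J / (lam * (1.0 + J * R))
dt_dJ = R / (lam * (1.0 + J * R))

# Zero-covariance propagation (the formula quoted in the abstract)
var_nocov = dt_dR**2 * sR**2 + dt_dJ**2 * sJ**2
# Propagation including the covariance term
var_cov = var_nocov + 2.0 * dt_dR * dt_dJ * cov_RJ

print(f"age = {t/1e6:.2f} Ma")
print(f"1-sigma, no covariance:   {np.sqrt(var_nocov)/1e6:.3f} Ma")
print(f"1-sigma, with covariance: {np.sqrt(var_cov)/1e6:.3f} Ma")
```

    With a positive R-J correlation the propagated uncertainty grows relative to the zero-covariance formula; Ar-Ar_Redux addresses this by carrying the full covariance structure through its log-ratio bookkeeping rather than assuming independence.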

  17. Peridynamics as a rigorous coarse-graining of atomistics for multiscale materials design

    International Nuclear Information System (INIS)

    Lehoucq, Richard B.; Aidun, John Bahram; Silling, Stewart Andrew; Sears, Mark P.; Kamm, James R.; Parks, Michael L.

    2010-01-01

    This report summarizes activities undertaken during FY08-FY10 for the LDRD Peridynamics as a Rigorous Coarse-Graining of Atomistics for Multiscale Materials Design. The goal of our project was to develop a coarse-graining of finite temperature molecular dynamics (MD) that successfully transitions from statistical mechanics to continuum mechanics. Our coarse-graining overcomes the intrinsic limitation of coupling atomistics with classical continuum mechanics via the FEM (finite element method), SPH (smoothed particle hydrodynamics), or MPM (material point method); namely, that classical continuum mechanics assumes a local force interaction that is incompatible with the nonlocal force model of atomistic methods. Therefore FEM, SPH, and MPM inherit this limitation. This seemingly innocuous dichotomy has far-reaching consequences; for example, classical continuum mechanics cannot resolve the short-wavelength behavior associated with atomistics. Other consequences include spurious forces, invalid phonon dispersion relationships, and irreconcilable descriptions/treatments of temperature. We propose a statistically based coarse-graining of atomistics via peridynamics and so develop a first-of-a-kind mesoscopic capability to enable consistent, thermodynamically sound, atomistic-to-continuum (AtC) multiscale material simulation. Peridynamics (PD) is a microcontinuum theory that assumes nonlocal forces for describing long-range material interaction. The force interactions occurring at finite distances are naturally accounted for in PD. Moreover, PD's nonlocal force model is entirely consistent with those used by atomistic methods, in stark contrast to classical continuum mechanics. Hence, PD can be employed for mesoscopic phenomena that are beyond the realms of classical continuum mechanics and
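
    To make the nonlocal force model concrete, the sketch below evaluates the internal force density of a bond-based peridynamic bar in one dimension, where every node interacts with all nodes within a finite horizon rather than only with its immediate neighbours. This is a generic textbook-style illustration; the horizon, micromodulus, discretization and displacement field are assumptions and are not taken from the report.

```python
import numpy as np

# Minimal 1D bond-based peridynamics sketch (illustrative only).
# The internal force density at a node is a sum over all bonds within the
# horizon delta, i.e. a nonlocal interaction, in contrast to the local stress
# divergence of classical continuum mechanics.
n, dx = 101, 0.01                 # number of nodes and grid spacing (m)
delta = 3.0 * dx                  # peridynamic horizon (m)
c = 1.0e9                         # bond micromodulus (illustrative units)

X = np.arange(n) * dx             # reference positions
u = 1.0e-4 * X**2                 # an assumed smooth displacement field

force = np.zeros(n)               # internal force density at each node
for i in range(n):
    for j in range(n):
        xi = X[j] - X[i]                                 # reference bond
        if i == j or abs(xi) > delta:
            continue                                     # only bonds inside the horizon
        eta = u[j] - u[i]                                # relative displacement
        stretch = (abs(xi + eta) - abs(xi)) / abs(xi)    # bond stretch
        direction = np.sign(xi + eta)                    # deformed bond direction (1D)
        force[i] += c * stretch * direction * dx         # bond force times nodal volume

print("max |internal force density| =", np.abs(force).max())
```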

  18. The Scientific Competitiveness of Nations.

    Science.gov (United States)

    Cimini, Giulio; Gabrielli, Andrea; Sylos Labini, Francesco

    2014-01-01

    We use citation data of scientific articles produced by individual nations in different scientific domains to determine the structure and efficiency of national research systems. We characterize the scientific fitness of each nation, that is, the competitiveness of its research system, and the complexity of each scientific domain by means of a non-linear iterative algorithm able to assess quantitatively the advantage of scientific diversification. We find that technologically leading nations, beyond having the largest production of scientific papers and the largest number of citations, do not specialize in a few scientific domains. Rather, they diversify their research system as much as possible. On the other hand, less developed nations are competitive only in scientific domains where many other nations are also present. Diversification thus represents the key element that correlates with scientific and technological competitiveness. A remarkable implication of this structure of the scientific competition is that the scientific domains playing the role of "markers" of national scientific competitiveness are not necessarily those with high technological requirements, but rather those addressing the most "sophisticated" needs of the society.
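
    The non-linear iterative algorithm referred to above is of the fitness/complexity type: a nation's fitness is reinforced by being competitive in complex domains, while a domain's complexity is penalised when low-fitness nations are also competitive in it. The sketch below shows one common formulation of such an iteration; the binary nation-domain matrix, the normalisation and the iteration count are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

# Sketch of a fitness/complexity-style iteration over a nations x domains
# matrix. M[c, p] = 1 if nation c is deemed competitive in domain p.
# The random matrix and iteration count are purely illustrative.
rng = np.random.default_rng(0)
M = (rng.random((8, 12)) > 0.5).astype(float)
M[np.arange(M.shape[0]), np.arange(M.shape[0]) % M.shape[1]] = 1.0  # ensure no empty rows
M[np.arange(M.shape[1]) % M.shape[0], np.arange(M.shape[1])] = 1.0  # ensure no empty columns

F = np.ones(M.shape[0])   # national scientific fitness
Q = np.ones(M.shape[1])   # domain complexity

for _ in range(200):
    # Fitness: sum of the complexities of the domains a nation is competitive in
    F_new = M @ Q
    # Complexity: suppressed when low-fitness nations are present in the domain
    Q_new = 1.0 / (M.T @ (1.0 / F))
    # Normalise to the mean at every step to keep the iteration bounded
    F = F_new / F_new.mean()
    Q = Q_new / Q_new.mean()

print("nations ranked by fitness:   ", np.argsort(-F))
print("domains ranked by complexity:", np.argsort(-Q))
```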

  19. Should scientific realists be platonists?

    DEFF Research Database (Denmark)

    Busch, Jacob; Morrison, Joe

    2015-01-01

    an appropriate use of the resources of Scientific Realism (in particular, IBE) to achieve platonism? (§2) We argue that just because a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific...

  20. EFSA Scientific Committee; Scientific Opinion on Risk Assessment Terminology

    DEFF Research Database (Denmark)

    Hald, Tine

    of improving the expression and communication of risk and/or uncertainties in the selected opinions. The Scientific Committee concluded that risk assessment terminology is not fully harmonised within EFSA. In part this is caused by sectoral legislation defining specific terminology and international standards......The Scientific Committee of the European Food Safety Authority (EFSA) reviewed the use of risk assessment terminology within its Scientific Panels. An external report, commissioned by EFSA, analysed 219 opinions published by the Scientific Committee and Panels to recommend possible ways......, the Scientific Committee concludes that particular care must be taken that the principles of CAC, OIE or IPPC are followed strictly. EFSA Scientific Panels should identify which specific approach is most useful in dealing with their individual mandates. The Scientific Committee considered detailed aspects...

  1. The Associative Basis of Scientific Creativity: A Model Proposal

    Directory of Open Access Journals (Sweden)

    Esra Kanli

    2014-06-01

    Full Text Available Creativity is accepted as an important part of scientific skills. Scientific creativity proceeds from a need or urge to solve a problem, and in-volves the production of original and useful ideas or products. Existing scientific creativity theories and tests do not feature the very im-portant thinking processes, such as analogical and associative thinking, which can be consid-ered crucial in creative scientific problem solv-ing. Current study’s aim is to provide an alter-native model and explicate the associative basis of scientific creativity. Emerging from the re-viewed theoretical framework, Scientific Asso-ciations Model is proposed. This model claims that, similarity and mediation constitutes the basis of creativity and focuses on three compo-nents namely; associative thinking, analogical thinking (analogical reasoning & analogical problem solving and insight which are consid-ered to be main elements of scientific associa-tive thinking.

  2. Undergraduate honors students' images of science: Nature of scientific work and scientific knowledge

    Science.gov (United States)

    Wallace, Michael L.

    This exploratory study assessed the influence of an implicit, inquiry-oriented nature of science (NOS) instructional approach undertaken in an interdisciplinary college science course on undergraduate honor students' (UHS) understanding of the aspects of NOS for scientific work and scientific knowledge. In this study, the nature of scientific work concentrated upon the delineation of science from pseudoscience and the value scientists place on reproducibility. The nature of scientific knowledge concentrated upon how UHS view scientific theories and how they believe scientists utilize scientific theories in their research. The 39 UHS who participated in the study were non-science majors enrolled in a Honors College sponsored interdisciplinary science course where the instructors took an implicit NOS instructional approach. An open-ended assessment instrument, the UFO Scenario, was designed for the course and used to assess UHS' images of science at the beginning and end of the semester. The mixed-design study employed both qualitative and quantitative techniques to analyze the open-ended responses. The qualitative techniques of open and axial coding were utilized to find recurring themes within UHS' responses. McNemar's chi-square test for two dependent samples was used to identify whether any statistically significant changes occurred within responses from the beginning to the end of the semester. At the start of the study, the majority of UHS held mixed NOS views, but were able to accurately define what a scientific theory is and explicate how scientists utilize theories within scientific research. Postinstruction assessment indicated that UHS did not make significant gains in their understanding of the nature of scientific work or scientific knowledge and their overall images of science remained static. The results of the present study found implicit NOS instruction even with an extensive inquiry-oriented component was an ineffective approach for modifying UHS
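
    The statistical step described above, McNemar's chi-square test for two dependent samples, can be sketched in a few lines; the pre/post counts below are invented for illustration, and the continuity-corrected statistic is one common variant of the test.

```python
from scipy.stats import chi2

# Illustrative sketch of McNemar's test for two dependent (paired) samples,
# e.g. each student's response classified as "informed" or "naive" before and
# after instruction. The counts are invented; only discordant pairs matter.
b = 7    # informed pre-instruction, naive post-instruction
c = 12   # naive pre-instruction, informed post-instruction

# McNemar chi-square with continuity correction, 1 degree of freedom
stat = (abs(b - c) - 1) ** 2 / (b + c)
p_value = chi2.sf(stat, df=1)

print(f"McNemar chi-square = {stat:.3f}, p = {p_value:.3f}")
```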

  3. Scientific review of psychophysiological detection of deceit

    Directory of Open Access Journals (Sweden)

    Igor Areh

    2016-10-01

    Full Text Available Psychophysiological detection of deceit has been at the centre of attention in the recent decade, which correlates with the heightened security challenges of the modern world. The article provides a scientific discussion of the polygraph as used in criminal investigation. The two most widely employed polygraph techniques are critically presented, examined and compared: the Comparison Question Test (CQT) and the Concealed Information Test (CIT). Theoretical foundations, objectivity and standardization of testing procedures, and ethical and practical issues are analysed. Proponents of the Comparison Question Test have not been successful in their efforts to resolve the fundamental problems and limitations with which the technique is confronted. It remains unstandardized and unscientific, separated from science and largely without attempts to escape from this dead end. The most influential theoretical backgrounds of the CQT technique are examined; however, none of them represents a satisfactory scientific foundation for the technique. Without being grounded in a verifiable theory, it remains controversial and self-referential, mostly supported by methodologically questionable research findings produced by its proponents. In contrast, the Concealed Information Test is associated with fast development, particularly in the field of neurology, and is considered to be less disputed and to be partly supported by sound scientific grounds. In the Concealed Information Test, the somewhat naïve and disputable detection of lies typical of the Comparison Question Test is replaced by a search for information that lies concealed in the suspect's memory. However, the Concealed Information Test has also been challenged by serious deficiencies, which raise the question of whether the use of the polygraph is justified.

  4. Test and survey on a next generation coal liquefying catalyst. Coal molecule scientific test and survey as the base for commercializing the coal liquefying technology; Jisedai sekitan ekika shokubai shiken chosa. Sekitan ekika gijutsu shogyoka kiban to shite no sekitan bunshi kagaku shiken chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    The test and survey on a next-generation coal liquefying catalyst present a new proposal for raising catalytic activity in coal liquefaction, and laboratory-scale demonstration experiments are performed to explore, from various viewpoints, the possibility of developing a new coal liquefying catalyst. Specifically, the following topics were discussed: liquefaction under extremely mild conditions using ultra-strong acids, not limited to metals; the ion exchange and swelling carrying methods to achieve very high catalyst dispersion, enhance catalytic activity, and reduce the amount of catalyst used; the mechanism by which catalytically active species are produced, with a view to further enhancing the activity of iron catalysts; and the pursuit of morphological change in the active species. The coal molecule scientific test and survey, as the base for commercializing coal liquefaction technology, studied the following items: pretreatment of coal that can reduce the cost of coal liquefaction; the configuration of the liquefaction reaction, liquefying catalysts, the hydrocarbon gas generation mechanism, the state of catalysts after the liquefaction reaction, and the reduction of gas purification cost by using gas separation membranes. Future possibilities were further explored through frank and constructive exchanges of opinion among the committee members. (NEDO)

  5. Final Scientific EFNUDAT Workshop

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu Topics of interest include: Data evaluation, Cross section measurements, Experimental techniques, Uncertainties and covariances, Fission properties, Current and future facilities. International Advisory Committee: C. Barreau (CENBG, France), T. Belgya (IKI KFKI, Hungary), E. Gonzalez (CIEMAT, Spain), F. Gunsing (CEA, France), F.-J. Hambsch (IRMM, Belgium), A. Junghans (FZD, Germany), R. Nolte (PTB, Germany), S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman), Marco Calviani, Samuel Andriamonje, Eric Berthoumieux, Carlos Guerrero, Roberto Losito, Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean

  6. Scientific developments ISFD3

    Science.gov (United States)

    Schropp, M.H.I.; Soong, T.W.

    2006-01-01

    Highlights, trends, and consensus from the 63 papers submitted to the Scientific Developments theme of the Third International Symposium on Flood Defence (ISFD) are presented. Realizing that absolute protection against flooding can never be guaranteed, trends in flood management have shifted: (1) from flood protection to flood-risk management, (2) from reinforcing structural protection to lowering flood levels, and (3) to sustainable management through integrated problem solving. Improved understanding of watershed responses, climate changes, applications of GIS and remote-sensing technologies, and advanced analytical tools appeared to be the driving forces for renewing flood-risk management strategies. Technical competence in integrating analytical tools to form basin-wide management systems is demonstrated by several large, transnational models. However, analyses from the social-economic-environmental point of view are generally found to lag behind. © 2006 Taylor & Francis Group.

  7. Dishonesty in scientific research.

    Science.gov (United States)

    Mazar, Nina; Ariely, Dan

    2015-11-02

    Fraudulent business practices, such as those leading to the Enron scandal and the conviction of Bernard Madoff, evoke a strong sense of public outrage. But fraudulent or dishonest actions are not exclusive to the realm of big corporations or to evil individuals without consciences. Dishonest actions are all too prevalent in everyone's daily lives, because people are constantly encountering situations in which they can gain advantages by cutting corners. Whether it's adding a few dollars in value to the stolen items reported on an insurance claim form or dropping outlier data points from a figure to make a paper sound more interesting, dishonesty is part of the human condition. Here, we explore how people rationalize dishonesty, the implications for scientific research, and what can be done to foster a culture of research integrity.

  8. Dishonesty in scientific research

    Science.gov (United States)

    Mazar, Nina; Ariely, Dan

    2015-01-01

    Fraudulent business practices, such as those leading to the Enron scandal and the conviction of Bernard Madoff, evoke a strong sense of public outrage. But fraudulent or dishonest actions are not exclusive to the realm of big corporations or to evil individuals without consciences. Dishonest actions are all too prevalent in everyone’s daily lives, because people are constantly encountering situations in which they can gain advantages by cutting corners. Whether it’s adding a few dollars in value to the stolen items reported on an insurance claim form or dropping outlier data points from a figure to make a paper sound more interesting, dishonesty is part of the human condition. Here, we explore how people rationalize dishonesty, the implications for scientific research, and what can be done to foster a culture of research integrity. PMID:26524587

  9. Annual scientific report 1978

    International Nuclear Information System (INIS)

    Proost, J.; Billiau, R.; Kirk, F.

    1979-01-01

    This report of the Centre d'Etude de l'Energie Nucleaire - Studiecentrum voor Kernenergie gives a survey of the scientific and technical work done in 1978. The research areas are: 1. The sodium cooled fast reactor and namely the mixed oxide fuels, the carbide fuel, the materials development, the reprocessing, the fast reactor physics, the safety and instrumentation and the sodium technology. 2. The gas cooled reactors as gas cooled fast and high temperature reactors. 3. The light water reactors namely the BR3 reactor, the light water reactor fuels and the plutonium recycling. 4. The applied nuclear research, waste conditioning and disposal as the safeguards, the fusion research and the lithium technology. 5. The basic and exploratory research namely the materials science and the nuclear physics and finally 6. Non-nuclear research and development such as the air pollution, the pollution abatement and waste handling, the fuel cells and applied electrochemistry. (AF)

  10. Ben Franklin's Scientific Amusements

    Science.gov (United States)

    Herschbach, Dudley

    2003-04-01

    As an American icon, Benjamin Franklin is often portrayed as wise and canny in business and politics, earnestly pursuing and extolling diligence, sensible conduct, pragmatism, and good works. Also legendary are some of his inventions, particularly the lightning rod, bifocals, and an efficient wood-burning stove. The iconic image is misleading in major respects. Today, surprisingly few people appreciate that, in the 18th century, Franklin was greatly esteemed throughout Europe as a scientist (termed then a "natural philosopher.") He was hailed as the "Newton of Electricity." Indeed, until Franklin, electricity seemed more mysterious than had gravity in Newton's time, and lightning was considered the wrath of God. By his own account, Franklin's studies of electricity and many other phenomena were prompted not by practical aims, but by his playful curiosity--which often became obsessive. Also not generally appreciated is the importance of Franklin's scientific reputation in enhancing his efforts to obtain French support for the American Revolution.

  11. Ethics in Scientific Publishing

    Science.gov (United States)

    Sage, Leslie J.

    2012-08-01

    We all learn in elementary school not to turn in other people's writing as if it were our own (plagiarism), and in high school science labs not to fake our data. But there are many other practices in scientific publishing that are depressingly common and almost as unethical. At about the 20 percent level, authors are deliberately hiding recent work -- by themselves as well as by others -- so as to enhance the apparent novelty of their most recent paper. Some people lie about the dates the data were obtained, to cover up conflicts of interest or inappropriate use of privileged information. Others will publish the same conference proceeding in multiple volumes, or publish the same result in multiple journals with only trivial additions of data or analysis (self-plagiarism). These shady practices should be roundly condemned and stopped. I will discuss these and other unethical actions I have seen over the years, and steps editors are taking to stop them.

  12. Annual scientific report 1977

    International Nuclear Information System (INIS)

    Proost, J.; Billiau, R.; Kirk, F.

    1978-01-01

    This report of the Centre d'Etude de l'Energie Nucleaire - Studiecentrum voor Kernenergie gives a survey of the scientific and technical work done in 1977. The research areas are: 1. The sodium cooled fast reactors and namely the mixed oxide fuels, the carbide fuel, the materials development, the reprocessing, the fast reactor physics, the safety and instrumentation and the sodium technology. 2. The gas cooled reactors as gas cooled fast and high temperature reactors. 3. The light water reactors namely the BR3 reactor, the light water reactor fuels and the plutonium recycling. 4. The applied nuclear research, waste conditioning and disposal as the safeguards, the fusion research and the lithium technology. 5. The basic and exploratory research namely the materials science and the nuclear physics and finally 6. Non-nuclear research and development such as the air pollution, the pollution abatement and waste handling, the fuel cells and applied electrochemistry. (AF)

  13. Annual scientific report 1976

    International Nuclear Information System (INIS)

    Billiau, R.; Kirk, F.; Proost, J.

    1977-01-01

    This report of the Centre d'Etude de l'Energie Nucleaire - Studiecentrum voor Kernenergie gives a survey of the scientific and technical work done in 1976. The research areas are: 1. The sodium cooled fast reactors and namely the mixed oxide fuels, the carbide fuel, the materials development, the reprocessing, the fast reactor physics, the safety and instrumentation and the sodium technology. 2. The gas cooled reactors as gas cooled fast and high temperature reactors. 3. The light water reactors namely the BR3 reactor, the light water reactor fuels and the plutonium recycling. 4. The applied nuclear research, waste conditioning and disposal as the safeguards, the fusion research and the lithium technology. 5. The basic and exploratory research namely the materials science and the nuclear physics and finally 6. Non-nuclear research and development such as the air pollution, the pollution abatement and waste handling, the fuel cells and applied electrochemistry

  14. Annual scientific report 1980

    International Nuclear Information System (INIS)

    Billiau, R.; Proost, J.

    This report of the Centre d'Etude de l'Energie Nucleaire - Studiecentrum voor Kernenergie - gives a survey of the scientific and technical work done in 1980. The research areas are: 1. The sodium cooled fast reactor and namely the mixed oxide fuels, the carbide fuel, the materials development, the reprocessing, the fast reactor physics, the safety and instrumentation and the sodium technology. 2. The gas cooled reactors as gas cooled fast and high temperature reactors. 3. The light water reactors, namely the BR3 reactor, the light water reactor fuels and the plutonium recycling. 4. The applied nuclear research, waste conditioning and disposal as the safeguards, the fusion research and the lithium technology. 5. The basic and exploratory research namely the materials science and the nuclear physics and finally 6. Non-nuclear research and development such as the air pollution, the pollution abatement and waste handling, the fuel cells and applied electrochemistry. (AF)

  15. Scientific journal cancellations

    CERN Multimedia

    The Library

    2001-01-01

    Earlier this year the Scientific Information Policy Board (SIPB) requested the Library and the Working Group for Acquisitions to revise the current printed journal collection in order to cancel those titles that are less required. Savings could then be used for the development of other collections and particularly the electronic resources needed to support CERN's current research activities. A list of proposed cancellations was drawn up and posted on the Library web pages: http://library.cern.ch/library_general/cancel.html The SIPB invites everyone to check whether any of the titles are of importance to their work, in which case you are invited to inform the Library before the 25th of September by sending an e-mail to: eliane.chaney@cern.ch Titles not reconsidered by the users will be cancelled by the end of the year. Thank you, The Library

  16. Apollo's scientific legacy

    International Nuclear Information System (INIS)

    Meadows, J.

    1979-01-01

    The scientific value and importance of the Apollo lunar programme is assessed in the light of data obtained both from the lunar surface itself and also from the command modules which orbited above. It is stated that much of the material they returned still awaits a detailed examination and that the cooperative teams set up to handle the lunar material have established new methods and standards of analysis, which are currently revitalising the old science of meteoritics. The new forms of organised research have also been carried over in the rapidly developing subject of planetary science. It is concluded that whatever the motives for launching the Apollo missions, planetary scientists have been in a much better position to understand the Solar System since then. (UK)

  17. The Uncertain of Scientific Process

    Directory of Open Access Journals (Sweden)

    Jovina dÁvila Bordoni

    2016-10-01

    Full Text Available The study assesses the existence of certainty in the scientific process: the process seeks the truth, yet, faced with the unknown, it gives rise to uncertainties and doubts. We used bibliographical research, systematizing the scientific literature on epistemology and knowledge related to the scientific process and the uncertainties that surround it. The scientific process, though it continuously seeks the truth, will not attain perfection, because the researcher deals with the unknown. Science constantly seeks new knowledge, progresses through criticism of mistakes and seeks the truth, but its findings remain provisional. It is concluded that all scientific knowledge is uncertain.

  18. Designing scientific applications on GPUs

    CERN Document Server

    Couturier, Raphael

    2013-01-01

    Many of today's complex scientific applications now require a vast amount of computational power. General purpose graphics processing units (GPGPUs) enable researchers in a variety of fields to benefit from the computational power of all the cores available inside graphics cards. Understand the Benefits of Using GPUs for Many Scientific Applications. Designing Scientific Applications on GPUs shows you how to use GPUs for applications in diverse scientific fields, from physics and mathematics to computer science. The book explains the methods necessary for designing or porting your scientific appl

  19. PSI Scientific report 2009

    International Nuclear Information System (INIS)

    Piwnicki, P.

    2010-04-01

    This annual report issued by the Paul Scherrer Institute (PSI) in Switzerland takes a look at work done at the institute in the year 2009. In particular, the SwissFEL X-ray Laser facility that will allow novel investigations of femtosecond molecular dynamics in chemical, biochemical and condensed-matter systems and permit coherent diffraction imaging of individual nanostructures is commented on. Potential scientific applications of the SwissFEL are noted. Further, the institute's research focus and its findings are commented on. Synchrotron light is looked at and results obtained using neutron scattering and muon spin resonance are reported on. Work done in the micro- and nano-technology, biomolecular research and radiopharmacy areas is also reported on. Work performed in the biology, general energy and environmental sciences areas is also reported on. The institute's comprehensive research facilities are reviewed and the facilities provided for users from the national and international scientific community, in particular regarding condensed matter, materials science and biology research, are noted. In addition to the user facilities at the accelerators, other PSI laboratories are also open to external users, e.g. the Hot Laboratory operated by the Nuclear Energy and Safety Department, which allows experiments to be performed on highly radioactive samples. The Technology Transfer Office at PSI is also reported on. This department assists representatives from industry in their search for opportunities and sources of innovation at the PSI. Further, an overview is presented of the people who work at the PSI, how the institute is organised and how the money it receives is distributed and used. Finally, a comprehensive list of publications completes the report.

  20. Effect of pre-rigor stretch and various constant temperatures on the rate of post-mortem pH fall, rigor mortis and some quality traits of excised porcine biceps femoris muscle strips.

    Science.gov (United States)

    Vada-Kovács, M

    1996-01-01

    Porcine biceps femoris strips of 10 cm original length were stretched by 50% and fixed within 1 hr post mortem then subjected to temperatures of 4 °, 15 ° or 36 °C until they attained their ultimate pH. Unrestrained control muscle strips, which were left to shorten freely, were similarly treated. Post-mortem metabolism (pH, R-value) and shortening were recorded; thereafter ultimate meat quality traits (pH, lightness, extraction and swelling of myofibrils) were determined. The rate of pH fall at 36 °C, as well as ATP breakdown at 36 and 4 °C, were significantly reduced by pre-rigor stretch. The relationship between R-value and pH indicated cold shortening at 4 °C. Myofibrils isolated from pre-rigor stretched muscle strips kept at 36 °C showed the most severe reduction of hydration capacity, while paleness remained below extreme values. However, pre-rigor stretched myofibrils - when stored at 4 °C - proved to be superior to shortened ones in their extractability and swelling.