WorldWideScience

Sample records for rigorous content standards

  1. Paper 3: Content and Rigor of Algebra Credit Recovery Courses

    Science.gov (United States)

    Walters, Kirk; Stachel, Suzanne

    2014-01-01

    This paper describes the content, organization, and rigor of the face-to-face (f2f) and online summer algebra courses that were delivered in summers 2011 and 2012. Examining the content of both types of courses is important because research suggests that algebra courses with certain features may be better than others in promoting success for struggling students.…

  2. The National Teaching Standard: Route to Rigor Mortis.

    Science.gov (United States)

    McNeil, John D.

    The actions of the National Board for Professional Teaching Standards with regard to national teaching standards and an associated examination are critiqued. The board was established on the basis of a recommendation by an advisory council of the Carnegie Forum on Education and the Economy. The board, which is composed of politicians, business…

  3. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    AU/ACSC/2016 AIR COMMAND AND STAFF COLLEGE AIR UNIVERSITY MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF... establishing unit-level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and... cues) ideally should meet or exceed effective rigor (based on analytical process). To accomplish this, decision makers should not be left to their

  4. Disciplining Bioethics: Towards a Standard of Methodological Rigor in Bioethics Research

    Science.gov (United States)

    Adler, Daniel; Shaul, Randi Zlotnik

    2012-01-01

    Contemporary bioethics research is often described as multi- or interdisciplinary. Disciplines are characterized, in part, by their methods. Thus, when bioethics research draws on a variety of methods, it crosses disciplinary boundaries. Yet each discipline has its own standard of rigor—so when multiple disciplinary perspectives are considered, what constitutes rigor? This question has received inadequate attention, as there is considerable disagreement regarding the disciplinary status of bioethics. This disagreement has presented five challenges to bioethics research. Addressing them requires consideration of the main types of cross-disciplinary research, and consideration of proposals aiming to ensure rigor in bioethics research. PMID:22686634

  5. Data content standards in Africa

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2005-04-01

    Full Text Available Data content standards tend to be more accessible, easier to understand, used directly by many end users, immediately applicable to Africa, and more susceptible to culture and language – hence, it is more important to have local standards...

  6. New tools for Content Innovation and data sharing: Enhancing reproducibility and rigor in biomechanics research.

    Science.gov (United States)

    Guilak, Farshid

    2017-03-21

    We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Standards and Methodological Rigor in Pulmonary Arterial Hypertension Preclinical and Translational Research.

    Science.gov (United States)

    Provencher, Steeve; Archer, Stephen L; Ramirez, F Daniel; Hibbert, Benjamin; Paulin, Roxane; Boucherat, Olivier; Lacasse, Yves; Bonnet, Sébastien

    2018-03-30

    Despite advances in our understanding of the pathophysiology and the management of pulmonary arterial hypertension (PAH), significant therapeutic gaps remain for this devastating disease. Yet, few innovative therapies beyond the traditional pathways of endothelial dysfunction have reached clinical trial phases in PAH. Although there are inherent limitations of the currently available models of PAH, the leaky pipeline of innovative therapies relates, in part, to flawed preclinical research methodology, including lack of rigour in trial design, incomplete invasive hemodynamic assessment, and lack of careful translational studies that replicate randomized controlled trials in humans with attention to adverse effects and benefits. Rigorous methodology should include the use of prespecified eligibility criteria, sample sizes that permit valid statistical analysis, randomization, blinded assessment of standardized outcomes, and transparent reporting of results. Better design and implementation of preclinical studies can minimize inherent flaws in the models of PAH, reduce the risk of bias, and enhance external validity and our ability to distinguish truly promising therapies from many false-positive or overstated leads. Ideally, preclinical studies should use advanced imaging, study several preclinical pulmonary hypertension models, or correlate rodent and human findings and consider the fate of the right ventricle, which is the major determinant of prognosis in human PAH. Although these principles are widely endorsed, empirical evidence suggests that such rigor is often lacking in pulmonary hypertension preclinical research. The present article discusses the pitfalls in the design of preclinical pulmonary hypertension trials and discusses opportunities to create preclinical trials with improved predictive value in guiding early-phase drug development in patients with PAH, which will need support not only from researchers, peer reviewers, and editors but also from…

  8. Spatial data content standards for Africa

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2005-11-01

    Full Text Available ... they selected 14 standards containing data dictionaries or feature catalogues, and compared their feature types. They have also provided some advice and recommendations on data content standards (particularly for data dictionaries and feature catalogues...

  9. Construct Validation of Content Standards for Teaching

    Science.gov (United States)

    van der Schaaf, Marieke F.; Stokking, Karel M.

    2011-01-01

    Current international demands to strengthen the teaching profession have led to an increased development and use of professional content standards. The study aims to provide insight in the construct validity of content standards by researching experts' underlying assumptions and preferences when participating in a delphi method. In three rounds 21…

  10. Standards-based Content Resources: A Prerequisite for Content Integration and Content Interoperability

    Directory of Open Access Journals (Sweden)

    Christian Galinski

    2010-05-01

    Full Text Available Objective: to show how standards-based approaches for content standardization, content management, content-related services and tools, as well as the respective certification systems, not only guarantee reliable content integration and content interoperability, but also are of particular benefit to people with special needs in eAccessibility/eInclusion. Method: document MoU/MG/05 N0221 "Semantic Interoperability and the need for a coherent policy for a framework of distributed, possibly federated repositories for all kinds of content items on a world-wide scale", which was adopted in 2005, was a first step towards the formulation of global interoperability requirements for structured content. These requirements - based on advanced terminological principles - were taken up in EU projects such as IN-SAFETY (INfrastructure and SAFETY) and OASIS (Open architecture for Accessible Services Integration and Standardization). Results: Content integration and content interoperability are key concepts in connection with the emergence of state-of-the-art distributed and federated databases/repositories of structured content. Given the fact that linguistic content items are increasingly combined with or embedded in non-linguistic content items (and vice versa), a systemic and generic approach to data modelling and content management has become the order of the day. Fulfilling the requirements of capability for multilinguality and multimodality, based on open standards, makes software and database design fit for eAccessibility/eInclusion from the outset. It also makes structured content capable of global content integration and content interoperability, because it enhances its potential for being re-used and re-purposed in totally different eApplications. Such content as well as the methods, tools and services applied can be subject to new kinds of certification schemes which also should be based on standards. Conclusions: Content must be totally reliable in some

  11. Refreshing the "Voluntary National Content Standards in Economics"

    Science.gov (United States)

    MacDonald, Richard A.; Siegfried, John J.

    2012-01-01

    The second edition of the "Voluntary National Content Standards in Economics" was published by the Council for Economic Education in 2010. The authors examine the process for revising these precollege content standards and highlight several changes that appear in the new document. They also review the impact the standards have had on precollege…

  12. Standards for Radiation Effects Testing: Ensuring Scientific Rigor in the Face of Budget Realities and Modern Device Challenges

    Science.gov (United States)

    Lauenstein, J M.

    2015-01-01

    An overview is presented of the space radiation environment and its effects on electrical, electronic, and electromechanical parts. Relevant test standards and guidelines are listed. Test standards and guidelines are necessary to ensure best practices, minimize and bound systematic and random errors, and to ensure comparable results from different testers and vendors. Test standards are by their nature static but exist in a dynamic environment of advancing technology and radiation effects research. New technologies, failure mechanisms, and advancement in our understanding of known failure mechanisms drive the revision or development of test standards. Changes to standards must be weighed against their impact on cost and existing part qualifications. There must be consensus on new best practices. The complexity of some new technologies exceeds the scope of existing test standards and may require development of a guideline specific to the technology. Examples are given to illuminate the value and limitations of key radiation test standards as well as the challenges in keeping these standards up to date.

  13. Stochastic Inversion of Geomagnetic Observatory Data Including Rigorous Treatment of the Ocean Induction Effect With Implications for Transition Zone Water Content and Thermal Structure

    Science.gov (United States)

    Munch, F. D.; Grayver, A. V.; Kuvshinov, A.; Khan, A.

    2018-01-01

    In this paper we estimate and invert local electromagnetic (EM) sounding data for 1-D conductivity profiles in the presence of nonuniform oceans and continents to most rigorously account for the ocean induction effect that is known to strongly influence coastal observatories. We consider a new set of high-quality time series of geomagnetic observatory data, including hitherto unused data from island observatories installed over the last decade. The EM sounding data are inverted in the period range 3-85 days using stochastic optimization and model exploration techniques to provide estimates of model range and uncertainty. The inverted conductivity profiles are best constrained in the depth range 400-1,400 km and reveal significant lateral variations between 400 km and 1,000 km depth. To interpret the inverted conductivity anomalies in terms of water content and temperature, we combine laboratory-measured electrical conductivity of mantle minerals with phase equilibrium computations. Based on this procedure, relatively low temperatures (1200-1350°C) are observed in the transition zone (TZ) underneath stations located in Southern Australia, Southern Europe, Northern Africa, and North America. In contrast, higher temperatures (1400-1500°C) are inferred beneath observatories on islands, Northeast Asia, and central Australia. TZ water content beneath European and African stations is ˜0.05-0.1 wt %, whereas higher water contents (˜0.5-1 wt %) are inferred underneath North America, Asia, and Southern Australia. Comparison of the inverted water contents with laboratory-constrained water storage capacities suggests the presence of melt in or around the TZ underneath four geomagnetic observatories in North America and Northeast Asia.

  14. Assessing the Genetics Content in the Next Generation Science Standards.

    Directory of Open Access Journals (Sweden)

    Katherine S Lontok

    Full Text Available Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  15. Assessing the Genetics Content in the Next Generation Science Standards.

    Science.gov (United States)

    Lontok, Katherine S; Zhang, Hubert; Dougherty, Michael J

    2015-01-01

    Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  16. ASTM Committee C28: International Standards for Properties and Performance of Advanced Ceramics-Three Decades of High-Quality, Technically-Rigorous Normalization

    Science.gov (United States)

    Jenkins, Michael G.; Salem, Jonathan A.

    2016-01-01

    Physical and mechanical properties and performance of advanced ceramics and glasses are difficult to measure correctly without the proper techniques. For over three decades, ASTM Committee C28 on Advanced Ceramics has developed high-quality, technically-rigorous, full-consensus standards (e.g., test methods, practices, guides, terminology) to measure properties and performance of monolithic and composite ceramics that may be applied to glasses in some cases. These standards contain testing particulars for many mechanical, physical, and thermal properties and performance of these materials. As a result, these standards are used to generate accurate, reliable, repeatable and complete data. Within Committee C28, users, producers, researchers, designers, academicians, etc. have written, continually updated, and validated through round-robin test programs, 50 standards since the Committee's founding in 1986. This paper provides a detailed retrospective of the 30 years of ASTM Committee C28, including a graphical pictogram listing of C28 standards along with examples of the tangible benefits of standards for advanced ceramics to demonstrate their practical applications.

  17. ASTM Committee C28: International Standards for Properties and Performance of Advanced Ceramics, Three Decades of High-quality, Technically-rigorous Normalization

    Science.gov (United States)

    Jenkins, Michael G.; Salem, Jonathan A.

    2016-01-01

    Physical and mechanical properties and performance of advanced ceramics and glasses are difficult to measure correctly without the proper techniques. For over three decades, ASTM Committee C28 on Advanced Ceramics has developed high-quality, rigorous, full-consensus standards (e.g., test methods, practices, guides, terminology) to measure properties and performance of monolithic and composite ceramics that may be applied to glasses in some cases. These standards contain testing particulars for many mechanical, physical, and thermal properties and performance of these materials. As a result, these standards provide accurate, reliable, repeatable and complete data. Within Committee C28, users, producers, researchers, designers, academicians, etc. have written, continually updated, and validated through round-robin test programs, nearly 50 standards since the Committee's founding in 1986. This paper provides a retrospective review of the 30 years of ASTM Committee C28, including a graphical pictogram listing of C28 standards along with examples of the tangible benefits of advanced ceramics standards to demonstrate their practical applications.

  18. Interpreting the ASTM 'content standard for digital geospatial metadata'

    Science.gov (United States)

    Nebert, Douglas D.

    1996-01-01

    ASTM and the Federal Geographic Data Committee have developed a content standard for spatial metadata to facilitate documentation, discovery, and retrieval of digital spatial data using vendor-independent terminology. Spatial metadata elements are identifiable quality and content characteristics of a data set that can be tied to a geographic location or area. Several Office of Management and Budget Circulars and initiatives have been issued that specify improved cataloguing of and accessibility to federal data holdings. An Executive Order further requires the use of the metadata content standard to document digital spatial data sets. Collection and reporting of spatial metadata for field investigations performed for the federal government is an anticipated requirement. This paper provides an overview of the draft spatial metadata content standard and a description of how the standard could be applied to investigations collecting spatially-referenced field data.

  19. Vocational High School Effectiveness Standard ISO 9001: 2008 for Achievement Content Standards, Standard Process and Competency Standards Graduates

    Directory of Open Access Journals (Sweden)

    Yeni Ratih Pratiwi

    2014-06-01

    Full Text Available Efektivitas Sekolah Menengah Kejuruan Berstandar ISO 9001:2008 terhadap Pencapaian Standar Isi, Standar Proses dan Standar Kompetensi Lulusan. Abstract: The purpose of this study was to determine differences in the effectiveness of the achievement of content standards, process standards, and graduate competency standards between vocational high schools (SMK) already certified to ISO 9001:2008 and those not yet certified, in both public and private schools. Data were collected using a closed questionnaire with a Likert-scale model and analyzed with one-way ANOVA in SPSS. The results showed: (1) there is no difference in effectiveness between ISO-standard public SMK and ISO-standard private SMK (P = 0.001); (2) there are differences in effectiveness between ISO-standard public SMK and public SMK not yet ISO-standard (P = 0.000); (3) there are differences in effectiveness between ISO-standard public SMK and private SMK not yet ISO-standard (P = 0.000); (4) there are differences in effectiveness between ISO-standard private SMK and public SMK not yet ISO-standard (P = 0.015); (5) there are differences in effectiveness between ISO-standard private SMK and private SMK not yet ISO-standard (P = 0.000); (6) there is no difference in effectiveness between public SMK not yet ISO-standard and private SMK not yet ISO-standard. Key Words: vocational high school standard ISO 9001:2008, content standards, process standards, competency standards

  20. Development of Extended Content Standards for Biodiversity Data

    Science.gov (United States)

    Hugo, Wim; Schmidt, Jochen; Saarenmaa, Hannu

    2013-04-01

    Interoperability in the field of Biodiversity observation has been strongly driven by the development of a number of global initiatives (GEO, GBIF, OGC, TDWG, GenBank, …) and their supporting standards (OGC-WxS, OGC-SOS, Darwin Core (DwC), NetCDF, …). To a large extent, these initiatives have focused on discoverability and standardization of syntactic and schematic interoperability. Semantic interoperability is more complex, requiring development of domain-dependent conceptual data models, and extension of these models with appropriate ontologies (typically manifested as controlled vocabularies). Biodiversity content has been standardized partly, for example through Darwin Core for occurrence data and associated taxonomy, and through GenBank for genetic data, but other contexts of biodiversity observation have lagged behind - making it difficult to achieve semantic interoperability between distributed data sources. With this in mind, WG8 of GEO BON (charged with data and systems interoperability) has started a work programme to address a number of concerns, one of which is the gap in content standards required to make Biodiversity data truly interoperable. The paper reports on the framework developed by WG8 for the classification of Biodiversity observation data into 'families' of use cases and its supporting data schema, on where gaps, if any, in the availability of content standards have been identified, and on how these are to be addressed by way of an abstract data model and the development of associated content standards. It is proposed that a minimum set of standards (1) will be required to address the scope of Biodiversity content, aligned with levels and dimensions of observation, and based on the 'Essential Biodiversity Variables' (2) being developed by GEO BON. The content standards are envisaged as loosely separated from the syntactic and schematic standards used for the base data exchange: typically, services would offer an existing data standard (DwC, WFS…

  1. RDA: a content standard to ensure the quality of data

    Directory of Open Access Journals (Sweden)

    Carlo Bianchini

    2016-05-01

    Full Text Available RDA: Resource Description and Access is a set of guidelines for the description of and access to resources, designed for the digital environment and released, in its first version, in 2010. RDA is based on FRBR and its derived models, which focus on users' needs and on resources of any kind of content, medium and carrier. The paper discusses the relevance of the main features of RDA for the future role of libraries in the context of the semantic web and metadata creation and exchange. The paper aims to highlight the many consequences deriving from RDA being a content standard, and in particular the change from record management to data management, the differences between the two functions realized by RDA (to identify and to relate entities) and the functions realized by other standards such as MARC21 (to archive data) and ISBD (to visualize data), and to show how, since all these functions are necessary for the catalog, RDA needs to be integrated with other rules and standards; these tools allow the fulfilment of the variation principle defined by S.R. Ranganathan.

  2. Does the sequence of onset of rigor mortis depend on the proportion of muscle fibre types and on intra-muscular glycogen content?

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Nakajima, M; Saka, K; Iwase, H; Nagao, M; Niijima, H; Matsuda, Y

    1999-01-01

    We examined the postmortem changes in the levels of ATP, glycogen and lactic acid in two masticatory muscles and three leg muscles of rats. The proportion of fibre types of the muscles was determined with NIH Image software. The ATP levels in the white muscles did not decrease up to 1 h after death, and the ATP levels 1 and 2 h after death in the white muscles were higher than those in the red muscles with a single exception. The glycogen level at death and 1 h after death and the lactic acid level 1 h after death in the masticatory muscles were lower than in the leg muscles. It is possible that the differences in the proportion of muscle fibre types and in the glycogen level of muscles influence the postmortem changes in ATP and lactic acid, which would accelerate or retard rigor mortis of the muscles.

  3. ASK Standards: Assessment, Skills, and Knowledge Content Standards for Student Affairs Practitioners and Scholars

    Science.gov (United States)

    ACPA College Student Educators International, 2011

    2011-01-01

    The Assessment Skills and Knowledge (ASK) standards seek to articulate the areas of content knowledge, skill and dispositions that student affairs professionals need in order to perform as practitioner-scholars to assess the degree to which students are mastering the learning and development outcomes the professionals intend. Consistent with…

  4. International Longitudinal Paediatric Reference Standards for Bone Mineral Content

    Science.gov (United States)

    Baxter-Jones, Adam DG; McKay, Heather; Burrows, Melonie; Bachrach, Laura K; Lloyd, Tom; Petit, Moira; Macdonald, Heather; Mirwald, Robert L; Bailey, Don

    2014-01-01

    To render a diagnosis pediatricians rely upon reference standards for bone mineral density or bone mineral content, which are based on cross-sectional data from a relatively small sample of children. These standards are unable to adequately represent growth in a diverse pediatric population. Thus, the goal of this study was to develop sex and site specific standards for BMC using longitudinal data collected from four international sites in Canada and the United States. Data from four studies were combined; Saskatchewan Paediatric Bone Mineral Accrual Study (n=251), UBC Healthy Bones Study (n=382); Penn State Young Women’s Health Study (n=112) and Stanford’s Bone Mineral Accretion study (n=423). Males and females (8 to 25 years) were measured for whole body (WB), total proximal femur (PF), femoral neck (FN) and lumbar spine (LS) BMC (g). Data were analyzed using random effects models. Bland-Altman was used to investigate agreement between predicted and actual data. Age, height, weight and ethnicity independently predicted BMC accrual across sites (P accrual; Hispanic 75.4 (28.2) g less BMC accrual; Blacks 82.8 (26.3) g more BMC accrual with confounders of age, height and weight controlled. Similar findings were found for PF and FN. Female models for all sites were similar with age, height and weight all independent significant predictors of BMC accrual (P accounting for age, size, sex and ethnicity. In conclusion, when interpreting BMC in paediatrics we recommend standards that are sex, age, size and ethnic specific. PMID:19854308

  5. International longitudinal pediatric reference standards for bone mineral content.

    Science.gov (United States)

    Baxter-Jones, Adam D G; Burrows, Melonie; Bachrach, Laura K; Lloyd, Tom; Petit, Moira; Macdonald, Heather; Mirwald, Robert L; Bailey, Don; McKay, Heather

    2010-01-01

    To render a diagnosis pediatricians rely upon reference standards for bone mineral density or bone mineral content, which are based on cross-sectional data from a relatively small sample of children. These standards are unable to adequately represent growth in a diverse pediatric population. Thus, the goal of this study was to develop sex and site-specific standards for BMC using longitudinal data collected from four international sites in Canada and the United States. Data from four studies were combined; Saskatchewan Paediatric Bone Mineral Accrual Study (n=251), UBC Healthy Bones Study (n=382); Penn State Young Women's Health Study (n=112) and Stanford's Bone Mineral Accretion study (n=423). Males and females (8 to 25 years) were measured for whole body (WB), total proximal femur (PF), femoral neck (FN) and lumbar spine (LS) BMC (g). Data were analyzed using random effects models. Bland-Altman was used to investigate agreement between predicted and actual data. Age, height, weight and ethnicity independently predicted BMC accrual across sites (Paccrual; Hispanic 75.4 (28.2) g less BMC accrual; Blacks 82.8 (26.3) g more BMC accrual with confounders of age, height and weight controlled. We report similar findings for the PF and FN. Models for females for all sites were similar with age, height and weight as independent significant predictors of BMC accrual (Paccounting for age, size, sex and ethnicity. In conclusion, when interpreting BMC in pediatrics we recommend standards that are sex, age, size and ethnic specific. Copyright (c) 2009 Elsevier Inc. All rights reserved.
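
    The agreement check named in this abstract can be illustrated with a short Bland-Altman computation. This is only a minimal sketch of the general technique; the predicted and actual BMC values below are invented placeholders, not data from the study.

    ```python
    import numpy as np

    def bland_altman(predicted, actual):
        """Mean bias and 95% limits of agreement between predicted and actual values."""
        predicted = np.asarray(predicted, dtype=float)
        actual = np.asarray(actual, dtype=float)
        diff = predicted - actual          # per-subject difference
        bias = diff.mean()                 # systematic over-/under-prediction
        sd = diff.std(ddof=1)              # SD of the differences
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # Placeholder whole-body BMC values in grams; not data from the study.
    predicted = [1850.0, 2100.0, 1720.0, 2500.0, 1990.0]
    actual = [1800.0, 2150.0, 1700.0, 2420.0, 2050.0]
    bias, (lo, hi) = bland_altman(predicted, actual)
    print(f"bias = {bias:.1f} g, 95% limits of agreement = [{lo:.1f} g, {hi:.1f} g]")
    ```

    The bias quantifies systematic over- or under-prediction, while the limits of agreement bound where roughly 95% of individual differences would be expected to fall.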

  6. Scientific rigor through videogames.

    Science.gov (United States)

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. 76 FR 65544 - Standard Format and Content of License Applications for Mixed Oxide Fuel Fabrication Facilities

    Science.gov (United States)

    2011-10-21

    ... NUCLEAR REGULATORY COMMISSION [NRC-2009-0323] Standard Format and Content of License Applications... revision to regulatory guide (RG) 3.39, ``Standard Format and Content of License Applications for Mixed Oxide Fuel Fabrication Facilities.'' This guide endorses the standard format and content for license...

  8. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  9. 45 CFR 170.205 - Content exchange standards and implementation specifications for exchanging electronic health...

    Science.gov (United States)

    2010-10-01

    .... The Healthcare Information Technology Standards Panel (HITSP) Summary Documents Using HL7 CCD... HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS... TECHNOLOGY Standards and Implementation Specifications for Health Information Technology § 170.205 Content...

  10. 77 FR 75198 - Standard Format and Content for Post-Shutdown Decommissioning Activities Report

    Science.gov (United States)

    2012-12-19

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0299] Standard Format and Content for Post-Shutdown... regulatory guide (DG), DG-1272, ``Standard Format and Content for Post-shutdown Decommissioning Activities... Content for Post-shutdown Decommissioning Activities Report,'' which was issued in July 2000. DG-1271...

  11. The Relationship Between State and District Content Standards:Issues of Alignment, Influence and Utility

    Directory of Open Access Journals (Sweden)

    Elizabeth Dutro

    2004-08-01

    Full Text Available At the core of standards-based reform are content standards--statements about what students should know and be able to do. Although it is state standards that are the focus of much public attention and consume substantial resources, many local school districts have developed their own content standards in the major subject areas. However, we know very little about the role state standards have played in local standards efforts. In this article we report on a study of the relationship between state and local content standards in reading in four states and districts. Through interviews with key personnel in each state and district, and analyses of state and local content standards in reading, we explored the alignment between state and district content standards, the path of influence between the two, and the role of high-stakes tests in state and district reform efforts. Our findings suggest that alignment had multiple meanings and that state standards had differential utility to districts, ranging from helpful to benign to nuisance. This wide variability was influenced by the nature of the standards themselves, the state vision of alignment and local control, districts’ own engagement and commitment to professional development, and student performance on high-stakes tests. We explore implications for the future of content standards as the cornerstone of standards-based reform and argue that states must promote district ownership and expand accountability if state content standards are to have any relevance for local efforts to reform teaching and learning.

  12. Guidelines on Active Content and Mobile Code: Recommendations of the National Institute of Standards and Technology

    National Research Council Canada - National Science Library

    Jansen, Wayne

    2001-01-01

    .... One such category of technologies is active content. Broadly speaking, active content refers to electronic documents that, unlike past character documents based on the American Standard Code for Information Interchange (ASCII...

  13. Measuring fuel moisture content in Alaska: standard methods and procedures.

    Science.gov (United States)

    Rodney A. Norum; Melanie. Miller

    1984-01-01

    Methods and procedures are given for collecting and processing living and dead plant materials for the purpose of determining their water content. Wild-land fuels in Alaska are emphasized, but the methodology is applicable elsewhere. Guides are given for determining the number of samples needed to attain a chosen precision. Detailed procedures are presented for...

  14. Standardizing consumer’s expectations in digital content

    NARCIS (Netherlands)

    Helberger, N.

    2011-01-01

    Purpose - The purpose of this paper is to make suggestions on how to improve the legal standing of consumers of digital content products. Design/methodology/approach - The analysis in this paper is based on desk research and comparative legal research, among others in the context of research

  15. Standard Test Methods for Constituent Content of Composite Materials

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2009-01-01

    1.1 These test methods determine the constituent content of composite materials by one of two approaches. Method I physically removes the matrix by digestion or ignition by one of seven procedures, leaving the reinforcement essentially unaffected and thus allowing calculation of reinforcement or matrix content (by weight or volume) as well as percent void volume. Method II, applicable only to laminate materials of known fiber areal weight, calculates reinforcement or matrix content (by weight or volume), and the cured ply thickness, based on the measured thickness of the laminate. Method II is not applicable to the measurement of void volume. 1.1.1 These test methods are primarily intended for two-part composite material systems. However, special provisions can be made to extend these test methods to filled material systems with more than two constituents, though not all test results can be determined in every case. 1.1.2 The procedures contained within have been designed to be particularly effective for ce...
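
    As a rough illustration of the thickness-based approach described for Method II, the sketch below computes fiber and matrix volume content and cured ply thickness from a known fiber areal weight and a measured laminate thickness. This is not the standard's procedure; the ply count, areal weight, density, and thickness are illustrative values, and the zero-void assumption simply mirrors the note above that Method II cannot measure void volume.

    ```python
    def method_ii_constituents(n_plies, fiber_areal_weight_g_m2,
                               fiber_density_g_cm3, laminate_thickness_mm):
        """Thickness-based estimate of constituent content (zero-void assumption)."""
        # Fiber "thickness": total fiber mass per unit area divided by fiber density.
        # 1 g/m^2 = 1e-6 g/mm^2 and 1 g/cm^3 = 1e-3 g/mm^3, so the result is in mm.
        fiber_thickness_mm = (n_plies * fiber_areal_weight_g_m2 * 1e-6) / (fiber_density_g_cm3 * 1e-3)
        fiber_volume_pct = 100.0 * fiber_thickness_mm / laminate_thickness_mm
        matrix_volume_pct = 100.0 - fiber_volume_pct   # voids assumed to be zero
        cured_ply_thickness_mm = laminate_thickness_mm / n_plies
        return fiber_volume_pct, matrix_volume_pct, cured_ply_thickness_mm

    # Illustrative inputs only, not values taken from the standard.
    vf, vm, cpt = method_ii_constituents(n_plies=16, fiber_areal_weight_g_m2=190.0,
                                         fiber_density_g_cm3=1.78, laminate_thickness_mm=3.0)
    print(f"fiber {vf:.1f} vol%, matrix {vm:.1f} vol%, cured ply thickness {cpt:.3f} mm")
    ```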

  16. National Standards for Financial Literacy: Rationale and Content

    Science.gov (United States)

    Bosshardt, William; Walstad, William B.

    2014-01-01

    The "National Standards for Financial Literacy" describe the knowledge, understanding, and skills that are important for students to learn about personal finance. They are designed to guide teachers, school administrators, and other educators in developing curriculum and educational materials for teaching financial literacy. In this…

  17. 78 FR 38739 - Standard Format and Content for Post-Shutdown Decommissioning Activities Report

    Science.gov (United States)

    2013-06-27

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0299] Standard Format and Content for Post-Shutdown Decommissioning Activities Report AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory guide; issuance..., ``Standard Format and Content for Post-shutdown Decommissioning Activities Report.'' This guide describes a...

  18. 76 FR 59173 - Standard Format and Content of License Applications for Conventional Uranium Mills

    Science.gov (United States)

    2011-09-23

    ... NUCLEAR REGULATORY COMMISSION [NRC-2008-0302] Standard Format and Content of License Applications for Conventional Uranium Mills AGENCY: Nuclear Regulatory Commission. ACTION: Draft regulatory guide..., ``Standard Format and Content of License Applications for Conventional Uranium Mills.'' DG- 3024 was a...

  19. Developing content standards for teaching research skills using a delphi method

    NARCIS (Netherlands)

    Schaaf, M.F. van der; Stokking, K.M.; Verloop, N.

    2005-01-01

    The increased attention to teacher assessment and current educational reforms call for procedures to develop adequate content standards. For the development of content standards on teaching research skills, a Delphi method based on stakeholders’ judgments has been designed and tested. In three

  20. Recommendation of standardized health learning contents using archetypes and semantic web technologies.

    Science.gov (United States)

    Legaz-García, María del Carmen; Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2012-01-01

    Linking Electronic Healthcare Record (EHR) content to educational materials has been considered a key international recommendation to enable clinical engagement and to promote patient safety. This would help citizens access reliable information available on the web and guide them properly. In this paper, we describe an approach in that direction, based on the use of dual-model EHR standards and standardized educational contents. The recommendation method will be based on the semantic coverage of the learning content repository for a particular archetype, which will be calculated by applying semantic web technologies such as ontologies and semantic annotations.
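
    A minimal sketch of the kind of coverage score this abstract alludes to is shown below: the share of an archetype's concepts that appear among a repository's semantic annotations. The concept identifiers, lesson names, and scoring rule are hypothetical illustrations, not artefacts from the paper or from any EHR standard.

    ```python
    def semantic_coverage(archetype_concepts, repository_annotations):
        """Fraction of an archetype's concepts covered by a repository's annotations."""
        archetype_concepts = set(archetype_concepts)
        covered = set()
        for annotations in repository_annotations.values():
            covered |= archetype_concepts & set(annotations)
        return len(covered) / len(archetype_concepts) if archetype_concepts else 0.0

    # Hypothetical concept identifiers and learning resources, for illustration only.
    archetype = {"blood_pressure", "systolic", "diastolic", "cuff_size", "position"}
    repository = {
        "lesson_001": {"blood_pressure", "systolic", "diastolic"},
        "lesson_002": {"position", "heart_rate"},
    }
    print(f"semantic coverage = {semantic_coverage(archetype, repository):.0%}")  # 80%
    ```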

  1. Methodological Choices in the Content Analysis of Textbooks for Measuring Alignment with Standards

    Science.gov (United States)

    Polikoff, Morgan S.; Zhou, Nan; Campbell, Shauna E.

    2015-01-01

    With the recent adoption of the Common Core standards in many states, there is a need for quality information about textbook alignment to standards. While there are many existing content analysis procedures, these generally have little, if any, validity or reliability evidence. One exception is the Surveys of Enacted Curriculum (SEC), which has…
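
    For readers unfamiliar with the SEC family of methods, alignment is typically summarized by comparing two documents' proportions of content across the same topic-by-cognitive-demand grid; Porter's alignment index is one common summary. The sketch below uses invented cell proportions purely for illustration, not data from the study.

    ```python
    def porter_alignment(props_a, props_b):
        """Porter alignment index: 1 minus half the sum of absolute cell differences.

        props_a and props_b are content proportions over the same grid (each sums to 1);
        the index ranges from 0 (no overlap) to 1 (identical distributions).
        """
        return 1.0 - 0.5 * sum(abs(a - b) for a, b in zip(props_a, props_b))

    # Illustrative cell proportions (e.g. 3 topics x 2 cognitive-demand levels, flattened).
    textbook = [0.30, 0.10, 0.25, 0.05, 0.20, 0.10]
    standards = [0.20, 0.15, 0.25, 0.10, 0.15, 0.15]
    print(f"alignment index = {porter_alignment(textbook, standards):.2f}")  # 0.85
    ```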

  2. An Analysis of Geography Content in Relation to Geography for Life Standards in Oman

    Science.gov (United States)

    Al-Nofli, Mohammed Abdullah

    2018-01-01

    Since the publication of "Geography for Life: National Geography Standards" in the United States (Geography Education Standards Project, 1994), it has been widely used to develop quality curriculum materials for what students should know and able to do in geography. This study compared geography content taught in Omani public schools…

  3. Pre-Calculus California Content Standards: Standards Deconstruction Project. Version 1.0

    Science.gov (United States)

    Arnold, Bruce; Cliffe, Karen; Cubillo, Judy; Kracht, Brenda; Leaf, Abi; Legner, Mary; McGinity, Michelle; Orr, Michael; Rocha, Mario; Ross, Judy; Teegarden, Terrie; Thomson, Sarah; Villero, Geri

    2008-01-01

    This project was coordinated and funded by the California Partnership for Achieving Student Success (Cal-PASS). Cal-PASS is a data sharing system linking all segments of education. Its purpose is to improve student transition and success from one educational segment to the next. Cal-PASS' standards deconstruction project was initiated by the…

  4. Energy content estimation by collegians for portion standardized foods frequently consumed in Korea.

    Science.gov (United States)

    Kim, Jin; Lee, Hee Jung; Lee, Hyun Jung; Lee, Sun Ha; Yun, Jee-Young; Choi, Mi-Kyeong; Kim, Mi-Hyun

    2014-01-01

    The purpose of this study is to estimate Korean collegians' knowledge of the energy content in the standard portion size of foods frequently consumed in Korea and to investigate differences in knowledge between gender groups. A total of 600 collegians participated in this study. Participants' knowledge was assessed based on their estimation of the energy content of 30 selected food items shown in actual-size photo images. Standard portion size of food was based on the 2010 Korean Dietary Reference Intakes, and the percentage of participants who accurately estimated (that is, within 20% of the true value) the energy content of the standard portion size was calculated for each food item. The food for which the most participants provided an accurate estimation was ramyun (instant noodles) (67.7%), followed by cooked rice (57.8%). The proportion of students who overestimated the energy content was highest for vegetables (68.8%) and beverages (68.1%). The proportion of students who underestimated the energy content was highest for grains and starches (42.0%) and fruits (37.1%). Female students were more likely than male students to check the energy content of the foods they consumed. From these results, it was concluded that knowledge of food energy content was poor among collegians, with some gender differences. Therefore, future nutrition education programs should give greater attention to improving knowledge of calorie content and to helping students apply this knowledge in order to develop effective dietary plans.
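
    The accuracy rule described above (an estimate counts as accurate if it falls within 20% of the true value) can be sketched as follows; the estimates and the true energy value are made-up numbers for one hypothetical food item, not figures from the study.

    ```python
    def accurate_share(estimates_kcal, true_kcal, tolerance=0.20):
        """Share of respondents whose estimate is within +/- tolerance of the true value."""
        lo, hi = true_kcal * (1 - tolerance), true_kcal * (1 + tolerance)
        return sum(lo <= e <= hi for e in estimates_kcal) / len(estimates_kcal)

    # Made-up estimates (kcal) for one hypothetical food item with a placeholder true value.
    estimates = [450, 300, 520, 610, 480, 390, 700, 500]
    print(f"accurate within 20%: {accurate_share(estimates, true_kcal=500):.0%}")  # 50%
    ```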

  5. Putrefactive rigor: apparent rigor mortis due to gas distension.

    Science.gov (United States)

    Gill, James R; Landi, Kristen

    2011-09-01

    Artifacts due to decomposition may cause confusion for the initial death investigator, leading to an incorrect suspicion of foul play. Putrefaction is a microorganism-driven process that results in foul odor, skin discoloration, purge, and bloating. Various decompositional gases including methane, hydrogen sulfide, carbon dioxide, and hydrogen will cause the body to bloat. We describe 3 instances of putrefactive gas distension (bloating) that produced the appearance of inappropriate rigor, so-called putrefactive rigor. These gases may distend the body to an extent that the extremities extend and lose contact with their underlying support surface. The medicolegal investigator must recognize that this is not true rigor mortis and the body was not necessarily moved after death for this gravity-defying position to occur.

  6. Mathematical Rigor in Introductory Physics

    Science.gov (United States)

    Vandyke, Michael; Bassichis, William

    2011-10-01

    Calculus-based introductory physics courses intended for future engineers and physicists are often designed and taught in the same fashion as those intended for students of other disciplines. A more mathematically rigorous curriculum should be more appropriate and, ultimately, more beneficial for the student in his or her future coursework. This work investigates the effects of mathematical rigor on student understanding of introductory mechanics. Using a series of diagnostic tools in conjunction with individual student course performance, a statistical analysis will be performed to examine student learning of introductory mechanics and its relation to student understanding of the underlying calculus.

  7. Seismic Performance Comparison of a High-Content SDA Frame and Standard RC Frame

    OpenAIRE

    van de Lindt, John W.; Rechan, R. Karthik

    2011-01-01

    This study presents the method and results of an experiment to study the seismic behavior of a concrete portal frame with fifty percent of its cement content replaced with a spray dryer ash (SDA). Based on multiple-shake-table tests, the high content SDA frame was found to perform as well as the standard concrete frame for two earthquakes exceeding design-level intensity earthquakes. Hence, from a purely seismic/structural standpoint, it may be possible to replace approximately fifty percen...

  8. MGIMO Educational Standards: Goal and Contents of Professional Language Training of IR Economics Students

    Directory of Open Access Journals (Sweden)

    Alla A. Kizima

    2015-01-01

    Full Text Available The article gives a methodological analysis of MGIMO-University's own education standards and programmes. Its relevance lies in the need to define the goals and contents of professional language training of IR economics students at MGIMO-University after the transfer to its own education standards. The researcher used competence-based and cultural-studies approaches with reference to the didactic principles of accessibility, systematicity, consistency, necessity and sufficiency. The author used a set of methods including theoretical analysis, synthesis and systematization, and a summative method. The article addresses the difference between the training of IR economists and of economists in other spheres of economics, underlines the importance of professional language training of IR economics students, and analyses the specifics of professional language training of IR economists from the standpoint of the competence-based approach by comparing the competences presented in the Federal State Education Standards of Higher Education and in MGIMO's own education standards. The author defines the goal and contents of professional language training of IR economics students, as well as the didactic principles of content choice that determine the effectiveness of training. In conclusion, the author points out that the contents of professional language training of IR economics students based on MGIMO's own education standards are approached as a system of professional knowledge, skills and competences leading to successful intercultural communication.

  9. Evolution of standardized procedures for adjusting lumber properties for change in moisture content

    Science.gov (United States)

    David W. Green; James W. Evans

    2001-01-01

    This paper documents the development of procedures in American Society for Testing and Materials standards for adjusting the allowable properties of lumber for changes in moisture content. The paper discusses the historical context of efforts to establish allowable properties on a consensus basis, beginning in the 19th century. Where possible, the reasons for proposed...

  10. 9 CFR 381.156 - Poultry meat content standards for certain poultry products.

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Poultry meat content standards for certain poultry products. 381.156 Section 381.156 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY...

  11. The Politics of Developing and Maintaining Mathematics and Science Curriculum Content Standards. Research Monograph.

    Science.gov (United States)

    Kirst, Michael W.; Bird, Robin L.

    The movement toward math and science curriculum standards is inextricably linked with high-stakes politics. There are two major types of politics discussed in this paper: the allocation of curriculum content, and the political issues involved in systemic change. Political strategies for gaining assent to national, state, and local content…

  12. 78 FR 73566 - Standard Format and Content for a License Application for an Independent Spent Fuel Storage...

    Science.gov (United States)

    2013-12-06

    ... NUCLEAR REGULATORY COMMISSION [NRC-2013-0264] Standard Format and Content for a License...), DG-3042, ``Standard Format and Content for a License Application for an Independent Spent Fuel..., Form, and Contents,'' specifies the information that must be in an application for a license to store...

  13. METHODOLOGICAL ASPECTS OF CONTENT ANALYSIS OF CONVERGENCE BETWEEN UKRAINIAN GAAP AND INTERNATIONAL FINANCIAL REPORTING STANDARDS

    Directory of Open Access Journals (Sweden)

    R. Kuzina

    2015-06-01

    Full Text Available The objective conditions of Ukraine's integration into the global business environment create the need to strengthen accounting and financial reporting. At the stage of attracting investment into the country, there is a need to prepare financial statements whose generally accepted basic principles are based on common International Financial Reporting Standards (IFRS). An assessment of the convergence of national standards with International Financial Reporting Standards is therefore relevant. However, before conducting a content analysis it is necessary to determine methodological approaches to the selection of key indicators for assessing convergence. The aim of the article is to define methodological approaches to the selection and development of a list of key IFRS elements for further evaluation of the convergence of national and international standards. To assess convergence, 187 basic key elements measuring the level of convergence to IFRS were selected. Sampling was carried out based on the professional judgment of the author, using the key indicators of each standard and an evaluation of the usefulness of accounting information. These figures make it possible to calculate the specific level of convergence of international and national standards and to determine how far statements prepared under domestic standards correspond to IFRS. In other words, can one assert with some certainty that Ukraine has achieved "good practices in IFRS implementation" or not? This calculation will assess the regulatory efforts of government agencies (the Ministry of Finance) on the approximation of Ukrainian standards and IFRS.
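
    The convergence calculation described above can be illustrated with a simple index: the share of key elements whose national treatment matches IFRS. The element names and assessments below are invented for illustration (the paper itself uses 187 key elements), and the scoring rule is an assumption rather than the author's exact formula.

    ```python
    def convergence_level(key_elements, assessment):
        """Share of key elements whose national treatment is assessed as converged with IFRS."""
        converged = sum(1 for element in key_elements if assessment.get(element) == "converged")
        return converged / len(key_elements)

    # Invented elements and assessments for illustration only.
    elements = ["revenue_recognition", "impairment_testing", "deferred_tax", "fair_value_disclosure"]
    assessment = {
        "revenue_recognition": "converged",
        "impairment_testing": "partially_converged",
        "deferred_tax": "converged",
        "fair_value_disclosure": "not_converged",
    }
    print(f"convergence level: {convergence_level(elements, assessment):.0%}")  # 50%
    ```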

  14. ATP, IMP, and glycogen in cod muscle at onset and during development of rigor mortis depend on the sampling location

    DEFF Research Database (Denmark)

    Cappeln, Gertrud; Jessen, Flemming

    2002-01-01

    Variation in glycogen, ATP, and IMP contents within individual cod muscles was studied in ice-stored fish during the progress of rigor mortis. Rigor index was determined before muscle samples for chemical analyses were taken at 16 different positions on the fish. During development of rigor, the contents of glycogen and ATP decreased differently in relation to rigor index depending on sampling location. Although fish were considered to be in strong rigor according to the rigor index method, parts of the muscle were not in rigor, as high ATP concentrations were found in dorsal and tail muscle.

  15. A case of instantaneous rigor?

    Science.gov (United States)

    Pirch, J; Schulz, Y; Klintschar, M

    2013-09-01

    The question of whether instantaneous rigor mortis (IR), the hypothetic sudden occurrence of stiffening of the muscles upon death, actually exists has been controversially debated over the last 150 years. While modern German forensic literature rejects this concept, the contemporary British literature is more willing to embrace it. We present the case of a young woman who suffered from diabetes and who was found dead in an upright standing position with back and shoulders leaned against a punchbag and a cupboard. Rigor mortis was fully established, livor mortis was strong and according to the position the body was found in. After autopsy and toxicological analysis, it was stated that death most probably occurred due to a ketoacidotic coma with markedly increased values of glucose and lactate in the cerebrospinal fluid as well as acetone in blood and urine. Whereas the position of the body is most unusual, a detailed analysis revealed that it is a stable position even without rigor mortis. Therefore, this case does not further support the controversial concept of IR.

  16. Preliminary Study on the Standard of Selenium Content in Agricultural Products

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhi-yuan; YOU Yong; GUO Qing-quan; WANG Yong-hong; DENG Shi-lin

    2012-01-01

    With the improvement of living standards, people pay more attention to agricultural products with health-protection functions, and selenium-rich agricultural products attract more and more consumers. The main biological roles of selenium are to resist oxidation and inflammatory response, chiefly in resisting aging, preventing cardiovascular disease, protecting eyesight, counteracting or destroying toxic substances, and preventing cancer and thyroid disease. In most areas of China there is a widespread shortage of selenium, so producing selenium-rich agricultural products to supply natural selenium-rich health food to areas short of selenium has gradually become a new hot spot of China's health food industry. However, a high content of selenium in food is detrimental to the human body and can even lead to selenium intoxication, and when inorganic selenium is added artificially it is difficult to guarantee that the selenium content of agricultural products stays within limits. Based on the human body's daily demand for selenium in dietetics and the selenium content of agricultural products in the Chinese food composition table, we put forward recommendations for a standard of selenium content in agricultural products, in order to provide a basis for China to formulate a health standard for selenium content in selenium-rich agricultural products.

  17. Development of format and contents of safety analysis report for the KNGR standard design

    International Nuclear Information System (INIS)

    Lee, J. H.; Kim, H. S.; Yun, Y. K. and others

    1999-01-01

    Referring to USNRC Regulatory Guide 1.70, which has been used in the preparation of SARs for conventional nuclear power plants, the draft guide for the format and contents of the SAR for the KNGR standard design was developed based on new regulatory information related to advanced reactors. When used, the draft guide will enable the regulator to make an effective and consistent review of the safety of the KNGR, since it requires more specific and additional safety information for standardized NPPs than RG 1.70. In addition, it is expected that the guide for the format and contents of the COL's SAR will be more easily developed using the draft guide suggested in this report. The draft guide can also serve as the Korean national guide, with the exception of some industry codes and standards. An experts' review will be performed during the next stage of the project to ensure the objectivity and consistency of the draft guide developed in this study. After reflecting the experts' comments and revising the contents, the guide will be utilized in the licensing activities for the KNGR standard design.

  18. Wikis: Developing pre-service teachers’ leadership skills and knowledge of content standards

    Directory of Open Access Journals (Sweden)

    Angelia Reid-Griffin

    2016-03-01

    In this initial phase of our multi-year research study we set out to explore the development of leadership skills in our pre-service secondary teachers after using an online wiki, Wikispaces. This paper presents our methods for preparing a group of 13 mathematics and 3 science secondary pre-service teachers to demonstrate the essential knowledge, skills and dispositions of beginning teacher leaders. Our findings indicate the pre-service teachers' overall satisfaction with demonstrating leadership through collaborative practices. They were successful in these new roles as teacher/collaborator within the context of communication about content standards. Though the candidates participated in other collaborative tasks, this effort was noted for bringing together technology, content standards and leadership qualities that are critical for beginning teachers. Implications for addressing the preservice teachers' development of leadership skills, as they become professional teachers, will be shared.

  19. 21 CFR 130.10 - Requirements for foods named by use of a nutrient content claim and a standardized term.

    Science.gov (United States)

    2010-04-01

    ... standardized term. (a) Description. The foods prescribed by this general definition and standard of identity... of identity but that do not comply with the standard of identity because of a deviation that is.... Deviations from noningredient provisions of the standard of identity (e.g., moisture content, food solids...

  20. Regulating web content: the nexus of legislation and performance standards in the United Kingdom and Norway.

    Science.gov (United States)

    Giannoumis, G Anthony

    2014-01-01

    Despite different historical traditions, previous research demonstrates a convergence between regulatory approaches in the United Kingdom and Norway. To understand this convergence, this article examines how different policy traditions influence the legal obligations of performance standards regulating web content for use by persons with disabilities. While convergence has led to similar policy approaches, I argue that national policy traditions have an impact on how governments establish legal obligations for standards compliance. The analysis reveals that national policy traditions influenced antidiscrimination legislation and the capacity and authority of regulatory agencies, which impacted the diverging legal obligations of standards in the United Kingdom and Norway. The analysis further suggests that policy actors mediate the reciprocal influence between national policy traditions and regulatory convergence mechanisms. Copyright © 2014 John Wiley & Sons, Ltd.

  1. A Comparison of Higher-Order Thinking between the Common Core State Standards and the 2009 New Jersey Content Standards in High School

    Science.gov (United States)

    Sforza, Dario; Tienken, Christopher H.; Kim, Eunyoung

    2016-01-01

    The creators and supporters of the Common Core State Standards claim that the Standards require greater emphasis on higher-order thinking than previous state standards in mathematics and English language arts. We used a qualitative case study design with content analysis methods to test the claim. We compared the levels of thinking required by the…

  2. Rigor in Qualitative Supply Chain Management Research

    DEFF Research Database (Denmark)

    Goffin, Keith; Raja, Jawwad; Claes, Björn

    2012-01-01

    Purpose – The purpose of this paper is to share the authors' experiences of using the repertory grid technique in two supply chain management studies. The paper aims to demonstrate how the two studies provided insights into how qualitative techniques such as the repertory grid can be made more rigorous than in the past, and how results can be generated that are inaccessible using quantitative methods. Design/methodology/approach – This paper presents two studies undertaken using the repertory grid technique to illustrate its application in supply chain management research. Findings – The paper ..., reliability, and theoretical saturation. Originality/value – It is the authors' contention that the addition of the repertory grid technique to the toolset of methods used by logistics and supply chain management researchers can only enhance insights and the building of robust theories. Qualitative studies...

  3. Preservice Secondary Teachers Perceptions of College-Level Mathematics Content Connections with the Common Core State Standards for Mathematics

    Science.gov (United States)

    Olson, Travis A.

    2016-01-01

    Preservice Secondary Mathematics Teachers (PSMTs) were surveyed to identify if they could connect early-secondary mathematics content (Grades 7-9) in the Common Core State Standards for Mathematics (CCSSM) with mathematics content studied in content courses for certification in secondary teacher preparation programs. Respondents were asked to…

  4. Seismic Performance Comparison of a High-Content SDA Frame and Standard RC Frame

    Directory of Open Access Journals (Sweden)

    John W. van de Lindt

    2011-01-01

    This study presents the method and results of an experiment to study the seismic behavior of a concrete portal frame with fifty percent of its cement content replaced with spray dryer ash (SDA). Based on multiple shake-table tests, the high-content SDA frame was found to perform as well as the standard concrete frame for two earthquakes exceeding design-level intensity. Hence, from a purely seismic/structural standpoint, it may be possible to replace approximately fifty percent of the cement in a concrete mix with SDA for the construction of structural members in high seismic zones. This would help significantly redirect spray dryer ash away from landfills, thus providing a sustainable, greener alternative to concrete that uses only Portland cement, or only a small percentage of SDA or fly ash.

  5. Learner-Directed Nutrition Content for Medical Schools to Meet LCME Standards

    Directory of Open Access Journals (Sweden)

    Lisa A. Hark

    2015-01-01

    Deficiencies in medical school nutrition education have been noted since the 1960s. Nutrition-related non-communicable diseases, including heart disease, stroke, cancer, diabetes, and obesity, are now the most common, costly, and preventable health problems in the US. Training medical students to assess diet and nutritional status and advise patients about a healthy diet, exercise, body weight, smoking, and alcohol consumption is critical to reducing chronic disease risk. Barriers to improving medical school nutrition content include lack of faculty preparation, limited curricular time, and the absence of funding. Several new LCME standards provide important impetus for incorporating nutrition into existing medical school curriculum as self-directed material. Fortunately, with advances in technology, electronic learning platforms, and web-based modules, nutrition can be integrated and assessed across all four years of medical school at minimal costs to medical schools. Medical educators have access to a self-study nutrition textbook, Medical Nutrition and Disease, Nutrition in Medicine© online modules, and the NHLBI Nutrition Curriculum Guide for Training Physicians. This paper outlines how learner-directed nutrition content can be used to meet several US and Canadian LCME accreditation standards. The health of the nation depends upon future physicians’ ability to help their patients make diet and lifestyle changes.

  6. Content

    DEFF Research Database (Denmark)

    Keiding, Tina Bering

    Aim, content and methods are fundamental categories of both theoretical and practical general didactics. A quick glance in recent pedagogical literature on higher education, however, reveals a strong preoccupation with methods, i.e. how teaching should be organized socially (Biggs & Tang, 2007 ... secondary levels. In subject matter didactics, the question of content is more developed, but it is still mostly confined to teaching on lower levels. As for higher education didactics, discussions on selection of content are almost non-existent on the programmatic level. Nevertheless, teachers are forced ... curriculum, in higher education, and to generate analytical categories and criteria for selection of content, which can be used for systematic didactical reflection. The larger project also concerns reflection on and clarification of the concept of content, including the relation between content at the level...

  7. ESIP's Emerging Provenance and Context Content Standard Use Cases: Developing Examples and Models for Data Stewardship

    Science.gov (United States)

    Ramdeen, S.; Hills, D. J.

    2013-12-01

    Earth science data collections range from individual researchers' private collections to large-scale data warehouses, from computer-generated data to field or lab based observations. These collections require stewardship. Fundamentally, stewardship ensures long term preservation and the provision of access to the user community. In particular, stewardship includes capturing appropriate metadata and documentation--and thus the context of the data's creation and any changes they underwent over time --to enable data reuse. But scientists and science data managers must translate these ideas into practice. How does one balance the needs of current and (projected) future stakeholders? In 2011, the Data Stewardship Committee (DSC) of the Federation of Earth Science Information Partners (ESIP) began developing the Provenance and Context Content Standard (PCCS). As an emerging standard, PCCS provides a framework for 'what' must be captured or preserved as opposed to describing only 'how' it should be done. Originally based on the experiences of NASA and NOAA researchers within ESIP, the standard currently provides data managers with content items aligned to eight key categories. While the categories and content items are based on data life cycles of remote sensing missions, they can be generalized to cover a broader set of activities, for example, preservation of physical objects. These categories will include the information needed to ensure the long-term understandability and usability of earth science data products. In addition to the PCCS, the DSC is developing a series of use cases based on the perspectives of the data archiver, data user, and the data consumer that will connect theory and practice. These cases will act as specifications for developing PCCS-based systems. They will also provide for examination of the categories and content items covered in the PCCS to determine if any additions are needed to cover the various use cases, and also provide rationale and

  8. From a Content Delivery Portal to a Knowledge Management System for Standardized Cancer Documentation.

    Science.gov (United States)

    Schlue, Danijela; Mate, Sebastian; Haier, Jörg; Kadioglu, Dennis; Prokosch, Hans-Ulrich; Breil, Bernhard

    2017-01-01

    Heterogeneous tumor documentation and its challenges of interpretation of medical terms lead to problems in analyses of data from clinical and epidemiological cancer registries. The objective of this project was to design, implement and improve a national content delivery portal for oncological terms. Data elements of existing handbooks and documentation sources were analyzed, combined and summarized by medical experts of different comprehensive cancer centers. Informatics experts created a generic data model based on an existing metadata repository. In order to establish a national knowledge management system for standardized cancer documentation, a prototypical tumor wiki was designed and implemented. Requirements engineering techniques were applied to optimize this platform. It is targeted to user groups such as documentation officers, physicians and patients. The linkage to other information sources like PubMed and MeSH was realized.

  9. Realizing rigor in the mathematics classroom

    CERN Document Server

    Hull, Ted H (Henry); Balka, Don S

    2014-01-01

    Rigor put within reach! Rigor: The Common Core has made it policy, and this first-of-its-kind guide takes math teachers and leaders through the process of making it reality. Using the Proficiency Matrix as a framework, the authors offer proven strategies and practical tools for successful implementation of the CCSS mathematical practices, with rigor as a central objective. You'll learn how to: define rigor in the context of each mathematical practice; identify and overcome potential issues, including differentiating instruction and using data

  10. Comparison of Nutrient Content and Cost of Home-Packed Lunches to Reimbursable School Lunch Nutrient Standards and Prices

    Science.gov (United States)

    Johnson, Cara M.; Bednar, Carolyn; Kwon, Junehee; Gustof, Alissa

    2009-01-01

    Purpose: The purpose of this study was to compare nutrient content and cost of home-packed lunches to nutrient standards and prices for reimbursable school lunches. Methods: Researchers observed food and beverage contents of 333 home packed lunches at four north Texas elementary schools. Nutritionist Pro was used to analyze lunches for calories,…

  11. 40 CFR 80.524 - What sulfur content standard applies to motor vehicle diesel fuel downstream of the refinery or...

    Science.gov (United States)

    2010-07-01

    ... to motor vehicle diesel fuel downstream of the refinery or importer? 80.524 Section 80.524 Protection... FUELS AND FUEL ADDITIVES Motor Vehicle Diesel Fuel; Nonroad, Locomotive, and Marine Diesel Fuel; and ECA Marine Fuel Motor Vehicle Diesel Fuel Standards and Requirements § 80.524 What sulfur content standard...

  12. The Compatibility of Developed Mathematics Textbooks' Content in Saudi Arabia (Grades 6-8) with NCTM Standards

    Science.gov (United States)

    Alshehri, Mohammed Ali; Ali, Hassan Shawki

    2016-01-01

    This study aimed to investigate the compatibility of developed mathematics textbooks' content (grades 6-8) in Saudi Arabia with NCTM standards in the areas of: number and operations, algebra, geometry, measurement, data analysis and probability. To achieve that goal, a list of (NCTM) standards for grades (6-8) were translated to Arabic language,…

  13. Adding Bite to the Bark: Using LibGuides2 Migration as Impetus to Introduce Strong Content Standards

    Science.gov (United States)

    Fritch, Melia; Pitts, Joelle E.

    2016-01-01

    The authors discuss the long-term accumulation of unstandardized and inaccessible content within the Libguides system and the decision-making process to create and implement a set of standards using the migration to the LibGuides2 platform as a vehicle for change. Included in the discussion are strategies for the creation of standards and…

  14. Classroom Talk for Rigorous Reading Comprehension Instruction

    Science.gov (United States)

    Wolf, Mikyung Kim; Crosson, Amy C.; Resnick, Lauren B.

    2004-01-01

    This study examined the quality of classroom talk and its relation to academic rigor in reading-comprehension lessons. Additionally, the study aimed to characterize effective questions to support rigorous reading comprehension lessons. The data for this study included 21 reading-comprehension lessons in several elementary and middle schools from…

  15. Benefits of standard format and content for approval of packaging for radioactive material

    International Nuclear Information System (INIS)

    Pstrak, D.; Osgood, N.

    2004-01-01

    The U.S. Nuclear Regulatory Commission (NRC) uses Regulatory Guide 7.9, ''Standard Format and Content of Part 71 Applications for Approval of Packaging for Radioactive Material'' to provide recommendations on the preparation of applications for approval of Type B and fissile material packages. The purpose of this Regulatory Guide is to assist the applicant in preparing an application that demonstrates the adequacy of a package in meeting the 10 CFR Part 71 packaging requirements. NRC recently revised Regulatory Guide 7.9 to reflect current changes to the regulations in Part 71 as a result of a recent rulemaking that included changes to the structural, containment, and criticality requirements for packages. Overall, the NRC issues Regulatory Guides to describe methods that are acceptable to the NRC staff for implementing specific parts of the NRC's regulations, to explain techniques used by the NRC staff in evaluating specific problems, and to provide guidance to applicants. It is important to note the specific purpose of this Regulatory Guide. As the name indicates, this Guide sets forth a standard format for application submission that is acceptable to the NRC staff that, when used by the applicant, will accomplish several objectives. First, use of the guide provides a consistent and repeatable approach that indicates the information to be provided by the applicant. Second, the organization of the information in the application will assist the reviewer(s) in locating information. Ultimately, accomplishing these objectives will help to ensure the completeness of the information in the application as well as decrease the review time. From an international perspective, use of a standard format approach could enhance the efficiency with which Competent Authorities certify and validate packages for use in the packaging and transportation of radioactive material worldwide. This streamlined approach of preparing package applications could ultimately lead to uniform

  16. Status of sennosides content in various Indian herbal formulations: Method standardization by HPTLC

    Directory of Open Access Journals (Sweden)

    Md.Wasim Aktar

    2008-12-01

    Several poly-herbal formulations containing senna (Cassia angustifolia) leaves are available in the Indian market for the treatment of constipation. The purgative effect of senna is due to the presence of two unique hydroxyanthracene glycosides, sennosides A and B. A HPTLC method for the quantitative analysis of sennosides A and B present in the formulation has been developed. Methanol extract of the formulations was analyzed on silica gel 60 GF254 HPTLC plates with spot visualization under UV and scanning at 350 nm in absorption/reflection mode. Calibration curves were found to be linear in the range 200-1000 ng. The correlation coefficients were found to be 0.991 for sennoside A and 0.997 for sennoside B. The average recovery rate was 95% for sennoside A and 97% for sennoside B, showing the reliability and reproducibility of the method. Limits of detection and quantification were determined as 0.05 and 0.25 μg/g, respectively. The validity of the method with respect to analysis was confirmed by comparing the UV spectra of the herbal formulations with that of the standard within the same Rf window. The analysis revealed a significant variation in sennosides content.
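
    The quantitative workflow summarized above (a linear calibration over 200-1000 ng, correlation coefficients, recovery rates, and LOD/LOQ) can be made concrete with a short calculation. The following Python sketch only illustrates such a linear-calibration evaluation; the peak-area values and spiked amounts are hypothetical and are not taken from the study.

        import numpy as np

        # Hypothetical HPTLC calibration: amounts applied (ng) vs. measured peak areas (a.u.)
        amounts = np.array([200.0, 400.0, 600.0, 800.0, 1000.0])
        areas = np.array([1050.0, 2080.0, 3160.0, 4190.0, 5230.0])

        # Least-squares fit: area = slope * amount + intercept
        slope, intercept = np.polyfit(amounts, areas, 1)
        r = np.corrcoef(amounts, areas)[0, 1]        # correlation coefficient of the curve

        # Quantify an unknown spot from its measured peak area
        unknown_area = 2600.0
        unknown_ng = (unknown_area - intercept) / slope

        # Recovery for a spiked sample: (found / added) * 100
        added_ng, found_ng = 500.0, 480.0
        recovery_pct = 100.0 * found_ng / added_ng

        print(f"slope = {slope:.3f}, intercept = {intercept:.1f}, r = {r:.4f}")
        print(f"unknown spot = {unknown_ng:.0f} ng, recovery = {recovery_pct:.1f}%")

    A correlation coefficient close to 1 (0.991 and 0.997 in the study) indicates that the response is effectively linear over the working range, which is what justifies quantifying unknowns by inverse prediction from the fitted line.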

  17. Status of sennosides content in various Indian herbal formulations: Method standardization by HPTLC

    Directory of Open Access Journals (Sweden)

    Md. Wasim Aktar

    2008-06-01

    Several poly-herbal formulations containing senna (Cassia angustifolia) leaves are available in the Indian market for the treatment of constipation. The purgative effect of senna is due to the presence of two unique hydroxyanthracene glycosides, sennosides A and B. A HPTLC method for the quantitative analysis of sennosides A and B present in the formulation has been developed. Methanol extract of the formulations was analyzed on silica gel 60 GF254 HPTLC plates with spot visualization under UV and scanning at 350 nm in absorption/reflection mode. Calibration curves were found to be linear in the range 200-1000 ng. The correlation coefficients were found to be 0.991 for sennoside A and 0.997 for sennoside B. The average recovery rate was 95% for sennoside A and 97% for sennoside B, showing the reliability and reproducibility of the method. Limits of detection and quantification were determined as 0.05 and 0.25 μg/g, respectively. The validity of the method with respect to analysis was confirmed by comparing the UV spectra of the herbal formulations with that of the standard within the same Rf window. The analysis revealed a significant variation in sennosides content.

  18. Superstrings, entropy and the elementary particles content of the standard model

    International Nuclear Information System (INIS)

    El Naschie, M.S.

    2006-01-01

    A number of interconnected issues involving superstring theory, entropy and the particle content of the standard model of high energy physics are discussed in the present work. It is found that within a non-transfinite approximation, the number of elementary particles is given by Dim SU(8), in full agreement with the prediction gained from dividing the total number of the massless level of Heterotic string theory, (256)(16) = 8064, by the spin representation 2^7 = 128, which gives Dim SU(8) = 8^2 - 1 = 8064/128 = 63 particles. For the exact transfinite case, however, one finds our previously established E-infinity result: N = (336 + 16k)(3/2 + k)(16 + k)/(128 + 8k) = ᾱ₀/2, where k = φ^3(1 - φ^3), φ = (√5 - 1)/2 and ᾱ₀/2 = 68.54101965. Setting k = 0, one finds that N = 63, exactly as in the non-transfinite case.

  19. Rigorous Science: a How-To Guide

    Directory of Open Access Journals (Sweden)

    Arturo Casadevall

    2016-11-01

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education.

  20. Human milk fortifier with high versus standard protein content for promoting growth of preterm infants: A meta-analysis.

    Science.gov (United States)

    Liu, Tian-Tian; Dang, Dan; Lv, Xiao-Ming; Wang, Teng-Fei; Du, Jin-Feng; Wu, Hui

    2015-06-01

    To compare the growth of preterm infants fed standard protein-fortified human milk with that containing human milk fortifier (HMF) with a higher-than-standard protein content. Published articles reporting randomized controlled trials and prospective observational intervention studies listed on the PubMed®, Embase®, CINAHL and Cochrane Library databases were searched using the keywords 'fortifier', 'human milk', 'breastfeeding', 'breast milk' and 'human milk fortifier'. The mean difference with 95% confidence intervals was used to compare the effect of HMF with a higher-than-standard protein content on infant growth characteristics. Five studies with 352 infants with birth weight ≤ 1750 g and a gestational age ≤ 34 weeks who were fed human milk were included in this meta-analysis. Infants in the experimental groups given human milk with higher-than-standard protein fortifier achieved significantly greater weight and length at the end of the study, and greater weight gain, length gain, and head circumference gain, compared with control groups fed human milk with the standard HMF. HMF with a higher-than-standard protein content can improve preterm infant growth compared with standard HMF. © The Author(s) 2015 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
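
    The abstract notes that mean differences with 95% confidence intervals were used to compare growth outcomes. As an illustration only, a fixed-effect inverse-variance pooling of study-level mean differences can be computed as below; the per-study numbers are invented, and the published meta-analysis may have used a different (e.g., random-effects) model.

        import math

        # Hypothetical per-study mean differences in weight gain (g/day) and their standard errors
        mean_diffs = [2.1, 1.4, 3.0, 0.8, 1.9]
        std_errs = [0.9, 1.1, 1.4, 0.7, 1.0]

        # Fixed-effect inverse-variance weights
        weights = [1.0 / se ** 2 for se in std_errs]
        pooled_md = sum(w * md for w, md in zip(weights, mean_diffs)) / sum(weights)
        pooled_se = math.sqrt(1.0 / sum(weights))

        # 95% confidence interval for the pooled mean difference
        ci_low = pooled_md - 1.96 * pooled_se
        ci_high = pooled_md + 1.96 * pooled_se
        print(f"pooled MD = {pooled_md:.2f} g/day (95% CI {ci_low:.2f} to {ci_high:.2f})")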

  1. Inhibition of colon carcinogenesis by a standardized Cannabis sativa extract with high content of cannabidiol.

    Science.gov (United States)

    Romano, Barbara; Borrelli, Francesca; Pagano, Ester; Cascio, Maria Grazia; Pertwee, Roger G; Izzo, Angelo A

    2014-04-15

    Colon cancer is a major public health problem. Cannabis-based medicines are useful adjunctive treatments in cancer patients. Here, we have investigated the effect of a standardized Cannabis sativa extract with high content of cannabidiol (CBD), here named CBD BDS, i.e. CBD botanical drug substance, on colorectal cancer cell proliferation and in experimental models of colon cancer in vivo. Proliferation was evaluated in colorectal carcinoma (DLD-1 and HCT116) as well as in healthy colonic cells using the MTT assay. CBD BDS binding was evaluated by its ability to displace [(3)H]CP55940 from human cannabinoid CB1 and CB2 receptors. In vivo, the effect of CBD BDS was examined on the preneoplastic lesions (aberrant crypt foci), polyps and tumours induced by the carcinogenic agent azoxymethane (AOM) as well as in a xenograft model of colon cancer in mice. CBD BDS and CBD reduced cell proliferation in tumoral, but not in healthy, cells. The effect of CBD BDS was counteracted by selective CB1 and CB2 receptor antagonists. The effect of pure CBD was sensitive to a CB1 receptor antagonist only. In binding assays, CBD BDS showed greater affinity than pure CBD for both CB1 and CB2 receptors, with pure CBD having very little affinity. In vivo, CBD BDS reduced AOM-induced preneoplastic lesions and polyps as well as tumour growth in the xenograft model of colon cancer. CBD BDS attenuates colon carcinogenesis and inhibits colorectal cancer cell proliferation via CB1 and CB2 receptor activation. The results may have some clinical relevance for the use of Cannabis-based medicines in cancer patients. Copyright © 2013 Elsevier GmbH. All rights reserved.

  2. Model for Physical Education Content Standards at Early Stages of Primary Education in the Republic of Macedonia

    OpenAIRE

    Klincarov, Ilija; Popeska, Biljana

    2011-01-01

    The aim of this article is to propose a model for designing national physical education content standards for the early stages of primary education in the Republic of Macedonia. The proposed model is based on findings about the motor structure of children at the early stage of primary education, obtained in research carried out in 5 primary schools in Skopje, the Republic of Macedonia, in relation to the Macedonian PE curriculum and the overarching standards for children at this age in California, USA, chosen...

  3. Contents

    Directory of Open Access Journals (Sweden)

    Editor IJRED

    2012-11-01

    International Journal of Renewable Energy Development (www.ijred.com), Volume 1, Number 3, October 2012, ISSN 2252-4940. Contents of articles:
    - Design and Economic Analysis of a Photovoltaic System: A Case Study (pp. 65-73) by C.O.C. Oko, E.O. Diemuodeke, N.F. Omunakwe, and E. Nnamdi
    - Development of Formaldehyde Adsorption using Modified Activated Carbon - A Review (pp. 75-80) by W.D.P. Rengga, M. Sudibandriyo and M. Nasikin
    - Process Optimization for Ethyl Ester Production in Fixed Bed Reactor Using Calcium Oxide Impregnated Palm Shell Activated Carbon (CaO/PSAC) (pp. 81-86) by A. Buasri, B. Ksapabutr, M. Panapoy and N. Chaiyut
    - Wind Resource Assessment in Abadan Airport in Iran (pp. 87-97) by Mojtaba Nedaei
    - The Energy Processing by Power Electronics and its Impact on Power Quality (pp. 99-105) by J.E. Rocha and B.W.D.C. Sanchez
    - First Aspect of Conventional Power System Assessment for High Wind Power Plants Penetration (pp. 107-113) by A. Merzic, M. Music, and M. Rascic
    - Experimental Study on the Production of Karanja Oil Methyl Ester and Its Effect on Diesel Engine (pp. 115-122) by N. Shrivastava, S.N. Varma and M. Pandey

  4. Experimental evaluation of rigor mortis. V. Effect of various temperatures on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T

    1981-01-01

    Objective measurements were carried out to study the evolution of rigor mortis on rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increase in temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.

  5. Cognitive Language and Content Standards: Language Inventory of the Common Core State Standards in Mathematics and the Next Generation Science Standards

    Science.gov (United States)

    Winn, Kathleen M.; Mi Choi, Kyong; Hand, Brian

    2016-01-01

    STEM education is a current focus of many educators and policymakers and the Next Generation Science Standards (NGSS) with the Common Core State Standards in Mathematics (CCSSM) are foundational documents driving curricular and instructional decision making for teachers and students in K-8 classrooms across the United States. Thus, practitioners…

  6. "Rigor mortis" in a live patient.

    Science.gov (United States)

    Chakravarthy, Murali

    2010-03-01

    Rigor mortis is conventionally a postmortem change. Its occurrence suggests that death occurred at least a few hours earlier. The authors report a case of "rigor mortis" in a live patient after cardiac surgery. The likely factors that may have predisposed to such premortem muscle stiffening in the reported patient are an intense low cardiac output state, the use of unusually high doses of inotropic and vasopressor agents, and probable sepsis. Such an event may be of importance when determining the time of death in individuals such as the one described in this report. It also suggests that patients with muscle stiffening require careful examination prior to declaration of death. This report is being published to point out the controversies that might arise from muscle stiffening, which should not always be assumed to be rigor mortis and/or a postmortem change.

  7. [Rigor mortis -- a definite sign of death?].

    Science.gov (United States)

    Heller, A R; Müller, M P; Frank, M D; Dressler, J

    2005-04-01

    In the past years there has been an ongoing controversial debate in Germany regarding the quality of the coroner's inquest and the declaration of death by physicians. We report the case of a 90-year-old female who was found after an unknown time following a suicide attempt with benzodiazepines. Examination of the patient showed livores (mortis?) on the left forearm and left lower leg. Moreover, rigor (mortis?) of the left arm was apparent, which prevented arm flexion and extension. The hypothermic patient with insufficient respiration was intubated and mechanically ventilated. Chest compressions were not performed because central pulses were (barely) palpable and a sinus bradycardia of 45/min (second-degree AV block and isolated premature ventricular complexes) was present. After placement of an intravenous line (17 G, external jugular vein), the hemodynamic situation was stabilized with intermittent boluses of epinephrine and with sodium bicarbonate. With improved circulation, livores and rigor disappeared. In the present case a minimal central circulation was noted, which could be stabilized despite the presence of apparently certain signs of death (livores and rigor mortis). Considering the finding of abolished peripheral perfusion (livores), we postulate a centripetal collapse of glycogen and ATP supply in the patient's left arm (rigor), which was restored after resuscitation and reperfusion. Thus, it appears that livores and rigor are not sensitive enough to exclude a vita minima, in particular in hypothermic patients with intoxications. Consequently, a careful ABC check should be performed even in the presence of apparently certain signs of death, to avoid underdiagnosing a vita minima. Additional ECG monitoring is required to reduce the rate of false-positive declarations of death. To what extent paramedics should continue basic life support when rigor and livores are present, pending a physician's DNR order, deserves further discussion.

  8. An ultramicroscopic study on rigor mortis.

    Science.gov (United States)

    Suzuki, T

    1976-01-01

    Gastrocnemius muscles taken from decapitated mice at various intervals after death and from mice killed by 2,4-dinitrophenol or mono-iodoacetic acid injection to induce rigor mortis soon after death, were observed by electron microscopy. The prominent appearance of many fine cross striations in the myofibrils (occurring about every 400 Å) was considered to be characteristic of rigor mortis. These striations were caused by minute granules studded along the surfaces of both thick and thin filaments and appeared to be the bridges connecting the 2 kinds of filaments and accounted for the hardness and rigidity of the muscle.

  9. Einstein's Theory A Rigorous Introduction for the Mathematically Untrained

    CERN Document Server

    Grøn, Øyvind

    2011-01-01

    This book provides an introduction to the theory of relativity and the mathematics used in its processes. Three elements of the book make it stand apart from previously published books on the theory of relativity. First, the book starts at a lower mathematical level than standard books and develops tensor calculus to sufficient maturity to make it possible to give detailed calculations of relativistic predictions of practical experiments. Self-contained introductions are given to, for example, vector calculus, differential calculus and integration. Second, in-between calculations have been included, making it possible for the non-technical reader to follow step-by-step calculations. Third, the conceptual development is gradual and rigorous in order to provide the inexperienced reader with a philosophically satisfying understanding of the theory. Einstein's Theory: A Rigorous Introduction for the Mathematically Untrained aims to provide the reader with a sound conceptual understanding of both the special and genera...

  10. Discourse Surrounding the International Education Standards for Professional Accountants (IES): A Content Analysis Approach

    Science.gov (United States)

    Sugahara, Satoshi; Wilson, Rachel

    2013-01-01

    The development and implementation of the International Education Standards (IES) for professional accountants is currently an important issue in accounting education and for educators interested in a shift toward international education standards more broadly. The purpose of this study is to investigate professional and research discourse…

  11. Reducing Content Variance and Improving Student Learning Outcomes: The Value of Standardization in a Multisection Course

    Science.gov (United States)

    Meuter, Matthew L.; Chapman, Kenneth J.; Toy, Daniel; Wright, Lauren K.; McGowan, William

    2009-01-01

    This article describes a standardization process for an introductory marketing course with multiple sections. The authors first outline the process used to develop a standardized set of marketing concepts to be used in all introductory marketing classes. They then discuss the benefits to both students and faculty that occur as a result of…

  12. Data Analysis and Statistics in Middle Grades: An Analysis of Content Standards

    Science.gov (United States)

    Sorto, M. Alejandra

    2011-01-01

    The purpose of the study reported herein was to identify the important aspects of statistical knowledge that students in the middle school grades in the United States are expected to learn, as well as what teachers are expected to teach. A systematic study of 49 state standards and one set of national standards was used to identify these important…

  13. The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System

    Science.gov (United States)

    Mixon, Jason; Stuart, Jerry

    2009-01-01

    In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominant vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…

  14. Trends: Rigor Mortis in the Arts.

    Science.gov (United States)

    Blodget, Alden S.

    1991-01-01

    Outlines how past art education provided a refuge for students from the rigors of other academic subjects. Observes that in recent years art education has become "discipline based." Argues that art educators need to reaffirm their commitment to a humanistic way of knowing. (KM)

  15. Photoconductivity of amorphous silicon-rigorous modelling

    International Nuclear Information System (INIS)

    Brada, P.; Schauer, F.

    1991-01-01

    It is our great pleasure to express our gratitude to Prof. Grigorovici, the pioneer of the exciting field of the amorphous state, with this modest contribution to the area. In this paper we present an outline of a rigorous modelling program for the steady-state photoconductivity in amorphous silicon and related materials. (Author)

  16. Moisture content of cereals at harvesting time by comparing microclimate values and standard weather data.

    NARCIS (Netherlands)

    Atzema, A.J.

    1993-01-01

    The moisture content of wheat and barley together with the weather elements were measured at 3 different experimental sites in the Netherlands in 1990-91. The difference in the dew point temperature in the screen[house] and in the field was small. However, the differences between air temperature in

  17. 12 CFR 621.13 - Content and standards-general rules.

    Science.gov (United States)

    2010-01-01

    ... Section 621.13 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM ACCOUNTING AND REPORTING REQUIREMENTS Report of Condition and Performance § 621.13 Content and standards—general rules. Each institution, including the Federal Agricultural Mortgage Corporation, shall prepare reports of condition and performance...

  18. Applying transpose matrix on advanced encryption standard (AES) for database content

    Science.gov (United States)

    Manurung, E. B. P.; Sitompul, O. S.; Suherman

    2018-03-01

    The Advanced Encryption Standard (AES) is a specification for the encryption of electronic data established by the U.S. National Institute of Standards and Technology (NIST); it has been adopted by the U.S. government and is now used worldwide. This paper reports the impact of integrating a transpose matrix into AES. The transpose matrix is applied as a first stage of ciphertext modification for text-based database security, so that confidentiality improves. The transposition also increases the avalanche effect of the cryptographic algorithm by 4% on average.
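
    The abstract does not specify exactly how the transposition is integrated into the AES pipeline, so the sketch below only illustrates the basic operation it names: transposing the 4x4 byte state of a 16-byte block before handing it to a standard AES implementation. The function and variable names are illustrative, not taken from the paper.

        def transpose_block(block: bytes) -> bytes:
            """Transpose a 16-byte block viewed as a 4x4 byte matrix (row-major)."""
            assert len(block) == 16
            matrix = [list(block[r * 4:(r + 1) * 4]) for r in range(4)]
            transposed = [[matrix[c][r] for c in range(4)] for r in range(4)]
            return bytes(b for row in transposed for b in row)

        # Example: pre-process a padded 16-byte database field before AES encryption.
        # A standard AES library call (128-bit block cipher) would follow; it is omitted here.
        plaintext = b"database_field_1"          # exactly 16 bytes
        pre_processed = transpose_block(plaintext)
        print(pre_processed.hex())

        # Transposing twice restores the original block, so decryption can undo the step.
        assert transpose_block(pre_processed) == plaintext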

  19. Standard format and content for radiological contingency plans for fuel cycle and materials facilities. Regulatory report

    International Nuclear Information System (INIS)

    1981-07-01

    This report is issued as guidance to those fuel cycle and major materials licensees who are required by the NRC to prepare and submit a radiological contingency plan. This Standard Format has been prepared to help assure uniformity and completeness in the preparation of those plans

  20. The fermion content of the Standard Model from a simple world-line theory

    Energy Technology Data Exchange (ETDEWEB)

    Mansfield, Paul, E-mail: P.R.W.Mansfield@durham.ac.uk

    2015-04-09

    We describe a simple model that automatically generates the sum over gauge group representations and chiralities of a single generation of fermions in the Standard Model, augmented by a sterile neutrino. The model is a modification of the world-line approach to chiral fermions.

  1. Standard format and content of license applications for plutonium processing and fuel fabrication plants

    International Nuclear Information System (INIS)

    1976-01-01

    The standard format suggested for use in applications for licenses to possess and use special nuclear materials in Pu processing and fuel fabrication plants is presented. It covers general description of the plant, summary safety assessment, site characteristics, principal design criteria, plant design, process systems, waste confinement and management, radiation protection, accident safety analysis, conduct of operations, operating controls and limits, and quality assurance

  2. Common Core Standards, Professional Texts, and Diverse Learners: A Qualitative Content Analysis

    Science.gov (United States)

    Yanoff, Elizabeth; LaDuke, Aja; Lindner, Mary

    2014-01-01

    This research study questioned the degree to which six professional texts guiding implementation of the Common Core Standards in reading address the needs of diverse learners. For the purposes of this research, diverse learners were specifically defined as above grade level readers, below grade level readers, and English learners. The researchers…

  3. Art as Resistance: Creating and Collecting Content for a Public Lesson on Standardization

    Science.gov (United States)

    Smith, Becky L. Noël; Shaw, Michael L.

    2014-01-01

    The negative emotional effects of standardized teaching and learning abound in public schools and work to create a melancholic, shared reality for teachers and students. The authors argue that teachers and students must acknowledge this melancholy and pursue shared inquiry around those emotions in order to help bring about understanding and the…

  4. 78 FR 23289 - Public Review of Draft National Shoreline Data Content Standard

    Science.gov (United States)

    2013-04-18

    ..., efficient and applicable to a broad base of government and private sector uses. Current practices have led... organizations from State, local and tribal governments, the academic community, and the private sector. The... definition of data models, schemas, entities, relationships, definitions, and crosswalks to related standards...

  5. Standard format and content of financial assurance mechanisms required for decommissioning under 10 CFR parts 30, 40, 70, and 72

    International Nuclear Information System (INIS)

    1990-06-01

    The purpose of this regulatory guide, ''Standard Format and Content of Financial Assurance Mechanisms Required for Decommissioning Under 10 CFR Parts 30, 40, 70, and 72,'' is to provide guidance acceptable to the NRC staff on the information to be provided for establishing financial assurance for decommissioning and to establish a standard format for presenting the information. Use of the standard format will help ensure that the financial instruments contain the information required by 10 CFR Parts 30, 40, 70, and 72; aid the applicant and NRC staff in ensuring that the information is complete; and help persons reading the financial instruments to locate information. This guide addresses financial assurance for decommissioning of facilities under materials licenses granted under Parts 30, 40, 70, and 72. These parts include licensees in the following categories: Part 30, Byproduct Material; Part 40, Source Material; Part 70, Special Nuclear Material; and Part 72, Independent Spent Fuel Storage Installations

  6. Accelerating Biomedical Discoveries through Rigor and Transparency.

    Science.gov (United States)

    Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D

    2017-07-01

    Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  7. National Sexuality Education Standards: Core Content and Skills, K-12. A Special Publication of the Journal of School Health. Special Report

    Science.gov (United States)

    American School Health Association (NJ1), 2012

    2012-01-01

    The goal of this paper, "National Sexuality Education Standards: Core Content and Skills, K-12," is to provide clear, consistent and straightforward guidance on the "essential minimum, core content" for sexuality education that is developmentally and age-appropriate for students in grades K-12. The development of these standards is a result of an…

  8. Content accessibility of Web documents: Overview of concepts and needed standards

    DEFF Research Database (Denmark)

    Alapetite, A.

    2006-01-01

    The concept of Web accessibility refers to a combined set of measures, namely, how easily and how efficiently different types of users may make use of a given service. While some recommendations for accessibility are focusing on people with various specific disabilities, this document seeks to broaden the scope to any type of user and any type of use case. The document provides an introduction to some required concepts and technical standards for designing accessible Web sites. A brief review of the legal requirements in a few countries for Web accessibility complements the recommendations...

  9. Standard format and content for emergency plans for fuel cycle and materials facilities

    International Nuclear Information System (INIS)

    1990-09-01

    This regulatory guide is being developed to provide guidance acceptable to the NRC staff on the information to be included in emergency plans and to establish a format for presenting the information. Use of a standard format will help ensure uniformity and completeness in the preparation of emergency plans. An acceptable emergency plan should describe the licensed activities conducted at the facility and the types of accidents that might occur. It should provide information on classifying postulated accidents and the licensee's procedures for notifying and coordinating with offsite authorities. The plan should provide information on emergency response measures that might be necessary, the equipment and facilities available to respond to an emergency, and how the licensee will maintain emergency preparedness capability. It should describe the records and reports that will be maintained. There should also be a section on recovery after an accident and plans for restoring the facility to a safe condition. 4 refs

  10. Standards for the contents of heavy metals in soils of some states

    Directory of Open Access Journals (Sweden)

    Yu.N. Vodyanitskii

    2016-09-01

    In line with present-day ecological and toxicological data obtained by Dutch ecologists, heavy metals/metalloids form the following succession according to their hazard degree in soils: Se > Tl > Sb > Cd > V > Hg > Ni > Cu > Cr > As > Ba. This sequence substantially differs from the succession of heavy elements presented in the general toxicological Russian GOST (State Norms and Standards), which considers As, Cd, Hg, Se, Pb, and Zn to be strongly hazardous elements and Co, Ni, Mo, Sb, and Cr to be moderately hazardous. As compared to the Dutch general toxicological approach, the hazard of lead, zinc, and cobalt is lower in soils, and that of vanadium, antimony, and barium is higher in Russia. MPCs must be adopted for the strongly hazardous thallium, selenium, and vanadium in Russia.

  11. ENDF/B-5 Standards Data Library (including modifications made in 1986). Summary of contents and documentation

    International Nuclear Information System (INIS)

    DayDay, N.; Lemmel, H.D.

    1986-01-01

    This document summarizes the contents and documentation of the ENDF/B-5 Standards Data Library (EN5-ST) released in September 1979. The library contains complete evaluations for all significant neutron reactions in the energy range 10^-5 eV to 20 MeV for H-1, He-3, Li-6, B-10, C-12, Au-197 and U-235 isotopes. In 1986 the files for C-12, Au-197 and U-235 were slightly modified. The entire library or selective retrievals from it can be obtained free of charge from the IAEA Nuclear Data Section. (author)

  12. ENDF/B-5 Standards Data Library (including modifications made in 1986). Summary of contents and documentation

    Energy Technology Data Exchange (ETDEWEB)

    DayDay, N; Lemmel, H D

    1986-05-01

    This document summarizes the contents and documentation of the ENDF/B-5 Standards Data Library (EN5-ST) released in September 1979. The library contains complete evaluations for all significant neutron reactions in the energy range 10^-5 eV to 20 MeV for H-1, He-3, Li-6, B-10, C-12, Au-197 and U-235 isotopes. In 1986 the files for C-12, Au-197 and U-235 were slightly modified. The entire library or selective retrievals from it can be obtained free of charge from the IAEA Nuclear Data Section. (author) Refs, figs, tabs

  13. Rigorous quantum limits on monitoring free masses and harmonic oscillators

    Science.gov (United States)

    Roy, S. M.

    2018-03-01

    There are heuristic arguments proposing that the accuracy of monitoring the position of a free mass m is limited by the standard quantum limit (SQL): σ²(X(t)) ≥ σ²(X(0)) + (t²/m²)σ²(P(0)) ≥ ℏt/m, where σ²(X(t)) and σ²(P(t)) denote variances of the Heisenberg representation position and momentum operators. Yuen [Phys. Rev. Lett. 51, 719 (1983), 10.1103/PhysRevLett.51.719] discovered that there are contractive states for which this result is incorrect. Here I prove universally valid rigorous quantum limits (RQL), viz. rigorous upper and lower bounds on σ²(X(t)) in terms of σ²(X(0)) and σ²(P(0)), given by Eq. (12) for a free mass and by Eq. (36) for an oscillator. I also obtain the maximally contractive and maximally expanding states which saturate the RQL, and use the contractive states to set up an Ozawa-type measurement theory with accuracies respecting the RQL but beating the standard quantum limit. The contractive states for oscillators improve on the Schrödinger coherent states of constant variance and may be useful for gravitational wave detection and optical communication.
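
    For readers unfamiliar with the heuristic behind the SQL quoted above, the chain of inequalities follows from elementary quantum mechanics once initial position-momentum correlations are neglected; this is the textbook argument, not the paper's rigorous bounds:

        X(t) = X(0) + \frac{t}{m}\,P(0), \qquad
        \sigma^{2}\bigl(X(t)\bigr) \simeq \sigma^{2}\bigl(X(0)\bigr) + \frac{t^{2}}{m^{2}}\,\sigma^{2}\bigl(P(0)\bigr)
            \;\ge\; 2\,\frac{t}{m}\,\sigma\bigl(X(0)\bigr)\,\sigma\bigl(P(0)\bigr)
            \;\ge\; \frac{\hbar t}{m},

    where the first inequality is the arithmetic-geometric mean inequality and the second uses the uncertainty relation \sigma(X(0))\,\sigma(P(0)) \ge \hbar/2. Yuen's contractive states evade this argument because the neglected cross-correlation term can be negative, which is precisely what the rigorous upper and lower bounds of the paper take into account.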

  14. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: this edition contains new material relevant

  15. Development of rigor mortis is not affected by muscle volume.

    Science.gov (United States)

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  16. The Authoritarian Personality in Emerging Adulthood: Longitudinal Analysis Using Standardized Scales, Observer Ratings, and Content Coding of the Life Story.

    Science.gov (United States)

    Peterson, Bill E; Pratt, Michael W; Olsen, Janelle R; Alisat, Susan

    2016-04-01

    Three different methods (a standardized scale, an observer-based Q-sort, and content coding of narratives) were used to study the continuity of authoritarianism longitudinally in emerging and young adults. Authoritarianism was assessed in a Canadian sample (N = 92) of men and women at ages 19 and 32 with Altemeyer's (1996) Right-Wing Authoritarianism (RWA) Scale. In addition, components of the authoritarian personality were assessed at age 26 through Q-sort observer methods (Block, 2008) and at age 32 through content coding of life stories. Age 19 authoritarianism predicted the Q-sort and life story measures of authoritarianism. Two hierarchical regression analyses showed that the Q-sort and life story measures of authoritarianism also predicted the RWA scale at age 32 beyond educational level and parental status, and even after the inclusion of age 19 RWA. Differences and similarities in the pattern of correlates for the Q-sort and life story measures are discussed, including the overall lack of results for authoritarian aggression. Content in narratives may be the result of emerging adult authoritarianism and may serve to maintain levels of authoritarianism in young adulthood. © 2014 Wiley Periodicals, Inc.

  17. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    International Nuclear Information System (INIS)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-01-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the
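
    To make the XML-based step description concrete, here is a minimal Python sketch using the standard-library ElementTree module. The element and attribute names (procedure, step, type, prompt, unit) are invented for illustration; the abstract does not reproduce the actual schema developed at the Idaho National Laboratory.

        import xml.etree.ElementTree as ET

        # Hypothetical step definitions; the "type" attribute determines the functionality
        # the CBPS generates for each step (plain instruction, yes/no decision, data entry).
        PROCEDURE_XML = """
        <procedure id="OP-123" rev="4">
          <step id="1" type="action">Verify pump P-101 is in standby.</step>
          <step id="2" type="decision" prompt="Is suction pressure above 50 psig?"/>
          <step id="3" type="input" unit="psig">Record discharge pressure.</step>
        </procedure>
        """

        root = ET.fromstring(PROCEDURE_XML)
        for step in root.findall("step"):
            kind = step.get("type")
            if kind == "action":
                print(f"Step {step.get('id')}: {step.text.strip()} [acknowledge]")
            elif kind == "decision":
                print(f"Step {step.get('id')}: {step.get('prompt')} [yes/no]")
            elif kind == "input":
                print(f"Step {step.get('id')}: {step.text.strip()} [enter value in {step.get('unit')}]")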

  18. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-02-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the

  19. Rapid whole brain myelin water content mapping without an external water standard at 1.5T.

    Science.gov (United States)

    Nguyen, Thanh D; Spincemaille, Pascal; Gauthier, Susan A; Wang, Yi

    2017-06-01

    The objective of this study is to develop rapid whole brain mapping of myelin water content (MWC) at 1.5T. The Fast Acquisition with Spiral Trajectory and T2prep (FAST-T2) pulse sequence originally developed for myelin water fraction (MWF) mapping was modified to obtain fast mapping of T1 and receiver coil sensitivity needed for MWC computation. The accuracy of the proposed T1 mapping was evaluated by comparing with the standard IR-FSE method. Numerical simulations were performed to assess the accuracy and reliability of the proposed MWC mapping. We also compared MWC values obtained with either cerebrospinal fluid (CSF) or an external water tube attached to the subject's head as the water reference. Our results from healthy volunteers show that whole brain MWC mapping is feasible in 7min and provides accurate brain T1 values. Regional brain WC and MWC measurements obtained with the internal CSF-based water standard showed excellent correlation (R>0.99) and negligible bias within narrow limits of agreement compared to those obtained with an external water standard. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Standard format and content of financial assurance mechanisms required for decommissioning under 10 CFR parts 30, 40, 70, and 72

    International Nuclear Information System (INIS)

    1990-01-01

    The Nuclear Regulatory Commission (NRC) has established technical and financial regulations for decommissioning licensed nuclear facilities (53 FR 24018, June 27, 1988). The regulations address decommissioning planning needs, timing, funding methods, and environmental review requirements for public and private facilities holding licenses under 10 CFR Parts 30, 40, 50, 70, and 72, with the exception of uranium mills. The intent of the regulations is to ensure that the decommissioning of all licensed facilities will be accomplished in a safe and timely manner and that licensees will provide adequate funds to cover all costs associated with decommissioning. The purpose of this regulatory guide, ''Standard Format and Content of Financial Assurance Mechanisms Required for Decommissioning Under 10 CFR Parts 30, 40, 70, and 72,'' is to provide guidance acceptable to the NRC staff on the information to be provided for establishing financial assurance for decommissioning and to establish a standard format for presenting the information. Use of the standard format will (1) help ensure that the financial instruments contain the information required by 10 CFR Parts 30, 40, 70, and 72, (2) aid the applicant and NRC staff in ensuring that the information is complete, and (3) help persons reading the financial instruments to locate information. 5 refs., 13 figs

  1. Statistics for mathematicians a rigorous first course

    CERN Document Server

    Panaretos, Victor M

    2016-01-01

    This textbook provides a coherent introduction to the main concepts and methods of one-parameter statistical inference. Intended for students of Mathematics taking their first course in Statistics, the focus is on Statistics for Mathematicians rather than on Mathematical Statistics. The goal is not to focus on the mathematical/theoretical aspects of the subject, but rather to provide an introduction to the subject tailored to the mindset and tastes of Mathematics students, who are sometimes turned off by the informal nature of Statistics courses. This book can be used as the basis for an elementary semester-long first course on Statistics with a firm sense of direction that does not sacrifice rigor. The deeper goal of the text is to attract the attention of promising Mathematics students.

  2. Standard format and content guide for financial assurance mechanisms required for decommissioning under 10 CFR parts 30, 40, 70, and 72

    International Nuclear Information System (INIS)

    1989-08-01

    The Standard Format and Content Guide for Financial Assurance Mechanisms Required for Decommissioning under 10 CFR Parts 30, 40, 70, and 72, discusses the information to be provided in a license application and establishes a uniform format for presenting the information required to meet the decommissioning licensing requirements. The use of the Standard Format and Content Guide will (1) help ensure that the license application contains the information required by the regulations, (2) aid the applicant in ensuring that the information is complete, (3) help persons reading the Standard Format and Content Guide to locate information, and (4) contribute to shortening the time required for the review process. The Standard Format and Content Guide ensures that the information required to perform the review is provided, and in a usable format

  3. Standard format and content of a licensee physical protection plan for strategic special nuclear material in transit - April 1980

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    A predetermined plan to respond to safeguards contingency events is required to be prepared, based on personnel and other physical protection resources described in the Physical Protection Plan for strategic special nuclear material (SSNM) in transit. Specific requirements for the contingency plan are provided in Appendix C, 'Licensee Safeguards Contingency Plans,' to 10 CFR Part 73. Regulatory Guide 5.56, Standard Format and Content of Safeguards Contingency Plans for Transportation, provides guidance for the preparation of transportation contingency plans. The licensee is reminded that all three submissions - the Physical Protection Plan, the Physical Protection Arrangements for Specific Shipments, and the Safeguards Contingency Plan - together describe the system for physical protection of each particular shipment. They should be developed and maintained to be completely consistent with each other for each shipment

  4. PRO development: rigorous qualitative research as the crucial foundation.

    Science.gov (United States)

    Lasch, Kathryn Eilene; Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen

    2010-10-01

    Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give little for the reader to evaluate the content validity of the measures and the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity.

  5. Content and Alignment of State Writing Standards and Assessments as Predictors of Student Writing Achievement: An Analysis of 2007 National Assessment of Educational Progress Data

    Science.gov (United States)

    Troia, Gary A.; Olinghouse, Natalie G.; Zhang, Mingcai; Wilson, Joshua; Stewart, Kelly A.; Mo, Ya; Hawkins, Lisa

    2018-01-01

    We examined the degree to which content of states' writing standards and assessments (using measures of content range, frequency, balance, and cognitive complexity) and their alignment were related to student writing achievement on the 2007 National Assessment of Educational Progress (NAEP), while controlling for student, school, and state…

  6. A Content Analysis of Immigration in Traditional, New, and Non-Gateway State Standards for U.S. History and Civics

    Science.gov (United States)

    Hilburn, Jeremy; Journell, Wayne; Buchanan, Lisa Brown

    2016-01-01

    In this content analysis of state U.S. History and Civics standards, we compared the treatment of immigration across three types of states with differing immigration demographics. Analyzing standards from 18 states from a critical race methodology perspective, our findings indicated three sets of tensions: a unified American story versus local…

  7. Rigorous theory of molecular orientational nonlinear optics

    International Nuclear Information System (INIS)

    Kwak, Chong Hoon; Kim, Gun Yeup

    2015-01-01

    Classical statistical mechanics of the molecular optics theory proposed by Buckingham [A. D. Buckingham and J. A. Pople, Proc. Phys. Soc. A 68, 905 (1955)] has been extended to describe the field induced molecular orientational polarization effects on nonlinear optics. In this paper, we present the generalized molecular orientational nonlinear optical processes (MONLO) through the calculation of the classical orientational averaging using the Boltzmann type time-averaged orientational interaction energy in the randomly oriented molecular system under the influence of applied electric fields. The focal points of the calculation are (1) the derivation of rigorous tensorial components of the effective molecular hyperpolarizabilities, (2) the molecular orientational polarizations and the electronic polarizations including the well-known third-order dc polarization, dc electric field induced Kerr effect (dc Kerr effect), optical Kerr effect (OKE), dc electric field induced second harmonic generation (EFISH), degenerate four wave mixing (DFWM) and third harmonic generation (THG). We also present some of the new predictive MONLO processes. For second-order MONLO, second-order optical rectification (SOR), Pockels effect and difference frequency generation (DFG) are described in terms of the anisotropic coefficients of first hyperpolarizability. And, for third-order MONLO, third-order optical rectification (TOR), dc electric field induced difference frequency generation (EFIDFG) and pump-probe transmission are presented
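
    For orientation, the display below states the generic classical Boltzmann orientational average that underlies this kind of calculation; it is a textbook form assumed here for illustration, not a result quoted from the paper.

```latex
% Generic classical orientational average: an orientation-dependent quantity
% X(\Omega) is weighted by the Boltzmann factor of the time-averaged interaction
% energy U(\Omega) of the molecule in the applied fields and integrated over
% solid angle \Omega.
\langle X \rangle \;=\;
  \frac{\displaystyle\int X(\Omega)\, e^{-U(\Omega)/k_{B}T}\, \mathrm{d}\Omega}
       {\displaystyle\int e^{-U(\Omega)/k_{B}T}\, \mathrm{d}\Omega}
```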

  8. Rigorous modelling of light's intensity angular-profile in Abbe refractometers with absorbing homogeneous fluids

    International Nuclear Information System (INIS)

    García-Valenzuela, A; Contreras-Tello, H; Márquez-Islas, R; Sánchez-Pérez, C

    2013-01-01

    We derive an optical model for the light intensity distribution around the critical angle in a standard Abbe refractometer when used on absorbing homogenous fluids. The model is developed using rigorous electromagnetic optics. The obtained formula is very simple and can be used suitably in the analysis and design of optical sensors relying on Abbe type refractometry.

  9. Community historians and the dilemma of rigor vs relevance : A comment on Danziger and van Rappard

    NARCIS (Netherlands)

    Dehue, Trudy

    1998-01-01

    Since the transition from finalism to contextualism, the history of science seems to be caught up in a basic dilemma. Many historians fear that with the new contextualist standards of rigorous historiography, historical research can no longer be relevant to working scientists themselves. The present

  10. Analysis of chemistry textbook content and national science education standards in terms of air quality-related learning goals

    Science.gov (United States)

    Naughton, Wendy

    In this study's Phase One, representatives of nine municipal agencies involved in air quality education were interviewed and interview transcripts were analyzed for themes related to what citizens need to know or be able to do regarding air quality concerns. Based on these themes, eight air quality Learning Goal Sets were generated and validated via peer and member checks. In Phase Two, six college-level, liberal-arts chemistry textbooks and the National Science Education Standards (NSES) were analyzed for congruence with Phase One learning goals. Major categories of desired citizen understandings highlighted in agency interviews concerned air pollution sources, impact, detection, and transport. Identified cognitive skills focused on information-gathering and -evaluating skills, enabling informed decision-making. A content match was found between textbooks and air quality learning goals, but most textbooks fail to address learning goals that remediate citizen misconceptions and inabilities---particularly those with a "personal experience" focus. A partial match between NSES and air quality learning goals was attributed to differing foci: Researcher-derived learning goals deal specifically with air quality, while NSES focus is on "fundamental science concepts," not "many science topics." Analysis of findings within a situated cognition framework suggests implications for instruction and NSES revision.

  11. Standard Test Method for Oxygen Content Using a 14-MeV Neutron Activation and Direct-Counting Technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This test method covers the measurement of oxygen concentration in almost any matrix by using a 14-MeV neutron activation and direct-counting technique. Essentially, the same system may be used to determine oxygen concentrations ranging from over 50 % to about 10 μg/g, or less, depending on the sample size and available 14-MeV neutron fluence rates. Note 1 - The range of analysis may be extended by using higher neutron fluence rates, larger samples, and higher counting efficiency detectors. 1.2 This test method may be used on either solid or liquid samples, provided that they can be made to conform in size, shape, and macroscopic density during irradiation and counting to a standard sample of known oxygen content. Several variants of this method have been described in the technical literature. A monograph is available which provides a comprehensive description of the principles of activation analysis using a neutron generator (1). 1.3 The values stated in either SI or inch-pound units are to be regarded...

  12. Specialized food composition dataset for vitamin D content in foods based on European standards: Application to dietary intake assessment.

    Science.gov (United States)

    Milešević, Jelena; Samaniego, Lourdes; Kiely, Mairead; Glibetić, Maria; Roe, Mark; Finglas, Paul

    2018-02-01

    A review of national nutrition surveys from 2000 to date demonstrated a high prevalence of vitamin D intakes below the EFSA Adequate Intake (AI) for vitamin D in adults across Europe. Dietary assessment and modelling are required to monitor the efficacy and safety of ongoing strategic vitamin D fortification. To support these studies, a specialized vitamin D food composition dataset, based on EuroFIR standards, was compiled. The FoodEXplorer™ tool was used to retrieve well-documented analytical data for vitamin D and arrange the data into two datasets - European (8 European countries, 981 data values) and US (1836 data values). Data were classified using the LanguaL™, FoodEX2 and ODIN classification systems and ranked according to quality criteria. Significant differences in the content, quality of data values, missing data on vitamin D2 and 25(OH)D3, and documentation of analytical methods were observed. The dataset is available through the EuroFIR platform. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Rigor or mortis: best practices for preclinical research in neuroscience.

    Science.gov (United States)

    Steward, Oswald; Balice-Gordon, Rita

    2014-11-05

    Numerous recent reports document a lack of reproducibility of preclinical studies, raising concerns about potential lack of rigor. Examples of lack of rigor have been extensively documented and proposals for practices to improve rigor are appearing. Here, we discuss some of the details and implications of previously proposed best practices and consider some new ones, focusing on preclinical studies relevant to human neurological and psychiatric disorders. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. [Experimental study of restiffening of the rigor mortis].

    Science.gov (United States)

    Wang, X; Li, M; Liao, Z G; Yi, X F; Peng, X M

    2001-11-01

    To observe changes in sarcomere length in the rat during restiffening of rigor mortis. We measured the sarcomere length of the quadriceps in 40 rats under different conditions by scanning electron microscopy. The sarcomere length in undisturbed rigor mortis is clearly shorter than that after restiffening. Sarcomere length is negatively correlated with the intensity of rigor mortis. Measuring sarcomere length can determine the intensity of rigor mortis and provide evidence for estimating the time since death.

  15. Mathematical framework for fast and rigorous track fit for the ZEUS detector

    Energy Technology Data Exchange (ETDEWEB)

    Spiridonov, Alexander

    2008-12-15

    In this note we present a mathematical framework for a rigorous approach to a common track fit for trackers located in the inner region of the ZEUS detector. The approach makes use of the Kalman filter and offers a rigorous treatment of magnetic field inhomogeneity, multiple scattering and energy loss. We describe mathematical details of the implementation of the Kalman filter technique with a reduced amount of computations for a cylindrical drift chamber, barrel and forward silicon strip detectors and a forward straw drift chamber. Options with homogeneous and inhomogeneous field are discussed. The fitting of tracks in one ZEUS event takes about 20 ms on a standard PC. (orig.)
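
    As a point of reference only, the following is a minimal, generic linear Kalman filter predict/update step in Python. It is not the ZEUS implementation: the actual fit propagates a helix-like track state through an inhomogeneous magnetic field and folds multiple scattering and energy loss into the process noise, none of which is modelled here.

```python
# Minimal linear Kalman filter predict/update sketch (numpy), illustrating the
# general technique only; detector-specific modelling is deliberately omitted.
import numpy as np

def kalman_step(x, P, F, Q, H, R, m):
    """One filter step: propagate state (x, P) with model F, Q, then update with measurement m."""
    # Prediction
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update (gain, residual, new state and covariance)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (m - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy example: straight-line track (position, slope) measured at successive planes.
x = np.array([0.0, 0.0]); P = np.eye(2) * 1e3
F = np.array([[1.0, 1.0], [0.0, 1.0]])          # propagate by one plane spacing
Q = np.diag([1e-4, 1e-4])                       # crude multiple-scattering-like process noise
H = np.array([[1.0, 0.0]]); R = np.array([[0.01]])
for m in [0.12, 0.25, 0.37, 0.52]:              # simulated hit positions
    x, P = kalman_step(x, P, F, Q, H, R, np.array([m]))
print(x, np.sqrt(np.diag(P)))                   # fitted state and its uncertainties
```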

  16. Establishment of an authenticated physical standard for gamma spectrometric determination of the U-235 content of MTR fuel and evaluation of measurement procedures

    International Nuclear Information System (INIS)

    Fleck, C.M.

    1979-12-01

    Measurements of U-235 content in a standard MTR fuel element were carried out, using scintillation and semi-conductor spectrometers. Three different types of measurement were carried out: a) Comparison of different primary standards among one another and with single fuel plates. b) Calibration of the MTR fuel element as an authenticated physical standard. c) Evaluation of over all errors in assay measurements on MTR fuel elements. The error of the whole assay measurement will be approximately 0.9%. The Uranium distribution in the single fuel plates is the original source of error. In the case of equal Uranium contents in all fuel plates of one fuel assembly, the error of assay measurements would be about 0.3% relative to the primary standards

  17. Measurement of the fluorine content of three NBS standard reference materials by use of the ¹⁹F(p,p'γ)¹⁹F reaction

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, A L; Kraner, H W; Shroy, R E; Jones, K W [Brookhaven National Lab., Upton, NY (USA)

    1984-08-01

    The fluorine contents of National Bureau of Standards (NBS) Standard Reference Materials (SRM) 91, opal glass; 120b, phosphate rock; and 2671a, freeze-dried urine; have been measured using the ¹⁹F(p,p'γ)¹⁹F reaction at a proton energy of 3.1 MeV. The results are in good agreement with the values certified by the NBS.

  18. Long persistence of rigor mortis at constant low temperature.

    Science.gov (United States)

    Varetto, Lorenzo; Curto, Ombretta

    2005-01-06

    We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation; and in one case, rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete into partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor and in the two cadavers that were kept under observation "à outrance" we found the absolute resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis that is much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. Therefore, this datum must be considered when a corpse is found in those environmental conditions so that when estimating the time of death, we are not misled by the long persistence of rigor mortis.

  19. Rigorous solution to Bargmann-Wigner equation for integer spin

    CERN Document Server

    Huang Shi Zhong; Wu Ning; Zheng Zhi Peng

    2002-01-01

    A rigorous method is developed to solve the Bargmann-Wigner equation for arbitrary integer spin in coordinate representation in a step-by-step way. The Bargmann-Wigner equation is first transformed to a form easier to solve, the new equations are then solved rigorously in coordinate representation, and the wave functions in a closed form are thus derived

  20. Using grounded theory as a method for rigorously reviewing literature

    NARCIS (Netherlands)

    Wolfswinkel, J.; Furtmueller-Ettinger, Elfriede; Wilderom, Celeste P.M.

    2013-01-01

    This paper offers guidance to conducting a rigorous literature review. We present this in the form of a five-stage process in which we use Grounded Theory as a method. We first probe the guidelines explicated by Webster and Watson, and then we show the added value of Grounded Theory for rigorously

  1. Evaluating Rigor in Qualitative Methodology and Research Dissemination

    Science.gov (United States)

    Trainor, Audrey A.; Graue, Elizabeth

    2014-01-01

    Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…

  2. What's New in Children's Literature for the Children of Louisiana? A Selected Annotated Bibliography with Readability Levels (Selected) and Associated Louisiana Content Standards

    Science.gov (United States)

    Webre, Elizabeth C.

    2011-01-01

    An annotated list of children's books published within the last 15 years and related to Louisiana culture, environment, and economics are linked to the Louisiana Content Standards. Readability levels of selected books are included, providing guidance as to whether a book is appropriate for independent student use. The thirty-three books listed are…

  3. E-Learning Content Design Standards Based on Interactive Digital Concepts Maps in the Light of Meaningful and Constructivist Learning Theory

    Science.gov (United States)

    Afify, Mohammed Kamal

    2018-01-01

    The present study aims to identify standards of interactive digital concepts maps design and their measurement indicators as a tool to develop, organize and administer e-learning content in the light of Meaningful Learning Theory and Constructivist Learning Theory. To achieve the objective of the research, the author prepared a list of E-learning…

  4. Experimental evaluation of rigor mortis. VI. Effect of various causes of death on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R

    1983-07-01

    The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to verify the eventual presence of factors that could play a role in the modification of its development.

  5. Experimental evaluation of rigor mortis. VII. Effect of ante- and post-mortem electrocution on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C

    1988-01-01

    The influence of electrocution on the evolution of rigor mortis was studied on rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, a complete rigor develops already 1 h post-mortem (p.m.) compared to 5 h p.m. for the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In case of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower as compared with the ante-mortem electrocution cases. The results are completed by two practical observations on human electrocution cases.

  6. Implementation of an Evidence-Based and Content Validated Standardized Ostomy Algorithm Tool in Home Care: A Quality Improvement Project.

    Science.gov (United States)

    Bare, Kimberly; Drain, Jerri; Timko-Progar, Monica; Stallings, Bobbie; Smith, Kimberly; Ward, Naomi; Wright, Sandra

    Many nurses have limited experience with ostomy management. We sought to provide a standardized approach to ostomy education and management that supports nurses in early identification of stomal and peristomal complications and pouching problems, and that provides standardized solutions for managing ostomy care in general while improving utilization of formulary products. This article describes the development and testing of an ostomy algorithm tool.

  7. Is Collaborative, Community-Engaged Scholarship More Rigorous than Traditional Scholarship? On Advocacy, Bias, and Social Science Research

    Science.gov (United States)

    Warren, Mark R.; Calderón, José; Kupscznk, Luke Aubry; Squires, Gregory; Su, Celina

    2018-01-01

    Contrary to the charge that advocacy-oriented research cannot meet social science research standards because it is inherently biased, the authors of this article argue that collaborative, community-engaged scholarship (CCES) must meet high standards of rigor if it is to be useful to support equity-oriented, social justice agendas. In fact, they…

  8. Monitoring muscle optical scattering properties during rigor mortis

    Science.gov (United States)

    Xia, J.; Ranasinghesagara, J.; Ku, C. W.; Yao, G.

    2007-09-01

    Sarcomere is the fundamental functional unit in skeletal muscle for force generation. In addition, sarcomere structure is also an important factor that affects the eating quality of muscle food, the meat. The sarcomere structure is altered significantly during rigor mortis, which is the critical stage involved in transforming muscle to meat. In this paper, we investigated optical scattering changes during the rigor process in Sternomandibularis muscles. The measured optical scattering parameters were analyzed along with the simultaneously measured passive tension, pH value, and histology analysis. We found that the temporal changes of optical scattering, passive tension, pH value and fiber microstructures were closely correlated during the rigor process. These results suggested that sarcomere structure changes during rigor mortis can be monitored and characterized by optical scattering, which may find practical applications in predicting meat quality.

  9. Recent Development in Rigorous Computational Methods in Dynamical Systems

    OpenAIRE

    Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł

    2009-01-01

    We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...
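
    To illustrate the basic idea of interval arithmetic named above, here is a deliberately simplified Python sketch: every operation returns an interval guaranteed to contain the exact result, so iterating a map on intervals yields an enclosure of all true trajectories. A genuinely rigorous implementation would additionally round interval endpoints outward (directed rounding), which is omitted here.

```python
# Simplified illustration of the interval-arithmetic idea behind rigorous
# computation. NOTE: outward (directed) rounding of endpoints, required for
# true rigor, is intentionally omitted to keep the sketch short.
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

def logistic_step(x, r):
    """One step of the logistic map x -> r*x*(1-x), evaluated on intervals."""
    one_minus_x = Interval(1.0 - x.hi, 1.0 - x.lo)
    return r * x * one_minus_x

x = Interval(0.49, 0.51)        # enclosure of the initial condition
r = Interval(3.5, 3.5)
for _ in range(5):
    x = logistic_step(x, r)
print(x)                        # an enclosure of every true trajectory started in [0.49, 0.51]
```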

  10. Rigorous covariance propagation of geoid errors to geodetic MDT estimates

    Science.gov (United States)

    Pail, R.; Albertella, A.; Fecher, T.; Savcenko, R.

    2012-04-01

    The mean dynamic topography (MDT) is defined as the difference between the mean sea surface (MSS) derived from satellite altimetry, averaged over several years, and the static geoid. Assuming geostrophic conditions, ocean surface velocities, an important component of global ocean circulation, can be derived from the MDT. Due to the availability of GOCE gravity field models, for the very first time the MDT can now be derived solely from satellite observations (altimetry and gravity) down to spatial length-scales of 100 km and even below. Global gravity field models, parameterized in terms of spherical harmonic coefficients, are complemented by the full variance-covariance matrix (VCM). Therefore, a realistic statistical error estimate is available for the geoid component, while the error description of the altimetric component is still an open issue and is, if at all, attacked empirically. In this study we attempt to perform, based on the full gravity VCM, rigorous error propagation to the derived geostrophic surface velocities, thus also considering all correlations. For the definition of the static geoid we use the third release of the time-wise GOCE model, as well as the satellite-only combination model GOCO03S. In detail, we will investigate the velocity errors resulting from the geoid component as a function of the harmonic degree, and the impact of using or not using covariances on the MDT errors and their correlations. When deriving an MDT, it is spectrally filtered to a certain maximum degree, which is usually driven by the signal content of the geoid model, by applying isotropic or non-isotropic filters. Since this filtering also acts on the geoid component, the consistent integration of this filter process into the covariance propagation shall be performed, and its impact shall be quantified. The study will be performed for MDT estimates in specific test areas of particular oceanographic interest.
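
    The core operation behind such rigorous error propagation is the linear "sandwich rule" Cov(y) = A Σ Aᵀ applied with the full VCM Σ of the spherical harmonic coefficients. The numpy sketch below illustrates only this mechanics with random placeholder matrices; the real design matrices for geoid heights, filtered MDT and geostrophic velocities are not reproduced.

```python
# Generic linear covariance propagation sketch: if a derived quantity y = A x is a
# linear functional of coefficients x with full variance-covariance matrix Sigma,
# then Cov(y) = A Sigma A^T. The matrices below are random filler standing in for
# the real functionals; only the propagation mechanics are illustrated.
import numpy as np

rng = np.random.default_rng(0)
n_coeff, n_points = 50, 4

# Placeholder full VCM of the gravity-field coefficients (symmetric positive definite)
L = rng.normal(size=(n_coeff, n_coeff))
Sigma = L @ L.T

# Placeholder design matrix mapping coefficients to derived quantities at 4 points
A = rng.normal(size=(n_points, n_coeff))

Cov_y = A @ Sigma @ A.T                      # full propagation, correlations included
sigma_y_full = np.sqrt(np.diag(Cov_y))

# For comparison: propagation ignoring coefficient correlations (diagonal Sigma only)
sigma_y_diag = np.sqrt((A**2) @ np.diag(Sigma))
print(sigma_y_full, sigma_y_diag)
```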

  11. A preliminary comparative study on the content of cesium, thorium and uranium in IAEA standard reference material

    International Nuclear Information System (INIS)

    Zhang Jing; Liu Husheng; Wang Xiaoyan; Wang Naifen

    2000-01-01

    This paper presents the determination of Cs, Th and U in 6 standard reference materials provided by the IAEA, using the ICP-MS method. Bismuth (Bi) was selected as the internal standard element to compensate for matrix suppression effects and drift in sensitivity. The detection limits of the 3 elements were in the range of 0.0006∼2 ng/ml. A 0.4∼0.5 g sample was taken, digested by acid, and measured directly by ICP-MS. The recovery of standard addition was 82.1∼100.1%

  12. Effect of standardizing the lactose content of cheesemilk on the properties of low-moisture, part-skim Mozzarella cheese.

    Science.gov (United States)

    Moynihan, A C; Govindasamy-Lucey, S; Molitor, M; Jaeggi, J J; Johnson, M E; McSweeney, P L H; Lucey, J A

    2016-10-01

    The texture, functionality, and quality of Mozzarella cheese are affected by critical parameters such as pH and the rate of acidification. Acidification is typically controlled by the selection of starter culture and temperature used during cheesemaking, as well as techniques such as curd washing or whey dilution, to reduce the residual curd lactose content and decrease the potential for developed acidity. In this study, we explored an alternative approach: adjusting the initial lactose concentration in the milk before cheesemaking. We adjusted the concentration of substrate available to form lactic acid. We added water to decrease the lactose content of the milk, but this also decreased the protein content, so we used ultrafiltration to help maintain a constant protein concentration. We used 3 milks with different lactose-to-casein ratios: one at a high level, 1.8 (HLC, the normal level in milk); one at a medium level, 1.3 (MLC); and one at a low level, 1.0 (LLC). All milks had similar total casein (2.5%) and fat (2.5%) content. We investigated the composition, texture, and functional and sensory properties of low-moisture, part-skim Mozzarella manufactured from these milks when the cheeses were ripened at 4°C for 84d. All cheeses had similar pH values at draining and salting, resulting in cheeses with similar total calcium contents. Cheeses made with LLC milk had higher pH values than the other cheeses throughout ripening. Cheeses had similar moisture contents. The LLC and MLC cheeses had lower levels of lactose, galactose, lactic acid, and insoluble calcium compared with HLC cheese. The lactose-to-casein ratio had no effect on the levels of proteolysis. The LLC and MLC cheeses were harder than the HLC cheese during ripening. Maximum loss tangent (LT), an index of cheese meltability, was lower for the LLC cheese until 28d of ripening, but after 28d, all treatments exhibited similar maximum LT values. The temperature where LT=1 (crossover temperature), an index

  13. Tenderness of pre- and post rigor lamb longissimus muscle.

    Science.gov (United States)

    Geesink, Geert; Sujang, Sadi; Koohmaraie, Mohammad

    2011-08-01

    Lamb longissimus muscle (n=6) sections were cooked at different times post mortem (prerigor, at rigor, 1 day p.m., and 7 days p.m.) using two cooking methods. Using a boiling waterbath, samples were either cooked to a core temperature of 70 °C or boiled for 3 h. The latter method was meant to reflect the traditional cooking method employed in countries where preparation of prerigor meat is practiced. The time postmortem at which the meat was prepared had a large effect on the tenderness (shear force) of the meat. Cooking prerigor and at-rigor meat to 70 °C resulted in higher shear force values than their post-rigor counterparts at 1 and 7 days p.m. (9.4 and 9.6 vs. 7.2 and 3.7 kg, respectively). The differences in tenderness between the treatment groups could be largely explained by a difference in contraction status of the meat after cooking and the effect of ageing on tenderness. Cooking pre- and at-rigor meat resulted in severe muscle contraction, as evidenced by the differences in sarcomere length of the cooked samples. Mean sarcomere lengths in the pre- and at-rigor samples ranged from 1.05 to 1.20 μm. The mean sarcomere length in the post-rigor samples was 1.44 μm. Cooking for 3 h at 100 °C did improve the tenderness of pre- and at-rigor prepared meat as compared to cooking to 70 °C, but not to the extent that ageing did. It is concluded that additional intervention methods are needed to improve the tenderness of prerigor cooked meat. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Diabetes prevention information in Japanese magazines with the largest print runs. Content analysis using clinical guidelines as a standard.

    Science.gov (United States)

    Noda, Emi; Mifune, Taka; Nakayama, Takeo

    2013-01-01

    To characterize information on diabetes prevention appearing in Japanese general health magazines and to examine the agreement of the content with that in clinical practice guidelines for the treatment of diabetes in Japan. We used the Japanese magazines' databases provided by the Media Research Center and selected magazines with large print runs published in 2006. Two medical professionals independently conducted content analysis based on items in the diabetes prevention guidelines. The number of pages for each item and agreement with the information in the guidelines were determined. We found 63 issues of magazines amounting to 8,982 pages; 484 pages included diabetes prevention related content. For 23 items included in the diabetes prevention guidelines, overall agreement of information printed in the magazines with that in the guidelines was 64.5% (471 out of 730). The number of times these items were referred to in the magazines varied widely, from 247 times for food items to 0 times for items on screening for pregnancy-induced diabetes, dyslipidemia, and hypertension. Among the 20 items that were referred to at least once, 18 items showed more than 90% agreement with the guidelines. However, there was poor agreement for information on vegetable oil (2/14, 14%) and for specific foods (5/247, 2%). For the fatty acids category, "fat" was not mentioned in the guidelines; however, the term frequently appeared in magazines. "Uncertainty" was never mentioned in magazines for specific food items. The diabetes prevention related content in the health magazines differed from that defined in clinical practice guidelines. Most information in the magazines agreed with the guidelines, however some items were referred to inappropriately. To disseminate correct information to the public on diabetes prevention, health professionals and the media must collaborate.

  15. Use of Bayesian Methods to Analyze and Visualize Content Uniformity Capability Versus United States Pharmacopeia and ASTM Standards.

    Science.gov (United States)

    Hofer, Jeffrey D; Rauk, Adam P

    2017-02-01

    The purpose of this work was to develop a straightforward and robust approach to analyze and summarize the ability of content uniformity data to meet different criteria. A robust Bayesian statistical analysis methodology is presented which provides a concise and easily interpretable visual summary of the content uniformity analysis results. The visualization displays individual batch analysis results and shows whether there is high confidence that different content uniformity criteria could be met a high percentage of the time in the future. The 3 tests assessed are as follows: (a) United States Pharmacopeia Uniformity of Dosage Units, (b) a specific ASTM E2810 Sampling Plan 1 criterion to potentially be used for routine release testing, and (c) another specific ASTM E2810 Sampling Plan 2 criterion to potentially be used for process validation. The approach shown here could readily be used to create similar result summaries for other potential criteria. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
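
    As a hedged illustration of the general idea (not the authors' exact model), the sketch below estimates, by posterior-predictive style resampling, the probability that future 10-unit samples would pass a simplified USP <905> Stage 1 acceptance-value criterion. The batch data, the resampling scheme and the constants (k = 2.4, L1 = 15, reference value windowed at 98.5-101.5 %LC) are assumptions for illustration and should be checked against the current compendial text.

```python
# Hedged sketch (not the authors' exact methodology): estimate the probability that
# future 10-unit samples pass a simplified Stage-1-style acceptance value
# AV = |M - xbar| + k*s <= L1, using resampling as a crude posterior approximation.
# Constants and data are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)
batch_data = np.array([99.1, 100.4, 98.7, 101.2, 99.8,
                       100.9, 99.5, 100.1, 98.9, 100.6])    # hypothetical %label claim

def acceptance_value(sample, k=2.4):
    xbar, s = sample.mean(), sample.std(ddof=1)
    M = min(max(xbar, 98.5), 101.5)        # reference value windowing (assumed limits)
    return abs(M - xbar) + k * s

n_draws, n_units = 5000, 10
passes = 0
for _ in range(n_draws):
    # Bootstrap-like resampling of the observed batch stands in for posterior draws of (mu, sigma)
    resample = rng.choice(batch_data, size=batch_data.size, replace=True)
    mu, sigma = resample.mean(), max(resample.std(ddof=1), 1e-6)
    future = rng.normal(mu, sigma, size=n_units)             # posterior-predictive batch
    passes += acceptance_value(future) <= 15.0

print(f"Estimated probability of passing the simplified Stage 1 criterion: {passes / n_draws:.3f}")
```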

  16. Standard format and content for emergency plans for fuel-cycle and materials facilities: Draft report for comment

    International Nuclear Information System (INIS)

    1987-11-01

    This report is issued as guidance to those fuel-cycle and major materials licensees who are required by the NRC to prepare and submit an emergency plan. This Standard Format has been prepared to help ensure uniformity and completeness in the preparation of those plans

  17. 78 FR 951 - Accessible Medical Device Labeling in a Standard Content and Format Public Workshop; Request for...

    Science.gov (United States)

    2013-01-07

    ... format so that patients, caregivers, and healthcare providers may access and utilize device labeling as... labeling, and what they would want in a standard version of device labeling. Key findings from the survey... survey with the National Family Caregivers Association (NFCA) on medical device labeling to elicit home...

  18. 48 CFR 311.7001 - Section 508 accessibility standards for HHS Web site content and communications materials.

    Science.gov (United States)

    2010-10-01

    ..., documents, charts, posters, presentations (such as Microsoft PowerPoint), or video material that is specifically intended for publication on, or delivery via, an HHS-owned or -funded Web site, the Project... standards, and resolve any related issues. (c) Based on those discussions, the Project Officer shall provide...

  19. Characterization of primary standards for use in the HPLC analysis of the procyanidin content of cocoa and chocolate containing products.

    Science.gov (United States)

    Hurst, William J; Stanley, Bruce; Glinski, Jan A; Davey, Matthew; Payne, Mark J; Stuart, David A

    2009-10-15

    This report describes the characterization of a series of commercially available procyanidin standards ranging from dimers DP = 2 to decamers DP = 10 for the determination of procyanidins from cocoa and chocolate. Using a combination of HPLC with fluorescence detection and MALDI-TOF mass spectrometry, the purity of each standard was determined and these data were used to determine relative response factors. These response factors were compared with other response factors obtained from published methods. Data comparing the procyanidin analysis of a commercially available US dark chocolate calculated using each of the calibration methods indicates divergent results and demonstrate that previous methods may significantly underreport the procyanidins in cocoa-containing products. These results have far reaching implications because the previous calibration methods have been used to develop data for a variety of scientific reports, including food databases and clinical studies.

  20. Melinda - A custom search engine that provides access to locally-developed content using the HL7 Infobutton standard.

    Science.gov (United States)

    Wan, Yik-Ki J; Staes, Catherine J

    2016-01-01

    Healthcare organizations use care pathways to standardize care, but once developed, adoption rates often remain low. One challenge to adoption is clinicians' difficulty in accessing guidance when it is most needed. Although the HL7 'Infobutton Standard' allows clinicians easier access to external references, access to locally-developed resources often requires clinicians to deviate from their normal electronic health record (EHR) workflow to use another application. To address this gap between internal and external resources, we reviewed the literature and existing practices at the University of Utah Health Care. We identify the requirements to meet the needs of a healthcare enterprise and clinicians, describe the design and development of a prototype that aggregates both internal and external resources from within or outside the EHR, and evaluate the strengths and limitations of the prototype. The system is functional but not implemented in a live EHR environment. We suggest next steps and enhancements.
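
    For readers unfamiliar with the HL7 Infobutton standard, the sketch below assembles a URL-style context-aware knowledge request of the kind such a search engine could accept. The endpoint is a hypothetical placeholder, and the parameter names and codes are reproduced from memory of the URL-based implementation guide, so they should be verified against the deployed specification.

```python
# Illustrative sketch of an HL7 Infobutton (Context-Aware Knowledge Retrieval)
# URL-based knowledge request. The base URL is a hypothetical placeholder and the
# parameter names/values should be checked against the implementation guide in use.
from urllib.parse import urlencode

BASE_URL = "https://example.org/melinda/infobutton"    # hypothetical endpoint

params = {
    "taskContext.c.c": "PROBLISTREV",                   # task context: problem list review
    "mainSearchCriteria.v.c": "73211009",               # SNOMED CT code: diabetes mellitus
    "mainSearchCriteria.v.cs": "2.16.840.1.113883.6.96",# code system OID for SNOMED CT
    "mainSearchCriteria.v.dn": "Diabetes mellitus",
    "informationRecipient": "PROV",                     # content intended for a provider
    "knowledgeResponseType": "text/html",
}

request_url = f"{BASE_URL}?{urlencode(params)}"
print(request_url)
```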

  1. Estimation of the breaking of rigor mortis by myotonometry.

    Science.gov (United States)

    Vain, A; Kauppila, R; Vuori, E

    1996-05-31

    Myotonometry was used to detect breaking of rigor mortis. The myotonometer is a new instrument which measures the decaying oscillations of a muscle after a brief mechanical impact. The method gives two numerical parameters for rigor mortis, namely the period and decrement of the oscillations, both of which depend on the time elapsed after death. When rigor mortis was broken by muscle lengthening, both the oscillation period and decrement decreased, whereas shortening the muscle caused the opposite changes. Fourteen hours after breaking, the stiffness characteristics of the right and left m. biceps brachii, i.e. the oscillation periods, had become similar. However, the values for the decrement of the muscle, reflecting the dissipation of mechanical energy, maintained their differences.

  2. Physiological studies of muscle rigor mortis in the fowl

    International Nuclear Information System (INIS)

    Nakahira, S.; Kaneko, K.; Tanaka, K.

    1990-01-01

    A simple system was developed for continuous measurement of muscle contraction during rigor mortis. Longitudinal muscle strips dissected from the Peroneus Longus were suspended in a plastic tube containing liquid paraffin. Mechanical activity was transmitted to a strain-gauge transducer connected to a potentiometric pen-recorder. At the onset of measurement 1.2 g was loaded on the muscle strip. This model was used to study the muscle response to various treatments during rigor mortis. All measurements were carried out under anaerobic conditions at 17°C, except where otherwise stated. 1. The present system was found to be quite useful for continuous measurement of the course of muscle rigor. 2. Muscle contraction under the anaerobic condition at 17°C reached a peak about 2 hours after the onset of measurement and thereafter relaxed at a slow rate. In contrast, the aerobic condition under high humidity resulted in a strong rigor, about three times stronger than that in the anaerobic condition. 3. Ultrasonic treatment (37,000-47,000 Hz) at 25°C for 10 minutes resulted in a moderate muscle rigor. 4. Treatment of the muscle strip with 2 mM EGTA at 30°C for 30 minutes led to relaxation of the muscle. 5. Muscle from birds killed during anesthesia with pentobarbital sodium showed a slow rate of rigor, whereas birds killed one day after hypophysectomy showed a quick muscle rigor as seen in intact controls. 6. A slight muscle rigor was observed when the muscle strip was placed in a refrigerator at 0°C for 18.5 hours and the temperature was thereafter kept at 17°C. (author)

  3. Standard format and content for a license application to store spent fuel and high-level radioactive waste

    International Nuclear Information System (INIS)

    1989-09-01

    Subpart B, ''License Application, Form, and Contents,'' of 10 CFR Part 72, ''Licensing Requirements for the Independent Storage of Spent Nuclear Fuel and High-Level Radioactive Waste,'' specifies the information to be covered in an application for a license to store spent fuel in an independent spent fuel storage installation (ISFSI) or to store spent fuel and high-level radioactive waste in a monitored retrievable storage facility (MRS). However, Part 72 does not specify the format to be followed in the license application. This regulatory guide suggests a format acceptable to the NRC staff for submitting the information specified in Part 72 for license application to store spent fuel in an ISFSI or to store spent fuel and high-level radioactive waste in an MRS

  4. Reconciling the Rigor-Relevance Dilemma in Intellectual Capital Research

    Science.gov (United States)

    Andriessen, Daniel

    2004-01-01

    This paper raises the issue of research methodology for intellectual capital and other types of management research by focusing on the dilemma of rigour versus relevance. The more traditional explanatory approach to research often leads to rigorous results that are not of much help to solve practical problems. This paper describes an alternative…

  5. A rigorous treatment of uncertainty quantification for Silicon damage metrics

    International Nuclear Information System (INIS)

    Griffin, P.

    2016-01-01

    This report summarizes the contributions made by Sandia National Laboratories in support of the International Atomic Energy Agency (IAEA) Nuclear Data Section (NDS) Technical Meeting (TM) on Nuclear Reaction Data and Uncertainties for Radiation Damage. This work focused on a rigorous treatment of the uncertainties affecting the characterization of the displacement damage seen in silicon semiconductors. (author)

  6. Effects of post mortem temperature on rigor tension, shortening and ...

    African Journals Online (AJOL)

    Fully developed rigor mortis in muscle is characterised by maximum loss of extensibility. The course of post mortem changes in ostrich muscle was studied by following isometric tension, shortening and change in pH during the first 24 h post mortem within muscle strips from the muscularis gastrocnemius, pars interna at ...

  7. Characterization of rigor mortis of longissimus dorsi and triceps ...

    African Journals Online (AJOL)

    24 h) of the longissimus dorsi (LD) and triceps brachii (TB) muscles as well as the shear force (meat tenderness) and colour were evaluated, aiming at characterizing the rigor mortis in the meat during industrial processing. Data statistic treatment demonstrated that carcass temperature and pH decreased gradually during ...

  8. Rigor, vigor, and the study of health disparities.

    Science.gov (United States)

    Adler, Nancy; Bush, Nicole R; Pantell, Matthew S

    2012-10-16

    Health disparities research spans multiple fields and methods and documents strong links between social disadvantage and poor health. Associations between socioeconomic status (SES) and health are often taken as evidence for the causal impact of SES on health, but alternative explanations, including the impact of health on SES, are plausible. Studies showing the influence of parents' SES on their children's health provide evidence for a causal pathway from SES to health, but have limitations. Health disparities researchers face tradeoffs between "rigor" and "vigor" in designing studies that demonstrate how social disadvantage becomes biologically embedded and results in poorer health. Rigorous designs aim to maximize precision in the measurement of SES and health outcomes through methods that provide the greatest control over temporal ordering and causal direction. To achieve precision, many studies use a single SES predictor and single disease. However, doing so oversimplifies the multifaceted, entwined nature of social disadvantage and may overestimate the impact of that one variable and underestimate the true impact of social disadvantage on health. In addition, SES effects on overall health and functioning are likely to be greater than effects on any one disease. Vigorous designs aim to capture this complexity and maximize ecological validity through more complete assessment of social disadvantage and health status, but may provide less-compelling evidence of causality. Newer approaches to both measurement and analysis may enable enhanced vigor as well as rigor. Incorporating both rigor and vigor into studies will provide a fuller understanding of the causes of health disparities.

  9. A rigorous proof for the Landauer-Büttiker formula

    DEFF Research Database (Denmark)

    Cornean, Horia Decebal; Jensen, Arne; Moldoveanu, V.

    Recently, Avron et al. shed new light on the question of quantum transport in mesoscopic samples coupled to particle reservoirs by semi-infinite leads. They rigorously treat the case when the sample undergoes an adiabatic evolution thus generating a current through the leads, and prove the so call...

  10. Rigorous simulation: a tool to enhance decision making

    Energy Technology Data Exchange (ETDEWEB)

    Neiva, Raquel; Larson, Mel; Baks, Arjan [KBC Advanced Technologies plc, Surrey (United Kingdom)

    2012-07-01

    The world refining industries continue to be challenged by population growth (increased demand), regional market changes and the pressure of regulatory requirements to operate a 'green' refinery. Environmental regulations are reducing the value and use of heavy fuel oils and are driving the conversion of more of the heavier products, or even heavier crude, into lighter products while meeting increasingly stringent transportation fuel specifications. As a result, actions are required to establish a sustainable advantage for future success. Rigorous simulation provides a key advantage, improving the timing and efficient use of capital investment and maximizing profitability. Sustainably maximizing profit through rigorous modelling is achieved through enhanced performance monitoring and improved Linear Programme (LP) model accuracy. This paper contains examples of these two items. The combination of both increases overall rates of return. As refiners consider optimizing existing assets and expanding projects, the process agreed to achieve these goals is key to a successful profit improvement. Rigorous kinetic simulation with detailed fractionation allows optimizing existing asset utilization while focusing the capital investment on the new unit(s), and therefore optimizing the overall strategic plan and return on investment. Monitoring of individual process units works as a mechanism for validating and optimizing plant performance. Unit monitoring is important to rectify poor performance and increase profitability. The key to a good LP relies upon the accuracy of the data used to generate the LP sub-model data. The value of rigorous unit monitoring is that the results are heat and mass balanced consistently, and are unique for a refiner's unit / refinery. With the improved match of the refinery operation, the rigorous simulation models will allow capturing more accurately the non-linearity of those process units and therefore provide correct

  11. Effects of Pre and Post-Rigor Marinade Injection on Some Quality Parameters of Longissimus Dorsi Muscles

    Science.gov (United States)

    Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem

    2018-01-01

    Abstract This study was conducted to evaluate the effects of pre- and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl+0.5 M lactic acid and 2% NaCl+0.5 M sodium lactate. Marinade uptake, pH, free water, cooking loss, drip loss and color properties were analyzed. Injection time had a significant effect on marinade uptake levels. Regardless of marinade formulation, marinade uptake of pre-rigor injected samples was higher than that of post-rigor samples. Injection of sodium lactate increased sample pH values, whereas lactic acid injection decreased pH. Marinade treatment and storage period had a significant effect on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time had a different effect on free water content. Storage period and marinade application had a significant effect on drip loss values. Drip loss in all samples increased during storage. On all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fading in pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, and no differences were found among the other samples. At day 6, no significant differences were found in CIE b* values among samples. PMID:29805282

  12. Optimized determination method for trans-10-hydroxy-2-decenoic acid content in royal jelly by high-performance liquid chromatography with an internal standard.

    Science.gov (United States)

    Zhou, Jinhui; Xue, Xiaofeng; Li, Yi; Zhang, Jinzhen; Zhao, Jing

    2007-01-01

    An optimized reversed-phase high-performance liquid chromatography method was developed to detect the trans-10-hydroxy-2-decenoic acid (10-HDA) content in royal jelly cream and lyophilized powder. The sample was extracted using absolute ethanol. Chromatographic separation of 10-HDA and methyl 4-hydroxybenzoate as the internal standard was performed on a Nova-pak C18 column. The average recoveries were 95.0-99.2% (n = 5) with relative standard deviation (RSD) values of 1.3-2.1% for royal jelly cream and 98.0-100.0% (n = 5) with RSD values of 1.6-3.0% for lyophilized powder, respectively. The limits of detection and quantitation were 0.5 and 1.5 mg/kg, respectively, for both royal jelly cream and lyophilized powder. The method was validated for the determination of practical royal jelly products. The concentration of 10-HDA ranged from 1.26 to 2.21% for pure royal jelly cream samples and 3.01 to 6.19% for royal jelly lyophilized powder samples. For 30 royal jelly products, the 10-HDA content varied from not detectable to 0.98%.
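    As a sketch of how internal-standard quantitation of this kind generally works (not code from the paper, and with invented peak areas and concentrations), the analyte is quantified from the ratio of analyte to internal-standard peak areas via a relative response factor established with a calibration standard:

```python
# Minimal sketch of internal-standard HPLC quantitation (all numbers are illustrative).

def response_factor(area_analyte_std, conc_analyte_std, area_is_std, conc_is_std):
    """Relative response factor from a calibration standard:
    RRF = (A_analyte / C_analyte) / (A_IS / C_IS)."""
    return (area_analyte_std / conc_analyte_std) / (area_is_std / conc_is_std)

def analyte_concentration(area_analyte, area_is, conc_is_added, rrf):
    """Analyte concentration in the extract, corrected by the internal standard."""
    return (area_analyte / area_is) * conc_is_added / rrf

# Hypothetical peak areas and concentrations (mg/L):
rrf = response_factor(area_analyte_std=1250.0, conc_analyte_std=50.0,
                      area_is_std=980.0, conc_is_std=40.0)
c_10hda = analyte_concentration(area_analyte=2100.0, area_is=1010.0,
                                conc_is_added=40.0, rrf=rrf)
print(f"RRF = {rrf:.3f}, 10-HDA in extract = {c_10hda:.1f} mg/L")
```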

  13. Recent applications of nuclear analytical methods to the certification of elemental content in NIST standard reference materials

    International Nuclear Information System (INIS)

    Greenberg, R.R.; Zeisler, R.; Mackey, E.A.

    2006-01-01

    Well-characterized, certified reference materials (CRMs) play an essential role in assuring the quality of analytical measurements. NIST has been producing CRMs, currently called NIST Standard Reference Materials (SRMs), to validate analytical measurements for nearly one hundred years. The predominant mode of certifying inorganic constituents in complex-matrix SRMs is through the use of two critically evaluated, independent analytical techniques at NIST. These techniques should have no significant sources of error in common. The use of nuclear analytical methods in combination with one of the chemically based analytical methods at NIST eliminates the possibility of any significant, common error source. The inherent characteristics of the various forms of nuclear analytical methods make them extremely valuable for SRM certification. Instrumental NAA is nondestructive, which eliminates the possibility of any dissolution problems, and often provides homogeneity information. Radiochemical NAA typically provides nearly blank-free determinations of some highly important but difficult elements at very low levels. Prompt-gamma NAA complements INAA, and provides independent determinations of some key elements. In addition, all significant uncertainty components can be evaluated for these techniques, and we believe these methods can meet all the requirements of a primary method of measurement as defined by ISO and the CCQM. NIST has certified several SRMs using INAA and RNAA as primary methods. In addition, NIST has compared measurements by INAA and PGAA with other primary methods as part of the CCQM intercomparisons of national metrology institutes. Some significant SRMs recently certified for inorganic constituents with contributions from the nuclear analytical methods include: Toxic Substances in Urine (SRM 2670a), Lake Superior Fish Tissue (SRM 1946), Air Particulate on Filter Media (SRM 2783), Inorganics in Marine Sediment (SRM 2702), Sediment for Solid Sampling (Small

  14. Rigor mortis in an unusual position: Forensic considerations.

    Science.gov (United States)

    D'Souza, Deepak H; Harish, S; Rajesh, M; Kiran, J

    2011-07-01

    We report a case in which the dead body was found with rigor mortis in an unusual position. The body was lying on its back with the limbs raised, defying gravity. The direction of the salivary stains on the face also defied gravity. We opined that the scene of occurrence of the crime was unlikely to be the final place where the body was found. The clues pointed to a homicidal offence and an attempt to destroy the evidence. The forensic use of 'rigor mortis in an unusual position' lies in furthering the investigation and in scientifically confirming two facts: that the scene of death (occurrence) is different from the scene of disposal of the dead body, and the time gap between the two places.

  15. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1986-01-01

    Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first- and second-order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first-order formulation satisfies the conditions of the Hille--Yosida theorem. A foundation is laid thereby within which the domains associated with the first- and second-order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0

  16. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1985-05-01

    Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first and second order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first order formulation satisfies the conditions of the Hille-Yosida theorem. A foundation is laid thereby within which the domains associated with the first and second order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0

  17. Content validation of a standardized algorithm for ostomy care.

    Science.gov (United States)

    Beitz, Janice; Gerlach, Mary; Ginsburg, Pat; Ho, Marianne; McCann, Eileen; Schafer, Vickie; Scott, Vera; Stallings, Bobbie; Turnbull, Gwen

    2010-10-01

    The number of ostomy care clinician experts is limited and the majority of ostomy care is provided by non-specialized clinicians or unskilled caregivers and family. The purpose of this study was to obtain content validation data for a new standardized algorithm for ostomy care developed by expert wound ostomy continence nurse (WOCN) clinicians. After face validity was established using overall review and suggestions from WOCN experts, 166 WOCNs self-identified as having expertise in ostomy care were surveyed online for 6 weeks in 2009. Using a cross-sectional, mixed methods study design and a 30-item instrument with a 4-point Likert-type scale, the participants were asked to quantify the degree of validity of the Ostomy Algorithm's decisions and components. Participants' open-ended comments also were thematically analyzed. Using a scale of 1 to 4, the mean score of the entire algorithm was 3.8 (4 = relevant/very relevant). The algorithm's content validity index (CVI) was 0.95 (out of 1.0). Individual component mean scores ranged from 3.59 to 3.91. Individual CVIs ranged from 0.90 to 0.98. Qualitative data analysis revealed themes of difficulty associated with algorithm formatting, especially orientation and use of the Studio Alterazioni Cutanee Stomali (Study on Peristomal Skin Lesions [SACS™ Instrument]) and the inability of algorithms to capture all individual patient attributes affecting ostomy care. Positive themes included content thoroughness and the helpful clinical photos. Suggestions were offered for algorithm improvement. Study results support the strong content validity of the algorithm and research to ascertain its construct validity and effect on care outcomes is warranted.
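    The content validity index reported above follows the usual convention for 4-point relevance scales: an item-level CVI (I-CVI) is the proportion of experts rating the item 3 or 4, and the scale-level CVI can be taken as the mean of the I-CVIs. The sketch below shows that calculation on invented ratings; the item names are hypothetical and not drawn from the Ostomy Algorithm itself.

```python
# Content validity index on a 4-point relevance scale (ratings of 3 or 4 count as relevant).
# The item names and expert ratings below are invented for illustration.
ratings = {
    "stoma_assessment":      [4, 4, 3, 4, 2, 4],
    "peristomal_skin_check": [3, 4, 4, 4, 4, 3],
    "pouching_selection":    [4, 3, 2, 4, 4, 4],
}

def item_cvi(item_ratings):
    """I-CVI: proportion of experts rating the item 3 or 4."""
    return sum(r >= 3 for r in item_ratings) / len(item_ratings)

i_cvis = {item: item_cvi(r) for item, r in ratings.items()}
scale_cvi_ave = sum(i_cvis.values()) / len(i_cvis)   # S-CVI/Ave: mean of the I-CVIs

for item, cvi in i_cvis.items():
    print(f"{item}: I-CVI = {cvi:.2f}")
print(f"S-CVI/Ave = {scale_cvi_ave:.2f}")
```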

  18. Rigorous results on measuring the quark charge below color threshold

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1979-01-01

    Rigorous theorems are presented showing that contributions from a color nonsinglet component of the current to matrix elements of a second order electromagnetic transition are suppressed by factors inversely proportional to the energy of the color threshold. Parton models which obtain matrix elements proportional to the color average of the square of the quark charge are shown to neglect terms of the same order of magnitude as terms kept. (author)

  19. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

    … This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, formal architecture design, including behavioral semantics, and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite.

  20. Striation Patterns of Ox Muscle in Rigor Mortis

    Science.gov (United States)

    Locker, Ronald H.

    1959-01-01

    Ox muscle in rigor mortis offers a selection of myofibrils fixed at varying degrees of contraction from sarcomere lengths of 3.7 to 0.7 µ. A study of this material by phase contrast and electron microscopy has revealed four distinct successive patterns of contraction, including besides the familiar relaxed and contracture patterns, two intermediate types (2.4 to 1.9 µ, 1.8 to 1.5 µ) not previously well described. PMID:14417790

  1. Rigorous Analysis of a Randomised Number Field Sieve

    OpenAIRE

    Lee, Jonathan; Venkatesan, Ramarathnam

    2018-01-01

    Factorisation of integers $n$ is of number theoretic and cryptographic significance. The Number Field Sieve (NFS) introduced circa 1990, is still the state of the art algorithm, but no rigorous proof that it halts or generates relationships is known. We propose and analyse an explicitly randomised variant. For each $n$, we show that these randomised variants of the NFS and Coppersmith's multiple polynomial sieve find congruences of squares in expected times matching the best-known heuristic e...

  2. Reciprocity relations in transmission electron microscopy: A rigorous derivation.

    Science.gov (United States)

    Krause, Florian F; Rosenauer, Andreas

    2017-01-01

    A concise derivation of the principle of reciprocity applied to realistic transmission electron microscopy setups is presented, making use of the multislice formalism. The equivalence of images acquired in conventional and scanning mode is thereby rigorously shown. The conditions for the applicability of the reciprocity relations found are discussed. Furthermore, the positions of apertures in relation to the corresponding lenses are considered, a subject which has scarcely been addressed in previous publications. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry.

    Science.gov (United States)

    Morse, Janice M

    2015-09-01

    Criteria for determining the trustworthiness of qualitative research were introduced by Guba and Lincoln in the 1980s when they replaced terminology for achieving rigor, reliability, validity, and generalizability with dependability, credibility, and transferability. Strategies for achieving trustworthiness were also introduced. This landmark contribution to qualitative research remains in use today, with only minor modifications in format. Despite the significance of this contribution over the past four decades, the strategies recommended to achieve trustworthiness have not been critically examined. Recommendations for where, why, and how to use these strategies have not been developed, and how well they achieve their intended goal has not been examined. We do not know, for example, what impact these strategies have on the completed research. In this article, I critique these strategies. I recommend that qualitative researchers return to the terminology of social sciences, using rigor, reliability, validity, and generalizability. I then make recommendations for the appropriate use of the strategies recommended to achieve rigor: prolonged engagement, persistent observation, and thick, rich description; inter-rater reliability, negative case analysis; peer review or debriefing; clarifying researcher bias; member checking; external audits; and triangulation. © The Author(s) 2015.
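    Of the strategies listed, inter-rater reliability is the one most often reported as a number; Cohen's kappa is a common (though not the only) choice. The sketch below, with invented codings, shows the standard calculation and is offered only as context; it is not drawn from Morse's article.

```python
# Cohen's kappa for two coders assigning qualitative codes to the same segments
# (the codings below are invented for illustration).
from collections import Counter

coder_a = ["theme1", "theme2", "theme1", "theme3", "theme2", "theme1", "theme3", "theme1"]
coder_b = ["theme1", "theme2", "theme2", "theme3", "theme2", "theme1", "theme1", "theme1"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Chance agreement from each coder's marginal code frequencies.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))

kappa = (observed - expected) / (1 - expected)
print(f"observed = {observed:.2f}, expected = {expected:.2f}, kappa = {kappa:.2f}")
```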

  4. Salivary Fluoride level in preschool children after toothbrushing with standard and low fluoride content dentifrice, using the transversal dentifrice application technique: pilot study

    Directory of Open Access Journals (Sweden)

    Fabiana Jandre Melo

    2008-01-01

    Full Text Available Objective: To investigate the salivary fluoride concentration in preschool children after toothbrushing with dentifrices containing standard (1100 ppmF/NaF) and low (500 ppmF/NaF) fluoride concentrations, using the transversal technique of placing the product on the toothbrush. Methods: Eight children of both sexes, ranging in age from 4 years and 9 months to 5 years and 6 months, participated in the study. The experiment was divided into two phases with a weekly interval. In the first stage, the children used the standard-concentration dentifrice for one week, and in the second, the low-concentration product. Samples were collected at the end of each experimental stage at the following times: before brushing, immediately afterwards, and after 15, 30 and 45 minutes. The fluoride contents were analyzed by the microdiffusion technique. Statistical analysis was done by analysis of variance (ANOVA) and Student's t-test (p<0.05). Results: The salivary fluoride concentration was significantly higher at all times when the standard-concentration product was used. The comparison between the fluoride concentration found before brushing and immediately afterwards showed an increase of 6.8 times with the standard dentifrice (0.19 x 1.29 μgF/ml) and of 20.5 times with the low-concentration product (0.02 x 0.41 μgF/ml). Conclusion: Toothbrushing with both products promoted relevant increases in the salivary fluoride concentration; however, longitudinal studies are necessary to verify the clinical relevance of this measurement.

  5. Standard specification for uranium oxides with a 235U content of less than 5 % for dissolution prior to conversion to nuclear-grade uranium dioxide

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 This specification covers uranium oxides, including processed byproducts or scrap material (powder, pellets, or pieces), that are intended for dissolution into uranyl nitrate solution meeting the requirements of Specification C788 prior to conversion into nuclear grade UO2 powder with a 235U content of less than 5 %. This specification defines the impurity and uranium isotope limits for such urania powders that are to be dissolved prior to processing to nuclear grade UO2 as defined in Specification C753. 1.2 This specification provides the nuclear industry with a general standard for such uranium oxide powders. It recognizes the diversity of conversion processes and the processes to which such powders are subsequently to be subjected (for instance, by solvent extraction). It is therefore anticipated that it may be necessary to include supplementary specification limits by agreement between the buyer and seller. 1.3 The scope of this specification does not comprehensively cover all provisions for prevent...

  6. Normalization of test and evaluation of biothreat detection systems: overcoming microbial air content fluctuations by using a standardized reagent bacterial mixture.

    Science.gov (United States)

    Berchebru, Laurent; Rameil, Pascal; Gaudin, Jean-Christophe; Gausson, Sabrina; Larigauderie, Guilhem; Pujol, Céline; Morel, Yannick; Ramisse, Vincent

    2014-10-01

    Test and evaluation of engineered biothreat agent detection systems ("biodetectors") are a challenging task for government agencies and industries involved in biosecurity and biodefense programs. In addition to user-friendly features, biodetectors need to perform both highly sensitive and specific detection, and must not produce excessive false alerts. In fact, the atmosphere displays a number of variables, such as airborne bacterial content, that can interfere with the detection process, thus impeding comparative tests when carried out at different times or places. To overcome these bacterial air content fluctuations, a standardized reagent bacterial mixture (SRBM), consisting of a collection of selected cultivable environmental species that are prevalent in temperate-climate bioaerosols, was designed to generate a stable, reproducible, and easy-to-use surrogate of a bioaerosol sample. The rationale, design, and production process are reported. The results showed that the 8.59 (95% CI: 8.46-8.72) log cfu distributed into vials underwent a 0.95 (95% CI: 0.65-1.26) log viability decay after dehydration and subsequent reconstitution, thus advantageously mimicking a natural bioaerosol sample, which is typically composed of cultivable and uncultivable particles. Dehydrated SRBM was stable for more than 12 months at 4°C and allowed the reconstitution of a dead/live cell aqueous suspension that is stable for 96 h at +4°C, according to plate counts. Specific detection of a biothreat agent simulant (e.g. Bacillus atrophaeus) by immuno-magnetic or PCR assays did not display any significant loss of sensitivity, or false negative or positive results, in the presence of SRBM. This work provides guidance on testing and evaluating detection devices, and may contribute to the establishment of suitable standards and normalized procedures. Copyright © 2014 Elsevier B.V. All rights reserved.
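    The reported decay is in log10 units; assuming the usual definition, the recovered count can simply be back-calculated from the figures given in the abstract:

```latex
\Delta_{\log} = \log_{10}\!\big(\mathrm{cfu}_{\text{loaded}}\big) - \log_{10}\!\big(\mathrm{cfu}_{\text{recovered}}\big)
\;\;\Rightarrow\;\;
\log_{10}\!\big(\mathrm{cfu}_{\text{recovered}}\big) \approx 8.59 - 0.95 = 7.64 \ \text{log cfu}.
```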

  7. Acceptable standard format and content for the fundamental nuclear material control (FNMC) plan required for low-enriched uranium facilities. Revision 2

    International Nuclear Information System (INIS)

    Joy, D.R.

    1995-12-01

    This report documents a standard format suggested by the NRC for use in preparing fundamental nuclear material control (FNMC) plans as required by the Low Enriched Uranium Reform Amendments (10CFR 74.31). This report also describes the necessary contents of a comprehensive plan and provides example acceptance criteria which are intended to communicate acceptable means of achieving the performance capabilities of the Reform Amendments. By using the suggested format, the licensee or applicant will minimize administrative problems associated with the submittal, review and approval of the FNMC plan. Preparation of the plan in accordance with this format will assist the NRC in evaluating the plan and in standardizing the review and licensing process. However, conformance with this guidance is not required by the NRC. A license applicant who employs a format that provides an equal level of completeness and detail may use their own format.

  8. New rigorous asymptotic theorems for inverse scattering amplitudes

    International Nuclear Information System (INIS)

    Lomsadze, Sh.Yu.; Lomsadze, Yu.M.

    1984-01-01

    The rigorous asymptotic theorems, both of integral and local types, obtained earlier and establishing logarithmic and in some cases even power correlations between the real and imaginary parts of scattering amplitudes Fsub(+-) are extended to the inverse amplitudes 1/Fsub(+-). One also succeeds in establishing power correlations of a new type between the real and imaginary parts, both for the amplitudes themselves and for the inverse ones. All the assertions obtained can conveniently be tested in high-energy experiments when the amplitudes show asymptotic behaviour

  9. Prediction of digestible and metabolizable energy content and standardized ileal amino Acid digestibility in wheat shorts and red dog for growing pigs.

    Science.gov (United States)

    Huang, Q; Piao, X S; Ren, P; Li, D F

    2012-12-01

    Two experiments were conducted to evaluate the effects of the chemical composition of wheat shorts and red dog on energy and amino acid digestibility in growing pigs, and to establish prediction models to estimate their digestible (DE) and metabolizable (ME) energy content as well as their standardized ileal digestible (SID) amino acid content. For Exp. 1, sixteen diets were fed to thirty-two growing pigs according to a completely randomized design during three successive periods. The basal diet was based on corn and soybean meal, while the other fifteen diets contained 28.8% wheat shorts (N = 7) or red dog (N = 8), added at the expense of corn and soybean meal. Over the three periods, each diet was fed to six pigs, with each diet being fed to two pigs during each period. The apparent total tract digestibility (ATTD) of energy in wheat shorts and red dog averaged 75.1 and 87.9%. The DE values of wheat shorts and red dog averaged 13.8 MJ/kg (range 13.1 to 15.0 MJ/kg) and 15.1 MJ/kg (range 13.3 to 16.6 MJ/kg) of dry matter, respectively. For Exp. 2, twelve growing pigs were allotted to two 6×6 Latin Square Designs with six periods. Ten of the diets were formulated based on 60% wheat shorts or red dog, and the remaining two diets were nitrogen-free diets based on cornstarch and sucrose. Chromic oxide (0.3%) was used as an indigestible marker in all diets. There were no differences (p>0.05) in SID values for the amino acids in wheat shorts and red dog except for lysine and methionine. Apparent ileal digestibility (AID) and SID values for lysine in different sources of wheat shorts or red dog, which averaged 78.1 and 87.8%, showed more variation than either methionine or tryptophan. A stepwise regression was performed to establish DE, ME and amino acid digestibility prediction models. Data indicated that fiber content and amino acid concentrations were good indicators for predicting energy values and amino acid digestibility, respectively. The present study confirms the large
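    The prediction models referred to above are stepwise regressions of energy values on chemical composition. As an illustration of the general technique only (the data, predictor names and stopping rule below are invented, not the paper's), a simple forward-selection loop on R² could be sketched as follows:

```python
# Forward stepwise selection of composition predictors for DE (illustrative data only).
import numpy as np

rng = np.random.default_rng(0)
n = 15
# Hypothetical composition variables (% of dry matter) and DE values (MJ/kg DM).
X_all = {"NDF": rng.uniform(10, 40, n),
         "ADF": rng.uniform(3, 12, n),
         "CP":  rng.uniform(14, 20, n),
         "EE":  rng.uniform(2, 6, n)}
DE = 17.0 - 0.08 * X_all["NDF"] + 0.05 * X_all["CP"] + rng.normal(0, 0.2, n)

def r_squared(names):
    """R^2 of an ordinary least-squares fit of DE on the named predictors."""
    X = np.column_stack([np.ones(n)] + [X_all[v] for v in names])
    beta, *_ = np.linalg.lstsq(X, DE, rcond=None)
    resid = DE - X @ beta
    return 1.0 - resid.var() / DE.var()

selected, remaining = [], list(X_all)
while remaining:
    best = max(remaining, key=lambda v: r_squared(selected + [v]))
    gain = r_squared(selected + [best]) - (r_squared(selected) if selected else 0.0)
    if gain < 0.01:        # arbitrary stopping threshold for the illustration
        break
    selected.append(best)
    remaining.remove(best)

print("selected predictors:", selected, "| R^2 =", round(r_squared(selected), 3))
```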

  10. Post mortem rigor development in the Egyptian goose (Alopochen aegyptiacus) breast muscle (pectoralis): factors which may affect the tenderness.

    Science.gov (United States)

    Geldenhuys, Greta; Muller, Nina; Frylinck, Lorinda; Hoffman, Louwrens C

    2016-01-15

    Baseline research on the toughness of Egyptian goose meat is required. This study therefore investigates the post mortem pH and temperature decline (15 min-4 h 15 min post mortem) in the pectoralis muscle (breast portion) of this gamebird species. It also explores the enzyme activity of the Ca(2+)-dependent protease (calpain system) and the lysosomal cathepsins during the rigor mortis period. No differences were found for any of the variables between genders. The pH decline in the pectoralis muscle occurs quite rapidly (c = -0.806; ultimate pH ∼ 5.86) compared with other species and it is speculated that the high rigor temperature (>20 °C) may contribute to the increased toughness. No calpain I was found in Egyptian goose meat and the µ/m-calpain activity remained constant during the rigor period, while a decrease in calpastatin activity was observed. The cathepsin B, B & L and H activity increased over the rigor period. Further research into the connective tissue content and myofibrillar breakdown during aging is required in order to know if the proteolytic enzymes do in actual fact contribute to tenderisation. © 2015 Society of Chemical Industry.
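    The rate constant quoted above (c = -0.806) suggests an exponential model of pH decline. Assuming the common form pH(t) = pH_u + (pH_0 - pH_u)·exp(c·t), which is an assumption since the abstract does not state the model explicitly, a fit could be sketched as follows, with invented data points rather than the study's measurements:

```python
# Fit an exponential pH-decline curve pH(t) = pH_u + (pH_0 - pH_u) * exp(c * t).
# The functional form is assumed and the data points are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

t_h = np.array([0.25, 0.75, 1.25, 2.25, 3.25, 4.25])     # hours post mortem
ph = np.array([6.55, 6.35, 6.20, 6.02, 5.93, 5.89])       # pH readings (illustrative)

def ph_decline(t, ph_u, ph_0, c):
    return ph_u + (ph_0 - ph_u) * np.exp(c * t)

(ph_u, ph_0, c), _ = curve_fit(ph_decline, t_h, ph, p0=(5.86, 6.6, -0.8))
print(f"ultimate pH = {ph_u:.2f}, initial pH = {ph_0:.2f}, rate constant c = {c:.3f}")
```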

  11. Sonoelasticity to monitor mechanical changes during rigor and ageing.

    Science.gov (United States)

    Ayadi, A; Culioli, J; Abouelkaram, S

    2007-06-01

    We propose the use of sonoelasticity as a non-destructive method to monitor changes in the resistance of muscle fibres, unaffected by connective tissue. Vibrations were applied at low frequency to induce oscillations in soft tissues and an ultrasound transducer was used to detect the motions. The experiments were carried out on the M. biceps femoris muscles of three beef cattle. In addition to the sonoelasticity measurements, the changes in meat during rigor and ageing were followed by measurements of both the mechanical resistance of myofibres and pH. The variations of mechanical resistance and pH were compared to those of the sonoelastic variables (velocity and attenuation) at two frequencies. The relationships between pH and velocity or attenuation and between the velocity or attenuation and the stress at 20% deformation were highly correlated. We concluded that sonoelasticity is a non-destructive method that can be used to monitor mechanical changes in muscle fibers during rigor-mortis and ageing.

  12. A rigorous test for a new conceptual model for collisions

    International Nuclear Information System (INIS)

    Peixoto, E.M.A.; Mu-Tao, L.

    1979-01-01

    A rigorous theoretical foundation for the previously proposed model is formulated and applied to electron scattering by H2 in the gas phase. A rigorous treatment of the interaction potential between the incident electron and the hydrogen molecule is carried out to calculate differential cross sections for 1 keV electrons, using Glauber's approximation and Wang's molecular wave function for the ground electronic state of H2. Moreover, it is shown for the first time that, when adequately done, the omission of two-center terms does not adversely influence the results of molecular calculations. It is shown that the new model is far superior to the Independent Atom Model (or Independent Particle Model). The accuracy and simplicity of the new model suggest that it may be fruitfully applied to the description of other collision phenomena (e.g., in molecular beam experiments and nuclear physics). A new technique is presented for calculations involving two-center integrals within the framework of Glauber's approximation for scattering. (Author) [pt

  13. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Science.gov (United States)

    Gallego, Sergi; Neipp, Cristian; Estepa, Luis A.; Ortuño, Manuel; Márquez, Andrés; Francés, Jorge; Pascual, Inmaculada; Beléndez, Augusto

    2012-01-01

    There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik's Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik's theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik's theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik's and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement in the predictions of CW and RCW and the validity of Kogelnik's theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.
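    For context (this is standard coupled-wave theory, not an equation reproduced from the article), Kogelnik's diffraction efficiency for a lossless, unslanted phase transmission grating illuminated at the Bragg angle is

```latex
\eta = \sin^{2}\!\left(\frac{\pi\,\Delta n\,d}{\lambda \cos\theta_{B}}\right),
```

    where Δn is the refractive-index modulation, d the grating thickness, λ the free-space wavelength and θ_B the Bragg angle inside the medium; the comparisons above probe where this closed form stops being adequate.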

  14. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Directory of Open Access Journals (Sweden)

    Augusto Beléndez

    2012-08-01

    Full Text Available There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik's Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik's theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik's theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik's and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement in the predictions of CW and RCW and the validity of Kogelnik's theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.

  15. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for the code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
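    The solution-verification step mentioned here rests on Richardson extrapolation. A minimal sketch of the standard observed-order-of-accuracy estimate from three systematically refined grids is given below; the numbers are invented and the formulas are the generic ones, not GBS-specific.

```python
# Observed order of accuracy and Richardson-extrapolated value from three grid levels
# with a constant refinement ratio r (all numbers are illustrative).
import math

r = 2.0                                   # grid refinement ratio
f1, f2, f3 = 1.0010, 1.0042, 1.0170       # fine, medium, coarse solutions

p = math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)   # observed order of accuracy
f_extrapolated = f1 + (f1 - f2) / (r**p - 1.0)            # Richardson extrapolation

print(f"observed order p = {p:.2f}")
print(f"extrapolated solution = {f_extrapolated:.5f}")
```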

  16. Experimental evaluation of rigor mortis IX. The influence of the breaking (mechanical solution) on the development of rigor mortis.

    Science.gov (United States)

    Krompecher, Thomas; Gilles, André; Brandt-Casadevall, Conception; Mangin, Patrice

    2008-04-07

    Objective measurements were carried out to study the possible re-establishment of rigor mortis on rats after "breaking" (mechanical solution). Our experiments showed that: *Cadaveric rigidity can re-establish after breaking. *A significant rigidity can reappear if the breaking occurs before the process is complete. *Rigidity will be considerably weaker after the breaking. *The time course of the intensity does not change in comparison to the controls: --the re-establishment begins immediately after the breaking; --maximal values are reached at the same time as in the controls; --the course of the resolution is the same as in the controls.

  17. Biomedical text mining for research rigor and integrity: tasks, challenges, directions.

    Science.gov (United States)

    Kilicoglu, Halil

    2017-06-13

    An estimated quarter of a trillion US dollars is invested in the biomedical research enterprise annually. There is growing alarm that a significant portion of this investment is wasted because of problems in reproducibility of research findings and in the rigor and integrity of research conduct and reporting. Recent years have seen a flurry of activities focusing on standardization and guideline development to enhance the reproducibility and rigor of biomedical research. Research activity is primarily communicated via textual artifacts, ranging from grant applications to journal publications. These artifacts can be both the source and the manifestation of practices leading to research waste. For example, an article may describe a poorly designed experiment, or the authors may reach conclusions not supported by the evidence presented. In this article, we pose the question of whether biomedical text mining techniques can assist the stakeholders in the biomedical research enterprise in doing their part toward enhancing research integrity and rigor. In particular, we identify four key areas in which text mining techniques can make a significant contribution: plagiarism/fraud detection, ensuring adherence to reporting guidelines, managing information overload and accurate citation/enhanced bibliometrics. We review the existing methods and tools for specific tasks, if they exist, or discuss relevant research that can provide guidance for future work. With the exponential increase in biomedical research output and the ability of text mining approaches to perform automatic tasks at large scale, we propose that such approaches can support tools that promote responsible research practices, providing significant benefits for the biomedical research enterprise. Published by Oxford University Press 2017. This work is written by a US Government employee and is in the public domain in the US.

  18. Immense random colocalization, revealed by automated high content image cytometry, seriously questions FISH as gold standard for detecting EML4-ALK fusion.

    Science.gov (United States)

    Smuk, Gábor; Tornóczky, Tamás; Pajor, László; Chudoba, Ilse; Kajtár, Béla; Sárosi, Veronika; Pajor, Gábor

    2018-05-19

    EML4-ALK gene fusion (inv2(p21p23)) of non-small cell lung cancer (NSCLC) predisposes to tyrosine kinase inhibitor treatment. One of the gold standard diagnostics is the dual color (DC) break-apart (BA) FISH technique; however, the unusual closeness of the involved genes has been suggested to raise the likelihood of random co-localization (RCL) of signals. Although this is suspected to decrease sensitivity (often to as low as 40-70%), the exact level and effect of RCL have not been revealed thus far. Signal distances were analyzed to 0.1 µm precision in more than 25,000 nuclei via automated high-content image cytometry. Negative and positive controls were created using conventional DC BA and inv2(p21p23)-mimicking probe-sets, respectively. The average distance between red and green signals was 9.72 pixels (px) (±5.14 px) and 3.28 px (±2.44 px) in positives and negatives, respectively; the overlap in distribution was 41%. Specificity and sensitivity of correctly determining ALK status were 97% and 29%, respectively. When investigating inv2(p21p23) with DC BA FISH, specificity is high, but seven out of ten aberrant nuclei are inevitably falsely classified as negative, due to the extreme level of RCL. Together with genetic heterogeneity and the dilution effect of non-tumor cells in NSCLC, this immense analytical false negativity is the primary cause behind the often described low diagnostic sensitivity. These results convincingly suggest that if FISH is to remain a gold standard for detecting the therapy-relevant inv(2), either a modified evaluation protocol or a more reliable probe design should be considered than the current DC BA one. © 2018 International Society for Advancement of Cytometry.

  19. Reframing Rigor: A Modern Look at Challenge and Support in Higher Education

    Science.gov (United States)

    Campbell, Corbin M.; Dortch, Deniece; Burt, Brian A.

    2018-01-01

    This chapter describes the limitations of the traditional notions of academic rigor in higher education, and brings forth a new form of rigor that has the potential to support student success and equity.

  20. Rigorous force field optimization principles based on statistical distance minimization

    Energy Technology Data Exchange (ETDEWEB)

    Vlcek, Lukas, E-mail: vlcekl1@ornl.gov [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States); Joint Institute for Computational Sciences, University of Tennessee, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6173 (United States); Chialvo, Ariel A. [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States)

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
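    For readers unfamiliar with the term, the statistical distance between two discrete distributions p (model) and q (target) is commonly taken to be the Bhattacharyya angle; assuming that is the definition intended here, it reads

```latex
s(p, q) = \arccos\!\left(\sum_{i} \sqrt{p_i\, q_i}\right),
```

    which vanishes only when the two distributions coincide, so minimizing it drives the model's measurable statistics toward those of the target.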

  1. From everyday communicative figurations to rigorous audience news repertoires

    DEFF Research Database (Denmark)

    Kobbernagel, Christian; Schrøder, Kim Christian

    2016-01-01

    In the last couple of decades there has been an unprecedented explosion of news media platforms and formats, as a succession of digital and social media have joined the ranks of legacy media. We live in a 'hybrid media system' (Chadwick, 2013), in which people build their cross-media news repertoires from the ensemble of old and new media available. This article presents an innovative mixed-method approach with considerable explanatory power to the exploration of patterns of news media consumption. This approach tailors Q-methodology in the direction of a qualitative study of news consumption, in which a card sorting exercise serves to translate the participants' news media preferences into a form that enables the researcher to undertake a rigorous factor-analytical construction of their news consumption repertoires. This interpretive, factor-analytical procedure, which results in the building

  2. Fast and Rigorous Assignment Algorithm Multiple Preference and Calculation

    Directory of Open Access Journals (Sweden)

    Ümit Çiftçi

    2010-03-01

    Full Text Available The goal of this paper is to develop an algorithm that evaluates students and then places them according to their desired choices and dependent preferences. The developed algorithm is also used to implement software. The success and accuracy of the software, as well as of the algorithm, are tested by applying it to the ability test at Beykent University. This ability test is repeated several times in order to fill all available places in the Fine Arts Faculty departments in every academic year. It has been shown that this algorithm is very fast and rigorous after application in the 2008-2009 and 2009-2010 academic years. Key Words: Assignment algorithm, student placement, ability test
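    The article's own algorithm is not reproduced in the abstract; as a purely hypothetical sketch of score-ranked placement with ordered preferences and department capacities (all names, scores and quotas below are invented), one common approach looks like this:

```python
# Serial, score-ranked assignment: the best-scoring student is placed first, and each
# student receives the highest-ranked department on their list that still has room.
# Names, scores and capacities are invented for illustration.
capacity = {"Painting": 2, "Sculpture": 1, "Graphic Design": 2}

students = [  # (name, ability-test score, ordered preference list)
    ("Ayse",   88, ["Painting", "Graphic Design", "Sculpture"]),
    ("Mehmet", 92, ["Painting", "Sculpture", "Graphic Design"]),
    ("Elif",   75, ["Sculpture", "Painting", "Graphic Design"]),
    ("Can",    81, ["Painting", "Sculpture", "Graphic Design"]),
    ("Zeynep", 69, ["Graphic Design", "Painting", "Sculpture"]),
]

placements, remaining = {}, dict(capacity)
for name, score, prefs in sorted(students, key=lambda s: s[1], reverse=True):
    for dept in prefs:
        if remaining[dept] > 0:
            placements[name] = dept
            remaining[dept] -= 1
            break
    else:
        placements[name] = None   # no department with free places left

for name, dept in placements.items():
    print(f"{name} -> {dept}")
```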

  3. Student’s rigorous mathematical thinking based on cognitive style

    Science.gov (United States)

    Fitriyani, H.; Khasanah, U.

    2017-12-01

    The purpose of this research was to determine the rigorous mathematical thinking (RMT) of mathematics education students in solving math problems in terms of reflective and impulsive cognitive styles. The research used a descriptive qualitative approach. The subjects were 4 students, one male and one female for each of the reflective and impulsive cognitive styles. Data collection techniques included a problem-solving test and interviews. Analysis of the research data used the Miles and Huberman model: data reduction, data presentation, and drawing conclusions. The results showed that the impulsive male subject used all three levels of the cognitive functions required for RMT, namely qualitative thinking, quantitative thinking with precision, and relational thinking, while the other three subjects were only able to use cognitive functions at the qualitative thinking level of RMT. Therefore, the impulsive male subject has a better RMT ability than the other three research subjects.

  4. Rigorous Quantum Field Theory A Festschrift for Jacques Bros

    CERN Document Server

    Monvel, Anne Boutet; Iagolnitzer, Daniel; Moschella, Ugo

    2007-01-01

    Jacques Bros has greatly advanced our present understanding of rigorous quantum field theory through numerous fundamental contributions. This book arose from an international symposium held in honour of Jacques Bros on the occasion of his 70th birthday, at the Department of Theoretical Physics of the CEA in Saclay, France. The impact of the work of Jacques Bros is evident in several articles in this book. Quantum fields are regarded as genuine mathematical objects, whose various properties and relevant physical interpretations must be studied in a well-defined mathematical framework. The key topics in this volume include analytic structures of Quantum Field Theory (QFT), renormalization group methods, gauge QFT, stability properties and extension of the axiomatic framework, QFT on models of curved spacetimes, QFT on noncommutative Minkowski spacetime. Contributors: D. Bahns, M. Bertola, R. Brunetti, D. Buchholz, A. Connes, F. Corbetta, S. Doplicher, M. Dubois-Violette, M. Dütsch, H. Epstein, C.J. Fewster, K....

  5. Desarrollo constitucional, legal y jurisprudencia del principio de rigor subsidiario

    Directory of Open Access Journals (Sweden)

    Germán Eduardo Cifuentes Sandoval

    2013-09-01

    Full Text Available In Colombia, state administration of the environment is the responsibility of the National Environmental System (SINA). SINA is made up of state entities that coexist within a mixed arrangement of centralization and decentralization. SINA's decentralization expresses itself at the administrative and territorial levels, and the entities that function under this structure are expected to act in a coordinated way in order to reach the objectives set out in the national environmental policy. To achieve coordinated environmental administration by the entities that make up SINA, Colombian environmental legislation has included three basic principles: 1. the principle of regional harmony ("armonía regional"); 2. the principle of normative gradation ("gradación normativa"); 3. the principle of subsidiary rigor ("rigor subsidiario"). These principles belong to article 63 of Law 99 of 1993, and although equivalents of the first two can be found in other norms of the Colombian legal system, this is not the case for subsidiary rigor, because its elements are unique to environmental law and are not similar to those that make up the principle of subsidiarity in article 288 of the Political Constitution. Subsidiary rigor gives decentralized entities a special ability to modify the current environmental legislation in order to defend the local ecological patrimony. It is an administrative power, grounded in decentralized autonomy, that allows them to take the place of the regulatory role of the legislative power, on the condition that the new norms be more demanding than those issued at the central level.

  6. Rigorous patient-prosthesis matching of Perimount Magna aortic bioprosthesis.

    Science.gov (United States)

    Nakamura, Hiromasa; Yamaguchi, Hiroki; Takagaki, Masami; Kadowaki, Tasuku; Nakao, Tatsuya; Amano, Atsushi

    2015-03-01

    Severe patient-prosthesis mismatch, defined as effective orifice area index ≤0.65 cm(2) m(-2), has demonstrated poor long-term survival after aortic valve replacement. Reported rates of severe mismatch involving the Perimount Magna aortic bioprosthesis range from 4% to 20% in patients with a small annulus. Between June 2008 and August 2011, 251 patients (mean age 70.5 ± 10.2 years; mean body surface area 1.55 ± 0.19 m(2)) underwent aortic valve replacement with a Perimount Magna bioprosthesis, with or without concomitant procedures. We performed our procedure with rigorous patient-prosthesis matching to implant a valve appropriately sized to each patient, and carried out annular enlargement when a 19-mm valve did not fit. The bioprosthetic performance was evaluated by transthoracic echocardiography predischarge and at 1 and 2 years after surgery. Overall hospital mortality was 1.6%. Only 5 (2.0%) patients required annular enlargement. The mean follow-up period was 19.1 ± 10.7 months with a 98.4% completion rate. Predischarge data showed a mean effective orifice area index of 1.21 ± 0.20 cm(2) m(-2). Moderate mismatch, defined as effective orifice area index ≤0.85 cm(2) m(-2), developed in 4 (1.6%) patients. None developed severe mismatch. Data at 1 and 2 years showed only two cases of moderate mismatch; neither was severe. Rigorous patient-prosthesis matching maximized the performance of the Perimount Magna, and no severe mismatch resulted in this Japanese population of aortic valve replacement patients. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  7. A Generalized Method for the Comparable and Rigorous Calculation of the Polytropic Efficiencies of Turbocompressors

    Science.gov (United States)

    Dimitrakopoulos, Panagiotis

    2018-03-01

    The calculation of polytropic efficiencies is a very important task, especially during the development of new compression units, like compressor impellers, stages and stage groups. Such calculations are also crucial for the determination of the performance of a whole compressor. As processors and computational capacities have improved substantially in recent years, the need emerged for a new, rigorous, robust, accurate and at the same time standardized method for computing polytropic efficiencies, especially one based on the thermodynamics of real gases. The proposed method is based on the rigorous definition of the polytropic efficiency. The input consists of pressure and temperature values at the end points of the compression path (suction and discharge), for a given working fluid. The average relative error for the studied cases was 0.536%. Thus, this high-accuracy method is proposed for efficiency calculations related to turbocompressors and their compression units, especially when they are operating at high power levels, for example in jet engines and high-power plants.
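    Although the paper's own procedure is not reproduced here, the rigorous definition it starts from can be stated compactly: for measured suction state 1 and discharge state 2, the polytropic efficiency is the path integral of v dp along the actual compression path divided by the actual enthalpy rise,

```latex
\eta_{p} = \frac{\displaystyle\int_{1}^{2} v\,\mathrm{d}p}{h_{2} - h_{1}},
```

    where the specific volume v and enthalpy h must be evaluated from a real-gas equation of state; in practice the integral is evaluated numerically by splitting the overall pressure ratio into many small steps.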

  8. Compiling standardized information from clinical practice: using content analysis and ICF Linking Rules in a goal-oriented youth rehabilitation program.

    Science.gov (United States)

    Lustenberger, Nadia A; Prodinger, Birgit; Dorjbal, Delgerjargal; Rubinelli, Sara; Schmitt, Klaus; Scheel-Sailer, Anke

    2017-09-23

    To illustrate how routinely written narrative admission and discharge reports of a rehabilitation program for eight youths with chronic neurological health conditions can be transformed to the International Classification of Functioning, Disability and Health (ICF). First, a qualitative content analysis was conducted by building meaningful units from text segments of the reports assigned to the five elements of the Rehab-Cycle®: goal; assessment; assignment; intervention; evaluation. Second, the meaningful units were then linked to the ICF using the refined ICF Linking Rules. With the first step of the transformation, the emphasis of the narrative reports changed to a process-oriented interdisciplinary layout, revealing three thematic blocks of goals: mobility, self-care, and mental and social functions. The 95 unique linked ICF codes could be grouped into clinically meaningful, goal-centered ICF codes. Between the two independent linkers, the agreement rate was improved after complementing the rules with additional agreements. The ICF Linking Rules can be used to compile standardized health information from narrative reports if these are structured beforehand. The process requires time and expertise. To implement the ICF into common practice, the findings provide the starting point for reporting rehabilitation that builds upon existing practice and adheres to international standards. Implications for Rehabilitation This study provides evidence that routinely collected health information from rehabilitation practice can be transformed to the International Classification of Functioning, Disability and Health by using the "ICF Linking Rules"; however, this requires time and expertise. The Rehab-Cycle®, including assessments, assignments, goal setting, interventions and goal evaluation, serves as a feasible framework for structuring this rehabilitation program and ensures that the complexity of local practice is appropriately reflected. The refined "ICF Linking Rules" lead to a standardized

  9. Failure, The Next Generation: Why Rigorous Standards are not Sufficient to Improve Science Learning

    Directory of Open Access Journals (Sweden)

    Mary Antony Bair

    2014-11-01

    Full Text Available Although many states in the United States are adopting policies that require all students to complete college-preparatory science classes to graduate from high school, such policies have not always led to improved student outcomes. There is much speculation about the cause of the dismal results, but there is scant research on the processes by which the policies are being implemented at the school level, especially in schools that enroll large numbers of historically non-college-bound students. To address this gap in the literature, we conducted a four-year ethnographic case study of policy implementation at one racially and socioeconomically diverse high school in Michigan. Guided by the structuration theory of Anthony Giddens (1984), we gathered and analyzed information from interviews with administrators and science teachers, observations of science classes, and relevant curriculum and policy documents. Our findings reveal the processes and rationales by which a state policy mandating three years of college-preparatory science for all students was implemented at the school. Four years after the policy was implemented, there was little improvement in science outcomes. The main reason for this, we found, was the lack of correspondence between the state policy and local policies developed in response to that state policy.

  10. Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.

    Science.gov (United States)

    Lee, Jae Hee; Jung, Koo Young

    2012-07-01

    Instantaneous rigor, muscle stiffening occurring at the moment of death (or cardiac arrest), can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of quickly securing a surgical airway when trismus occurs. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Memory sparing, fast scattering formalism for rigorous diffraction modeling

    Science.gov (United States)

    Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.

    2017-07-01

    The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
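    The sequential concatenation of per-layer scattering described here is, in its generic form, the Redheffer star product of two scattering matrices. The sketch below (plain numpy, block convention S = [[S11, S12], [S21, S22]] with S11 the reflection for incidence from the left) illustrates that generic operation only; it is not the S-vector algorithm itself.

```python
# Redheffer star product: combine the scattering matrices of two stacked sections A and B.
# Convention: S11 = reflection for left incidence, S21 = transmission left->right,
# S22 = reflection for right incidence, S12 = transmission right->left.
import numpy as np

def star(SA, SB):
    S11A, S12A, S21A, S22A = SA
    S11B, S12B, S21B, S22B = SB
    I = np.eye(S11A.shape[0])
    invA = np.linalg.inv(I - S11B @ S22A)   # resums multiple reflections between sections
    invB = np.linalg.inv(I - S22A @ S11B)
    return (S11A + S12A @ invA @ S11B @ S21A,   # combined S11
            S12A @ invA @ S12B,                 # combined S12
            S21B @ invB @ S21A,                 # combined S21
            S22B + S21B @ invB @ S22A @ S12B)   # combined S22

# Toy 1x1 "sections" with scalar reflection r and transmission t (illustrative values).
def toy(r, t):
    return tuple(np.array([[x]]) for x in (r, t, t, r))   # (S11, S12, S21, S22)

S11, S12, S21, S22 = star(toy(0.2, 0.8), toy(0.1, 0.9))
print("combined reflection:", S11[0, 0], "| combined transmission:", S21[0, 0])
```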

  12. Rigorous Results for the Distribution of Money on Connected Graphs

    Science.gov (United States)

    Lanchier, Nicolas; Reed, Stephanie

    2018-05-01

    This paper is concerned with general spatially explicit versions of three stochastic models for the dynamics of money that have been introduced and studied numerically by statistical physicists: the uniform reshuffling model, the immediate exchange model and the model with saving propensity. All three models consist of systems of economical agents that consecutively engage in pairwise monetary transactions. Computer simulations performed in the physics literature suggest that, when the number of agents and the average amount of money per agent are large, the limiting distribution of money as time goes to infinity approaches the exponential distribution for the first model, the gamma distribution with shape parameter two for the second model and a distribution similar but not exactly equal to a gamma distribution whose shape parameter depends on the saving propensity for the third model. The main objective of this paper is to give rigorous proofs of these conjectures and also extend these conjectures to generalizations of the first two models and a variant of the third model that include local rather than global interactions, i.e., instead of choosing the two interacting agents uniformly at random from the system, the agents are located on the vertex set of a general connected graph and can only interact with their neighbors.
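    The uniform reshuffling model is straightforward to simulate, which is how the exponential limit was first observed numerically. The mean-field (global interaction) sketch below illustrates the model itself; it is not the spatial, graph-based construction that the paper's proofs address.

```python
# Uniform reshuffling model: at each step two randomly chosen agents pool their money
# and split the pooled amount uniformly at random (mean-field version, no graph).
import numpy as np

rng = np.random.default_rng(1)
n_agents, avg_money, n_steps = 1000, 10.0, 200_000

money = np.full(n_agents, avg_money)
for _ in range(n_steps):
    i, j = rng.choice(n_agents, size=2, replace=False)
    pooled = money[i] + money[j]
    u = rng.random()
    money[i], money[j] = u * pooled, (1.0 - u) * pooled

# For an exponential limit the standard deviation should approach the mean.
print("mean:", round(money.mean(), 2), "| std:", round(money.std(), 2))
```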

  13. Rigorous vector wave propagation for arbitrary flat media

    Science.gov (United States)

    Bos, Steven P.; Haffert, Sebastiaan Y.; Keller, Christoph U.

    2017-08-01

    Precise modelling of the (off-axis) point spread function (PSF) to identify geometrical and polarization aberrations is important for many optical systems. In order to characterise the PSF of the system in all Stokes parameters, an end-to-end simulation of the system has to be performed in which Maxwell's equations are rigorously solved. We present the first results of a python code that we are developing to perform multiscale end-to-end wave propagation simulations that include all relevant physics. Currently we can handle plane-parallel near- and far-field vector diffraction effects of propagating waves in homogeneous isotropic and anisotropic materials, refraction and reflection of flat parallel surfaces, interference effects in thin films and unpolarized light. We show that the code has a numerical precision on the order of 10^-16 for non-absorbing isotropic and anisotropic materials. For absorbing materials the precision is on the order of 10^-8. The capabilities of the code are demonstrated by simulating a converging beam reflecting from a flat aluminium mirror at normal incidence.

  14. Dynamics of harmonically-confined systems: Some rigorous results

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Zhigang, E-mail: zwu@physics.queensu.ca; Zaremba, Eugene, E-mail: zaremba@sparky.phy.queensu.ca

    2014-03-15

    In this paper we consider the dynamics of harmonically-confined atomic gases. We present various general results which are independent of particle statistics, interatomic interactions and dimensionality. Of particular interest is the response of the system to external perturbations which can be either static or dynamic in nature. We prove an extended Harmonic Potential Theorem which is useful in determining the damping of the centre of mass motion when the system is prepared initially in a highly nonequilibrium state. We also study the response of the gas to a dynamic external potential whose position is made to oscillate sinusoidally in a given direction. We show in this case that either the energy absorption rate or the centre of mass dynamics can serve as a probe of the optical conductivity of the system.
    Highlights:
    • We derive various rigorous results on the dynamics of harmonically-confined atomic gases.
    • We derive an extension of the Harmonic Potential Theorem.
    • We demonstrate the link between the energy absorption rate in a harmonically-confined system and the optical conductivity.

  15. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    Science.gov (United States)

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.
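
    For orientation, a minimal sketch of one analysis step of a perturbed-observation ensemble Kalman filter, the class of methods discussed above; the toy state, observation operator and noise levels are placeholders and are unrelated to the forecast model constructed in the paper.

      import numpy as np

      rng = np.random.default_rng(2)
      n_state, n_obs, n_ens = 3, 1, 20

      X = rng.standard_normal((n_state, n_ens))       # forecast ensemble (columns = members)
      H = np.array([[1.0, 0.0, 0.0]])                 # observe the first state component
      R = np.array([[0.1]])                           # observation error covariance
      y = np.array([0.5])                             # the observation

      # Sample covariance from ensemble anomalies
      A = X - X.mean(axis=1, keepdims=True)
      P = A @ A.T / (n_ens - 1)

      # Kalman gain and perturbed-observation update of every member
      K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
      Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_ens).T
      X_analysis = X + K @ (Y - H @ X)

      print(X_analysis.mean(axis=1))                  # analysis ensemble mean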

  16. Rigorous derivation of porous-media phase-field equations

    Science.gov (United States)

    Schmuck, Markus; Kalliadasis, Serafim

    2017-11-01

    The evolution of interfaces in Complex heterogeneous Multiphase Systems (CheMSs) plays a fundamental role in a wide range of scientific fields such as thermodynamic modelling of phase transitions, materials science, or as a computational tool for interfacial flow studies or material design. Here, we focus on phase-field equations in CheMSs such as porous media. To the best of our knowledge, we present the first rigorous derivation of error estimates for fourth-order, upscaled, and nonlinear evolution equations. For CheMSs with heterogeneity ε, we obtain the convergence rate ε^(1/4), which governs the error between the solution of the new upscaled formulation and the solution of the microscopic phase-field problem. This error behaviour has recently been validated computationally. Due to the wide range of applications of phase-field equations, we expect this upscaled formulation to allow for new modelling, analytic, and computational perspectives for interfacial transport and phase transformations in CheMSs. This work was supported by EPSRC, UK, through Grant Nos. EP/H034587/1, EP/L027186/1, EP/L025159/1, EP/L020564/1, EP/K008595/1, and EP/P011713/1 and from ERC via Advanced Grant No. 247031.

  17. Rigorous time slicing approach to Feynman path integrals

    CERN Document Server

    Fujiwara, Daisuke

    2017-01-01

    This book proves that Feynman's original definition of the path integral actually converges to the fundamental solution of the Schrödinger equation at least in the short term if the potential is differentiable sufficiently many times and its derivatives of order equal to or higher than two are bounded. The semi-classical asymptotic formula up to the second term of the fundamental solution is also proved by a method different from that of Birkhoff. A bound of the remainder term is also proved. The Feynman path integral is a method of quantization using the Lagrangian function, whereas Schrödinger's quantization uses the Hamiltonian function. These two methods are believed to be equivalent. But equivalence is not fully proved mathematically, because, compared with Schrödinger's method, there is still much to be done concerning rigorous mathematical treatment of Feynman's method. Feynman himself defined a path integral as the limit of a sequence of integrals over finite-dimensional spaces which is obtained by...

  18. Rigorous high-precision enclosures of fixed points and their invariant manifolds

    Science.gov (United States)

    Wittig, Alexander N.

    The well-established concept of Taylor Models is introduced, which offer highly accurate C0 enclosures of functional dependencies, combining high-order polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly non-linear dynamical systems. A method is proposed to extend the existing implementation of Taylor Models in COSY INFINITY from double precision coefficients to arbitrary precision coefficients. Great care is taken to maintain the highest efficiency possible by adaptively adjusting the precision of higher order coefficients in the polynomial expansion. High precision operations are based on clever combinations of elementary floating point operations yielding exact values for round-off errors. An experimental high precision interval data type is developed and implemented. Algorithms for the verified computation of intrinsic functions based on the High Precision Interval datatype are developed and described in detail. The application of these operations in the implementation of High Precision Taylor Models is discussed. An application of Taylor Model methods to the verification of fixed points is presented by verifying the existence of a period-15 fixed point in a near-standard Hénon map. Verification is performed using different verified methods such as double precision Taylor Models, High Precision intervals and High Precision Taylor Models. Results and performance of each method are compared. An automated rigorous fixed point finder is implemented, allowing the fully automated search for all fixed points of a function within a given domain. It returns a list of verified enclosures of each fixed point, optionally verifying uniqueness within these enclosures. An application of the fixed point finder to the rigorous analysis of beam transfer maps in accelerator physics is presented. Previous work done by
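
    A minimal sketch of the idea behind verified interval arithmetic (not the thesis' High Precision Interval type): every elementary operation rounds its endpoints outward so that the exact result is always enclosed. Here math.nextafter (Python 3.9+) is used as a crude stand-in for directed rounding.

      import math
      from dataclasses import dataclass

      @dataclass
      class Interval:
          lo: float
          hi: float

          def __add__(self, other):
              # widen each endpoint by one ulp to absorb floating point rounding error
              return Interval(math.nextafter(self.lo + other.lo, -math.inf),
                              math.nextafter(self.hi + other.hi, math.inf))

          def __mul__(self, other):
              p = [self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi]
              return Interval(math.nextafter(min(p), -math.inf),
                              math.nextafter(max(p), math.inf))

          def contains(self, x):
              return self.lo <= x <= self.hi

      x = Interval(0.1, 0.1) + Interval(0.2, 0.2)   # encloses the exact sum of its endpoints
      print(x, x.contains(0.3))                     # the enclosure also contains 0.3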

  19. Assessment of the Methodological Rigor of Case Studies in the Field of Management Accounting Published in Journals in Brazil

    Directory of Open Access Journals (Sweden)

    Kelly Cristina Mucio Marques

    2015-04-01

    This study aims to assess the methodological rigor of case studies in management accounting published in Brazilian journals. The study is descriptive. The data were collected using documentary research and content analysis, and 180 papers published from 2008 to 2012 in accounting journals rated as A2, B1, and B2 that were classified as case studies were selected. Based on the literature, we established a set of 15 criteria that we expected to be identified (either explicitly or implicitly) in the case studies in order to classify those case studies as appropriate from the standpoint of methodological rigor. These criteria were partially met by the papers analyzed. The aspects least aligned with those proposed in the literature were the following: little emphasis on justifying the need to understand phenomena in context; lack of explanation of the reason for choosing the case study strategy; the predominant use of questions that do not enable deeper analysis; many studies based on only one source of evidence; little use of data and information triangulation; little emphasis on the data collection method; a high number of cases in which confusion between the case study as a research strategy and as a data collection method was detected; a low number of papers reporting the method of data analysis; few reports on a study's contributions; and a minority highlighting the issues requiring further research. In conclusion, the method used to apply case studies to management accounting must be improved, because few studies showed rigorous application of the procedures that this strategy requires.

  20. Stochastic Geometry and Quantum Gravity: Some Rigorous Results

    Science.gov (United States)

    Zessin, H.

    The aim of these lectures is to give a short introduction to some recent developments in stochastic geometry which have one of their origins in simplicial gravity theory (see Regge, Nuovo Cimento 19: 558-571, 1961). The goal is to define and rigorously construct point processes on spaces of Euclidean simplices in such a way that the configurations of these simplices are simplicial complexes. The main interest is then concentrated on their curvature properties. We illustrate certain basic ideas from a mathematical point of view. An excellent representation of this area can be found in Schneider and Weil (Stochastic and Integral Geometry, Springer, Berlin, 2008. German edition: Stochastische Geometrie, Teubner, 2000). In Ambjørn et al. (Quantum Geometry, Cambridge University Press, Cambridge, 1997) you find a beautiful account from the physical point of view. More recent developments in this direction can be found in Ambjørn et al. ("Quantum gravity as sum over spacetimes", Lect. Notes Phys. 807, Springer, Heidelberg, 2010). After an informal axiomatic introduction into the conceptual foundations of Regge's approach, the first lecture recalls the concepts and notations used. It presents the fundamental zero-infinity law of stochastic geometry and the construction of cluster processes based on it. The second lecture presents the main mathematical object, i.e. Poisson-Delaunay surfaces possessing an intrinsic random metric structure. The third and fourth lectures discuss their ergodic behaviour and present the two-dimensional Regge model of pure simplicial quantum gravity. We terminate with the formulation of basic open problems. Proofs are given in detail only in a few cases; in general, the main ideas are developed. Sufficiently complete references are given.

  1. RIGOROUS GEOREFERENCING OF ALSAT-2A PANCHROMATIC AND MULTISPECTRAL IMAGERY

    Directory of Open Access Journals (Sweden)

    I. Boukerch

    2013-04-01

    The exploitation of the full geometric capabilities of High-Resolution Satellite Imagery (HRSI) requires the development of an appropriate sensor orientation model. Several authors have studied this problem; generally there are two categories of geometric models: physical and empirical models. Based on the analysis of the metadata provided with ALSAT-2A, a rigorous pushbroom camera model can be developed. This model has been successfully applied to many very high resolution imagery systems. The relation between the image and ground coordinates, given by the time-dependent collinearity condition involving several coordinate systems, has been tested. The interior orientation parameters must be integrated in the model; they can be estimated from the viewing angles corresponding to the pointing directions of any detector, and these values are derived from cubic polynomials provided in the metadata. The developed model integrates all the necessary elements with 33 unknowns. All the approximate values of the 33 unknown parameters may be derived from the information contained in the metadata files provided with the imagery technical specifications, or they are simply fixed to zero, so the condition equation is linearized and solved using SVD in a least-squares sense in order to correct the initial values using a suitable number of well-distributed GCPs. Using ALSAT-2A images over the town of Toulouse in the south-west of France, three experiments were carried out. The first concerns 2D accuracy analysis using several sets of parameters. The second concerns GCP number and distribution. The third experiment concerns georeferencing the multispectral image by applying the model calculated from the panchromatic image.
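
    A minimal sketch of the adjustment step described above (with random placeholder matrices rather than the 33-parameter ALSAT-2A model): the linearized condition equations are solved for parameter corrections via SVD in a least-squares sense.

      import numpy as np

      rng = np.random.default_rng(3)
      n_obs, n_par = 120, 33                       # e.g. two condition equations per GCP, 33 unknowns
      A = rng.standard_normal((n_obs, n_par))      # linearized design matrix (placeholder)
      b = rng.standard_normal(n_obs)               # observed-minus-computed misclosures (placeholder)

      U, s, Vt = np.linalg.svd(A, full_matrices=False)
      tol = max(A.shape) * np.finfo(float).eps * s[0]
      s_inv = np.where(s > tol, 1.0 / s, 0.0)      # discard near-singular directions
      dx = Vt.T @ (s_inv * (U.T @ b))              # least-squares corrections to the parameters

      # dx is added to the approximate parameter values and the linearization is repeated.
      print(np.linalg.norm(A @ dx - b))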

  2. A rigorous derivation of gravitational self-force

    International Nuclear Information System (INIS)

    Gralla, Samuel E; Wald, Robert M

    2008-01-01

    There is general agreement that the MiSaTaQuWa equations should describe the motion of a 'small body' in general relativity, taking into account the leading order self-force effects. However, previous derivations of these equations have made a number of ad hoc assumptions and/or contain a number of unsatisfactory features. For example, all previous derivations have invoked, without proper justification, the step of 'Lorenz gauge relaxation', wherein the linearized Einstein equation is written in the form appropriate to the Lorenz gauge, but the Lorenz gauge condition is then not imposed, thereby making the resulting equations for the metric perturbation inequivalent to the linearized Einstein equations. (Such a 'relaxation' of the linearized Einstein equations is essential in order to avoid the conclusion that 'point particles' move on geodesics.) In this paper, we analyze the issue of 'particle motion' in general relativity in a systematic and rigorous way by considering a one-parameter family of metrics, g_ab(λ), corresponding to having a body (or black hole) that is 'scaled down' to zero size and mass in an appropriate manner. We prove that the limiting worldline of such a one-parameter family must be a geodesic of the background metric, g_ab(λ = 0). Gravitational self-force, as well as the force due to coupling of the spin of the body to curvature, then arises as a first-order perturbative correction in λ to this worldline. No assumptions are made in our analysis apart from the smoothness and limit properties of the one-parameter family of metrics, g_ab(λ). Our approach should provide a framework for systematically calculating higher order corrections to gravitational self-force, including higher multipole effects, although we do not attempt to go beyond first-order calculations here. The status of the MiSaTaQuWa equations is explained.

  3. Experimental evaluation of rigor mortis. III. Comparative study of the evolution of rigor mortis in different sized muscle groups in rats.

    Science.gov (United States)

    Krompecher, T; Fryc, O

    1978-01-01

    The use of new methods and an appropriate apparatus has allowed us to make successive measurements of rigor mortis and a study of its evolution in the rat. By a comparative examination on the front and hind limbs, we have determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time and this state is maintained for 2 hours. (4) Resolution of rigor mortis is accelerated in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.

  4. How Individual Scholars Can Reduce the Rigor-Relevance Gap in Management Research

    OpenAIRE

    Wolf, Joachim; Rosenberg, Timo

    2012-01-01

    This paper discusses a number of avenues management scholars could follow to reduce the existing gap between scientific rigor and practical relevance without relativizing the importance of the first goal dimension. Such changes are necessary because many management studies do not fully exploit the possibilities to increase their practical relevance while maintaining scientific rigor. We argue that this rigor-relevance gap is not only the consequence of the currently prevailing institutional c...

  5. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.

    Science.gov (United States)

    Meltzer, S J; Auer, J

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions-nearly equimolecular to "physiological" solutions of sodium chloride-are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.

  6. A rigorous semantics for BPMN 2.0 process diagrams

    CERN Document Server

    Kossak, Felix; Geist, Verena; Kubovy, Jan; Natschläger, Christine; Ziebermayr, Thomas; Kopetzky, Theodorich; Freudenthaler, Bernhard; Schewe, Klaus-Dieter

    2015-01-01

    This book provides the most complete formal specification of the semantics of the Business Process Model and Notation 2.0 standard (BPMN) available to date, in a style that is easily understandable for a wide range of readers - not only for experts in formal methods, but e.g. also for developers of modeling tools, software architects, or graduate students specializing in business process management. BPMN - issued by the Object Management Group - is a widely used standard for business process modeling. However, major drawbacks of BPMN include its limited support for organizational modeling, i

  7. Creating standards: Creating illusions?

    DEFF Research Database (Denmark)

    Linneberg, Mai Skjøtt

    written standards may open up for the creation of illusions. These are created when written standards' content is not in accordance with the perception standard adopters and standard users have of the specific practice phenomenon's content. This general theoretical argument is exemplified by the specific...

  8. Cumulative prospect theory and mean variance analysis. A rigorous comparison

    OpenAIRE

    Hens, Thorsten; Mayer, Janos

    2012-01-01

    We compare asset allocations derived for cumulative prospect theory (CPT) based on two different methods: maximizing CPT along the mean-variance efficient frontier and maximizing it without that restriction. We find that with normally distributed returns the difference is negligible. However, using standard asset allocation data of pension funds, the difference is considerable. Moreover, with derivatives like call options the restriction to the mean-variance efficient frontier results in a siza...

  9. Trends in Methodological Rigor in Intervention Research Published in School Psychology Journals

    Science.gov (United States)

    Burns, Matthew K.; Klingbeil, David A.; Ysseldyke, James E.; Petersen-Brown, Shawna

    2012-01-01

    Methodological rigor in intervention research is important for documenting evidence-based practices and has been a recent focus in legislation, including the No Child Left Behind Act. The current study examined the methodological rigor of intervention research in four school psychology journals since the 1960s. Intervention research has increased…

  10. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles.

    Science.gov (United States)

    Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salting chicken breast muscle. The increase in pre-rigor salting level also caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% produced no great differences in the physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study certified the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that the 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle.

  11. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles

    Science.gov (United States)

    Choi, Yun-Sang

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salting chicken breast muscle. The increase in pre-rigor salting level also caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% produced no great differences in the physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study certified the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that the 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle. PMID:26761884

  12. Rigorous bounds on the free energy of electron-phonon models

    NARCIS (Netherlands)

    Raedt, Hans De; Michielsen, Kristel

    1997-01-01

    We present a collection of rigorous upper and lower bounds to the free energy of electron-phonon models with linear electron-phonon interaction. These bounds are used to compare different variational approaches. It is shown rigorously that the ground states corresponding to the sharpest bounds do

  13. The Relationship between Project-Based Learning and Rigor in STEM-Focused High Schools

    Science.gov (United States)

    Edmunds, Julie; Arshavsky, Nina; Glennie, Elizabeth; Charles, Karen; Rice, Olivia

    2016-01-01

    Project-based learning (PjBL) is an approach often favored in STEM classrooms, yet some studies have shown that teachers struggle to implement it with academic rigor. This paper explores the relationship between PjBL and rigor in the classrooms of ten STEM-oriented high schools. Utilizing three different data sources reflecting three different…

  14. Moving beyond Data Transcription: Rigor as Issue in Representation of Digital Literacies

    Science.gov (United States)

    Hagood, Margaret Carmody; Skinner, Emily Neil

    2015-01-01

    Rigor in qualitative research has been based upon criteria of credibility, dependability, confirmability, and transferability. Drawing upon articles published during our editorship of the "Journal of Adolescent & Adult Literacy," we illustrate how the use of digital data in research study reporting may enhance these areas of rigor,…

  15. EarthLabs Modules: Engaging Students In Extended, Rigorous Investigations Of The Ocean, Climate and Weather

    Science.gov (United States)

    Manley, J.; Chegwidden, D.; Mote, A. S.; Ledley, T. S.; Lynds, S. E.; Haddad, N.; Ellins, K.

    2016-02-01

    EarthLabs, envisioned as a national model for high school Earth or Environmental Science lab courses, is adaptable for both undergraduate and middle school students. The collection includes ten online modules that combine to feature a global view of our planet as a dynamic, interconnected system, engaging learners in extended investigations. EarthLabs supports state and national guidelines, including the NGSS, for science content. Four modules directly guide students to discover vital aspects of the oceans, while five other modules incorporate ocean sciences in order to complete an understanding of Earth's climate system. Students gain a broad perspective on the key role oceans play in the fishing industry, droughts, coral reefs, hurricanes, the carbon cycle, and life on land and in the seas, as well as in driving our changing climate, by interacting with scientific research data, manipulating satellite imagery, numerical data, computer visualizations, experiments, and video tutorials. Students explore Earth system processes and build quantitative skills that enable them to objectively evaluate scientific findings for themselves as they move through ordered sequences that guide the learning. As a robust collection, the EarthLabs modules engage students in extended, rigorous investigations allowing a deeper understanding of the ocean, climate and weather. This presentation provides an overview of the ten curriculum modules that comprise the EarthLabs collection developed by TERC and found at http://serc.carleton.edu/earthlabs/index.html. Evaluation data on the effectiveness and use in secondary education classrooms will be summarized.

  16. Effects of soil organic matter content on cadmium toxicity in Eisenia fetida: implications for the use of biomarkers and standard toxicity tests.

    Science.gov (United States)

    Irizar, A; Rodríguez, M P; Izquierdo, A; Cancio, I; Marigómez, I; Soto, M

    2015-01-01

    Bioavailability is affected by soil physicochemical characteristics such as pH and organic matter (OM) content. In addition, OM constitutes the energy source of Eisenia fetida, a well-established model species for soil toxicity assessment. The present work aimed at assessing the effects of changes in OM content on the toxicity of Cd in E. fetida through the measurement of neutral red uptake (NRU) and mortality, growth, and reproduction (Organisation for Economic Co-operation and Development [OECD] Nos. 207 and 222). Complementarily, metallothionein (MT) and catalase transcription levels were measured. To decrease variability inherent to natural soils, artificial soils (Organization for Economic Cooperation and Development 1984) with different OM content (6, 10, and 14%) and spiked with Cd solutions at increasing concentrations were used. Low OM in soil decreased soil ingestion and Cd bioaccumulation but also increased Cd toxicity, causing lower NRU of coelomocytes, 100% mortality, and stronger reproduction impairment, probably due to the lack of energy to maintain protection mechanisms (production of MT). Cd bioaccumulation did not reflect toxicity, and OM played a pivotal role in Cd toxicity. Thus, OM content should be taken into account when using E. fetida in in vivo exposures for soil health assessment.

  17. Onset of rigor mortis is earlier in red muscle than in white muscle.

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H

    2000-01-01

    Rigor mortis is thought to be related to falling ATP levels in muscles postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles in liquid paraffin kept at 37 degrees C or 25 degrees C--two red muscles, red gastrocnemius (RG) and soleus (SO) and one white muscle, white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG both at 37 degrees C and at 25 degrees C even though RG and WG were portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.

  18. Semantically-Rigorous Systems Engineering Modeling Using Sysml and OWL

    Science.gov (United States)

    Jenkins, J. Steven; Rouquette, Nicolas F.

    2012-01-01

    The Systems Modeling Language (SysML) has found wide acceptance as a standard graphical notation for the domain of systems engineering. SysML subsets and extends the Unified Modeling Language (UML) to define conventions for expressing structural, behavioral, and analytical elements, and relationships among them. SysML-enabled modeling tools are available from multiple providers, and have been used for diverse projects in military aerospace, scientific exploration, and civil engineering. The Web Ontology Language (OWL) has found wide acceptance as a standard notation for knowledge representation. OWL-enabled modeling tools are available from multiple providers, as well as auxiliary assets such as reasoners and application programming interface libraries, etc. OWL has been applied to diverse projects in a wide array of fields. While the emphasis in SysML is on notation, SysML inherits (from UML) a semantic foundation that provides for limited reasoning and analysis. UML's partial formalization (FUML), however, does not cover the full semantics of SysML, which is a substantial impediment to developing high confidence in the soundness of any conclusions drawn therefrom. OWL, by contrast, was developed from the beginning on formal logical principles, and consequently provides strong support for verification of consistency and satisfiability, extraction of entailments, conjunctive query answering, etc. This emphasis on formal logic is counterbalanced by the absence of any graphical notation conventions in the OWL standards. Consequently, OWL has had only limited adoption in systems engineering. The complementary strengths and weaknesses of SysML and OWL motivate an interest in combining them in such a way that we can benefit from the attractive graphical notation of SysML and the formal reasoning of OWL. This paper describes an approach to achieving that combination.

  19. High and low rigor temperature effects on sheep meat tenderness and ageing.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J

    2002-02-01

    Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C. The shear force values at the two rigor temperatures were significantly different at each ageing time, and after ageing the values for 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.

  20. Attrition during a randomized controlled trial of reduced nicotine content cigarettes as a proxy for understanding acceptability of nicotine product standards.

    Science.gov (United States)

    Mercincavage, Melissa; Wileyto, E Paul; Saddleson, Megan L; Lochbuehler, Kirsten; Donny, Eric C; Strasser, Andrew A

    2017-06-01

    To determine (1) if nicotine content affects study attrition-a potential behavioral measure of acceptability-in a trial that required compliance with three levels of reduced nicotine content (RNC) cigarettes, and (2) if attrition is associated with subjective and behavioral responses to RNC cigarettes. Secondary analysis of a 35-day, parallel-design, open-label, randomized controlled trial. After a 5-day baseline period, participants were randomized to smoke for three 10-day periods: their preferred brand (control group) or RNC cigarettes with three nicotine levels in a within-subject stepdown (one group: high-moderate-low) or non-stepdown (five groups: high-low-moderate, low-moderate-high, low-high-moderate, moderate-low-high, moderate-high-low) fashion. A single site in Philadelphia, PA, USA. A total of 246 non-treatment-seeking daily smokers [mean age = 39.52, cigarettes per day (CPD) = 20.95, 68.3% white] were recruited from October 2007 to June 2013. The primary outcome was attrition. Key predictors were nicotine content transition and study period. Exploratory predictors were taste and strength subjective ratings, total puff volume and carbon monoxide (CO) boost. Covariates included: age, gender, race, education and nicotine dependence. Overall attrition was 31.3% (n = 77): 24.1% of the control and 25.0% of the stepdown RNC cigarette groups dropped out versus 44.6% of non-stepdown groups (P = 0.006). Compared with controls, attrition odds were 4.5 and 4.7 times greater among smokers transitioning from preferred and the highest RNC cigarettes to the lowest RNC cigarettes, respectively (P = 0.001 and 0.003). Providing more favorable initial taste ratings of study cigarettes decreased attrition odds by 2% (P = 0.012). The majority of participants completed a 35-day trial of varying levels of reduced nicotine content cigarettes. Participant dropout was greater for cigarettes with lower nicotine content and less in smokers reporting more favorable

  1. Rigorous classification and carbon accounting principles for low and Zero Carbon Cities

    International Nuclear Information System (INIS)

    Kennedy, Scott; Sgouridis, Sgouris

    2011-01-01

    A large number of communities, new developments, and regions aim to lower their carbon footprint and aspire to become 'zero carbon' or 'Carbon Neutral.' Yet there are neither clear definitions for the scope of emissions that such a label would address on an urban scale, nor is there a process for qualifying the carbon reduction claims. This paper addresses the question of how to define a zero carbon, Low Carbon, or Carbon Neutral urban development by proposing hierarchical emissions categories with three levels: internal emissions based on the geographical boundary, external emissions directly caused by core municipal activities, and internal or external emissions due to non-core activities. Each level implies a different carbon management strategy (eliminating, balancing, and minimizing, respectively) needed to meet a Net Zero Carbon designation. The trade-offs, implications, and difficulties of implementing carbon debt accounting based upon these definitions are further analyzed.
    Highlights:
    → A gap exists in comprehensive and standardized accounting methods for urban carbon emissions.
    → We propose a comprehensive and rigorous City Framework for Carbon Accounting (CiFCA).
    → CiFCA classifies emissions hierarchically with corresponding carbon management strategies.
    → Adoption of CiFCA allows for meaningful comparisons of claimed performance of eco-cities.

  2. Editorial: Advanced Learning Technologies, Performance Technologies, Open Contents, and Standards - Some Papers from the Best Papers of the Conference ICCE C3 2009

    Directory of Open Access Journals (Sweden)

    Fanny Klett (IEEE Fellow)

    2010-09-01

    This special issue deals with several cutting-edge research outcomes from recent advances in learning technologies. Advanced learning technologies are the composition of various related technologies and concepts, such as (i) internet technologies and mobile technologies, (ii) human and organizational performance/knowledge management, and (iii) underlying trends toward open technology, open content and open education. This editorial note gives an overview of these topics related to advanced learning technologies in order to provide a common framework for the accepted papers in this special issue.

  3. Towards a New Basel Accord with More Rigorous Settlements

    Directory of Open Access Journals (Sweden)

    Petru PRUNEA

    2010-09-01

    The recent financial crisis made the banking sector more vulnerable to shocks. The system was characterised by weaknesses: too much leverage in banking, not enough high-quality capital to absorb losses, and excessive credit growth based on weak underwriting standards and under-pricing of liquidity. This article is about the new Basel III accord and the outlook for this framework. Basel III will be finalized before November 2010 and implemented by the end of 2012. Basel III is going to be implemented in the United States, and all G-20 countries should progressively adopt this capital framework. The Basel Committee on Banking Supervision and national authorities should develop and agree on a global framework for promoting stronger liquidity in financial institutions. The reform programme aims to raise the resilience of the banking sector by promoting more sustainable growth, both in the near term and over the long term. The initiatives of the Basel Committee will develop a set of reforms based on four steps: public consultation, impact assessment, overall calibration and macroeconomic impact assessment over the transition period.

  4. Healthcare market research examined. Relevant, rigorous and highly regulated

    Directory of Open Access Journals (Sweden)

    Bob Douglas

    2011-10-01

    [The abstract of this article is not available. Here are the first sentences of the article. The full text is freely available upon registration] Market research is invariably confused with marketing, but in fact the two disciplines are very different. Put in its simplest terms, marketing is about promotion whilst market research is about understanding. Accordingly, data collected for market research purposes are used in a completely different way to data gathered for marketing, with research practices heavily regulated to ensure high ethical standards. Let's begin with a definition of what, exactly, market research is. According to the ICC/ESOMAR International Code 2007 (a definition also adopted by the European Pharmaceutical Market Research Association), it is: «the systematic gathering and interpretation of information about individuals or organisations using the statistical and analytical methods and techniques of the applied social sciences to gain insight or support decision-making. The identity of respondents will not be revealed to the user of the information without explicit consent and no sales approach will be made to them as a direct result of their having provided information».

  5. Optimal full motion video registration with rigorous error propagation

    Science.gov (United States)

    Dolloff, John; Hottel, Bryant; Doucette, Peter; Theiss, Henry; Jocher, Glenn

    2014-06-01

    Optimal full motion video (FMV) registration is a crucial need for the Geospatial community. It is required for subsequent and optimal geopositioning with simultaneous and reliable accuracy prediction. An overall approach being developed for such registration is presented that models relevant error sources in terms of the expected magnitude and correlation of sensor errors. The corresponding estimator is selected based on the level of accuracy of the a priori information of the sensor's trajectory and attitude (pointing) information, in order to best deal with non-linearity effects. Estimator choices include near real-time Kalman Filters and batch Weighted Least Squares. Registration solves for corrections to the sensor a priori information for each frame. It also computes and makes available a posteriori accuracy information, i.e., the expected magnitude and correlation of sensor registration errors. Both the registered sensor data and its a posteriori accuracy information are then made available to "down-stream" Multi-Image Geopositioning (MIG) processes. An object of interest is then measured on the registered frames and a multi-image optimal solution, including reliable predicted solution accuracy, is then performed for the object's 3D coordinates. This paper also describes a robust approach to registration when a priori information of sensor attitude is unavailable. It makes use of structure-from-motion principles, but does not use standard Computer Vision techniques, such as estimation of the Essential Matrix which can be very sensitive to noise. The approach used instead is a novel, robust, direct search-based technique.
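
    A minimal sketch of a batch weighted least-squares correction step of the kind mentioned above, showing how the a posteriori covariance handed to downstream geopositioning falls out of the adjustment; the matrices are illustrative placeholders, not the sensor error model of the paper.

      import numpy as np

      rng = np.random.default_rng(4)
      n_obs, n_par = 40, 6                    # e.g. per-frame position/attitude corrections
      A = rng.standard_normal((n_obs, n_par)) # linearized observation model (placeholder)
      W = np.diag(np.full(n_obs, 4.0))        # observation weights (inverse variances)
      b = rng.standard_normal(n_obs)          # measured-minus-predicted residuals (placeholder)

      N = A.T @ W @ A                              # normal matrix
      x_hat = np.linalg.solve(N, A.T @ W @ b)      # estimated corrections to the a priori values
      cov_post = np.linalg.inv(N)                  # a posteriori covariance of the corrections

      print(x_hat)
      print(np.sqrt(np.diag(cov_post)))            # 1-sigma predicted accuracy per parameter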

  6. Standard format and content for a licensee physical security plan for the protection of special nuclear material of moderate or low strategic significance - January 1980

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This guide describes the information required in the physical security plan submitted as part of an application for a license to possess, use, or transport special nuclear material (SNM) of moderate strategic significance or 10 kg or more of SNM of low strategic significance and recommends a standard format for presenting the information in an orderly arrangement. This standard format will thus serve as an aid to uniformity and completeness in the preparation and review of the physical protection plan of the license application. This document can also be used as guidance by licensees possessing or transporting less than 10 kg of SNM of low strategic significance in understanding the intent and implementing the requirements of paragraphs 73.67(a), 73.67(f), and 73.67(g) of 10 CFR Part 73

  7. Standard format and content for a licensee physical security plan for the protection of special nuclear material of moderate or low strategic significance (Revision 1, Feb. 1983)

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    This regulatory guide describes the information required in the physical security plan submitted as part of an application for a license to possess, use, or transport Special Nuclear Materials (SNM) of moderate strategic significance or 10 kg or more of SNM of low strategic significance and recommends a standard format for presenting the information in an orderly arrangement. This standard format will thus serve as an aid to uniformity and completeness in the preparation and review of the physical security plan of the license application. This document can also be used as guidance by licensees possessing or transporting less than 10 kg of SNM of low strategic significance in understanding the intent and implementing the requirements of paragraphs 73.67(a), 73.67(f), and 73.67(g) of 10 CFR Part 73

  8. From basic survival analytic theory to a non-standard application

    CERN Document Server

    Zimmermann, Georg

    2017-01-01

    Georg Zimmermann provides a mathematically rigorous treatment of basic survival analytic methods. His emphasis is also placed on various questions and problems, especially with regard to life expectancy calculations arising from a particular real-life dataset on patients with epilepsy. The author shows both the step-by-step analyses of that dataset and the theory the analyses are based on. He demonstrates that one may face serious and sometimes unexpected problems, even when conducting very basic analyses. Moreover, the reader learns that a practically relevant research question may look rather simple at first sight. Nevertheless, compared to standard textbooks, a more detailed account of the theory underlying life expectancy calculations is needed in order to provide a mathematically rigorous framework. Contents: Regression Models for Survival Data; Model Checking Procedures; Life Expectancy. Target groups: researchers, lecturers, and students in the fields of mathematics and statistics; academics and experts work...

  9. Differential rigor development in red and white muscle revealed by simultaneous measurement of tension and stiffness.

    Science.gov (United States)

    Kobayashi, Masahiko; Takemori, Shigeru; Yamaguchi, Maki

    2004-02-10

    Based on the molecular mechanism of rigor mortis, we have proposed that stiffness (elastic modulus evaluated with tension response against minute length perturbations) can be a suitable index of post-mortem rigidity in skeletal muscle. To trace the developmental process of rigor mortis, we measured stiffness and tension in both red and white rat skeletal muscle kept in liquid paraffin at 37 and 25 degrees C. White muscle (in which type IIB fibres predominate) developed stiffness and tension significantly more slowly than red muscle, except for soleus red muscle at 25 degrees C, which showed disproportionately slow rigor development. In each of the examined muscles, stiffness and tension developed more slowly at 25 degrees C than at 37 degrees C. In each specimen, tension always reached its maximum level earlier than stiffness, and then decreased more rapidly and markedly than stiffness. These phenomena may account for the sequential progress of rigor mortis in human cadavers.

  10. Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).

    Science.gov (United States)

    Suzutani, T; Ishibashi, H; Takatori, T

    1978-11-01

    The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.

  11. 75 FR 29732 - Career and Technical Education Program-Promoting Rigorous Career and Technical Education Programs...

    Science.gov (United States)

    2010-05-27

    ... rigorous knowledge and skills in English-language arts and mathematics that employers and colleges expect... specialists and to access the student outcome data needed to meet annual evaluation and reporting requirements...

  12. Rigorous derivation from Landau-de Gennes theory to Ericksen-Leslie theory

    OpenAIRE

    Wang, Wei; Zhang, Pingwen; Zhang, Zhifei

    2013-01-01

    Starting from Beris-Edwards system for the liquid crystal, we present a rigorous derivation of Ericksen-Leslie system with general Ericksen stress and Leslie stress by using the Hilbert expansion method.

  13. Observed communication skills: how do they relate to the consultation content? A nation-wide study of graduate medical students seeing a standardized patient for a first-time consultation in a general practice setting

    Directory of Open Access Journals (Sweden)

    Holen Are

    2007-11-01

    Background: In this study, we wanted to investigate the relationship between background variables, communication skills, and the bio-psychosocial content of a medical consultation in a general practice setting with a standardized patient. Methods: Final-year medical school students (N = 111) carried out a consultation with an actor playing the role of a patient with a specific somatic complaint, psychosocial stressors, and concerns about cancer. Based on videotapes, communication skills and consultation content were scored separately. Results: The mean level of overall communication skills had a significant impact upon the counts of psychosocial issues, the patient's concerns about cancer, and the information and planning parts of the consultation content being addressed. Gender and age had no influence upon the relationship between communication skills and consultation content. Conclusion: Communication skills seem to be important for final-year students' competence in addressing sensitive psychosocial issues and patients' concerns, as well as informing and planning with patients, in a consultation representative of a fairly complex case in general practice. This result should be considered in the design and incorporation of communication skills training as part of the curriculum of medical schools.

  14. Lixiviation of polymer matrix parcels of nuclear wastes in an environment with a low water content with respect to the standard characterisation test

    International Nuclear Information System (INIS)

    Reynaud, Vincent

    1996-01-01

    It is generally admitted that, in a nuclear waste storage site, a possible return of radionuclides towards the biosphere would mainly occur through leaching of coated waste items and their transport by natural waters. The lixiviation properties of coated nuclear wastes are therefore among their most important characteristics. The objective of this research thesis is to compare the activity release of samples of ion exchange polymer coated by a polymer (epoxy or polyester) matrix. Two types of tests have been performed: a standard test (sample immersion in water) and a lysimeter test (simulation of the geological environment by means of glass balls). The lixiviation of tritium-containing water is studied in a 300-day experiment. Modelling the release of tritium-containing water using Fick equations gives good results. Factors influencing the lixiviation of cobalt ions and caesium ions are studied, and the lixiviation of both ions is then modelled [fr]

  15. Reduction of CT beam hardening artefacts of ethylene vinyl alcohol copolymer by variation of the tantalum content: evaluation in a standardized aortic endoleak phantom

    International Nuclear Information System (INIS)

    Treitl, Karla M.; Scherr, Michael; Foerth, Monika; Braun, Franziska; Maxien, Daniel; Treitl, Marcus

    2015-01-01

    Our aim was to develop an aortic stent graft phantom to simulate endoleak treatment and to find a tantalum content (TC) of ethylene-vinyl-alcohol-copolymer that causes fewer computed tomography (CT) beam hardening artefacts, but still allows for fluoroscopic visualization. Ethylene-vinyl-alcohol-copolymer specimens of different TC (10-50 %, and 100 %) were injected in an aortic phantom bearing a stent graft and endoleak cavities with simulated re-perfusion. Fluoroscopic visibility of the ethylene-vinyl-alcohol-copolymer specimens was analyzed. In addition, six radiologists analyzed endoleak visibility, and artefact intensity of ethylene-vinyl-alcohol-copolymer in CT. Reduction of TC significantly decreased CT artefact intensity of ethylene-vinyl-alcohol-copolymer and increased visibility of endoleak re-perfusion (p < 0.000). It also significantly decreased fluoroscopic visibility of ethylene-vinyl-alcohol-copolymer (R = 0.883, p ≤ 0.01), and increased the active embolic volumes prior to visualization (Δ ≥ 40 μl). Ethylene-vinyl-alcohol-copolymer specimens with a TC of 45-50 % exhibited reasonable visibility, a low active embolic volume and a tolerable CT artefact intensity. The developed aortic stent graft phantom allows for a reproducible simulation of embolization of endoleaks. The data suggest a reduction of the TC of ethylene-vinyl-alcohol-copolymer to 45 -50 % of the original, to interfere less with diagnostic imaging in follow-up CT examinations, while still allowing for fluoroscopic visualization. (orig.)

  16. Assessing environmental vulnerability in EIA-The content and context of the vulnerability concept in an alternative approach to standard EIA procedure

    International Nuclear Information System (INIS)

    Kvaerner, Jens; Swensen, Grete; Erikstad, Lars

    2006-01-01

    In the traditional EIA procedure environmental vulnerability is only considered to a minor extent in the early stages when project alternatives are worked out. In Norway, an alternative approach to EIA, an integrated vulnerability model (IVM), emphasising environmental vulnerability and alternatives development in the early stages of EIA, has been tried out in a few pilot cases. This paper examines the content and use of the vulnerability concept in the IVM approach, and discusses the concept in an EIA context. The vulnerability concept is best suited to overview analyses and large scale spatial considerations. The concept is particularly useful in the early stages of EIA when alternatives are designed and screened. By introducing analyses of environmental vulnerability at the start of the EIA process, the environment can be a more decisive issue for the creation of project alternatives as well as improving the basis for scoping. Vulnerability and value aspects should be considered as separate dimensions. There is a need to operate with a specification between general and specific vulnerability. The concept of environmental vulnerability has proven useful in a wide range of disciplines. Different disciplines have different lengths of experience regarding vulnerability. In disciplines such as landscape planning and hydrogeology we find elements suitable as cornerstones in the further development of an interdisciplinary methodology. Further development of vulnerability criteria in different disciplines and increased public involvement in the early stages of EIA are recommended

  17. Changes in the contractile state, fine structure and metabolism of cardiac muscle cells during the development of rigor mortis.

    Science.gov (United States)

    Vanderwee, M A; Humphrey, S M; Gavin, J B; Armiger, L C

    1981-01-01

    Transmural slices from the left anterior papillary muscle of dog hearts were maintained for 120 min in a moist atmosphere at 37 degrees C. At 15-min intervals tissue samples were taken for estimation of adenosine triphosphate (ATP) and glucose-6-phosphate (G6P) and for electron microscopic examination. At the same time the deformability under standard load of comparable regions of an adjacent slice of tissue was measured. ATP levels fell rapidly during the first 45 to 75 min after excision of the heart. During a subsequent further decline in ATP, the mean deformability of myocardium fell from 30 to 12% indicating the development of rigor mortis. Conversely, G6P levels increased during the first decline in adenosine triphosphate but remained relatively steady thereafter. Whereas many of the myocardial cells fixed after 5 min contracted on contact with glutaraldehyde, all cells examined after 15 to 40 min were relaxed. A progressive increase in the proportion of contracted cells was observed during the rapid increase in myocardial rigidity. During this late contraction the cells showed morphological evidence of irreversible injury. These findings suggest that ischaemic myocytes contract just before actin and myosin become strongly linked to maintain the state of rigor mortis.

  18. Bounding Averages Rigorously Using Semidefinite Programming: Mean Moments of the Lorenz System

    Science.gov (United States)

    Goluskin, David

    2018-04-01

    We describe methods for proving bounds on infinite-time averages in differential dynamical systems. The methods rely on the construction of nonnegative polynomials with certain properties, similarly to the way nonlinear stability can be proved using Lyapunov functions. Nonnegativity is enforced by requiring the polynomials to be sums of squares, a condition which is then formulated as a semidefinite program (SDP) that can be solved computationally. Although such computations are subject to numerical error, we demonstrate two ways to obtain rigorous results: using interval arithmetic to control the error of an approximate SDP solution, and finding exact analytical solutions to relatively small SDPs. Previous formulations are extended to allow for bounds depending analytically on parametric variables. These methods are illustrated using the Lorenz equations, a system with three state variables (x, y, z) and three parameters (β, σ, r). Bounds are reported for infinite-time averages of all eighteen moments x^l y^m z^n up to quartic degree that are symmetric under (x, y) ↦ (−x, −y). These bounds apply to all solutions regardless of stability, including chaotic trajectories, periodic orbits, and equilibrium points. The analytical approach yields two novel bounds that are sharp: the mean of z^3 can be no larger than its value of (r-1)^3 at the nonzero equilibria, and the mean of xy^3 must be nonnegative. The interval arithmetic approach is applied at the standard chaotic parameters to bound eleven average moments that all appear to be maximized on the shortest periodic orbit. Our best upper bound on each such average exceeds its value on the maximizing orbit by less than 1%. Many bounds reported here are much tighter than would be possible without computer assistance.
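
    As a quick plausibility check on such bounds, the averages in question can be estimated by time-averaging a single long trajectory at the standard chaotic parameters. The sketch below is only a rough numerical illustration, not the paper's sum-of-squares/SDP machinery; the parameter values, integration span and tolerances are our own assumptions.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Lorenz system at the standard chaotic parameters (sigma, r, beta values assumed here).
    SIGMA, R, BETA = 10.0, 28.0, 8.0 / 3.0

    def lorenz(t, s):
        x, y, z = s
        return [SIGMA * (y - x), x * (R - z) - y, x * y - BETA * z]

    # Integrate on a uniform time grid, then discard an initial transient before averaging.
    t_eval = np.arange(0.0, 500.0, 0.01)
    sol = solve_ivp(lorenz, (0.0, 500.0), [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-9, atol=1e-9)
    x, y, z = sol.y[:, sol.t > 50.0]

    # Empirical time averages to compare against the reported bounds:
    # mean(z^3) <= (r - 1)^3 = 19683 and mean(x*y^3) >= 0 at these parameters.
    print("mean z^3  :", np.mean(z**3))
    print("mean x*y^3:", np.mean(x * y**3))
    ```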

  19. Double-standards in reporting of risk and responsibility for sexual health: a qualitative content analysis of negatively toned UK newsprint articles.

    Science.gov (United States)

    Martin, Susan P; McDaid, Lisa M; Hilton, Shona

    2014-08-04

    The need to challenge messages that reinforce harmful negative discourses around sexual risk and responsibility is a priority in improving sexual health. The mass media are an important source of information, regularly alerting, updating and influencing public opinions, and the way in which sexual health issues are framed may play a crucial role in shaping expectations of who is responsible for sexual health risks and healthy sexual practices. We conducted an in-depth, qualitative analysis of 85 negatively toned newspaper articles reporting on sexual health topics to examine how risk and responsibility have been framed within these in relation to gender. Articles published in 2010 in seven UK and three Scottish national newspapers were included. A latent content analysis approach was taken, focusing on interpreting the underlying meaning of the text. A key theme in the articles was men being framed as a risk to women's sexual health, whilst it was part of a woman's role to "resist" men's advances. Such discourses tended to portray a power imbalance in sexual relationships between women and men. A number of articles argued that it was women who needed to take more responsibility for sexual health. Articles repeatedly suggested that women, and teenage girls in particular, lacked the skills and confidence to negotiate safer sex, and sex education programmes were often presented as having failed. Men were frequently portrayed as being more promiscuous and engaging in more risky sexual health behaviours than women, yet just one article drew attention to the lack of focus on male responsibility for sexual health. Gay men were used as a benchmark against which rates were measured and were framed as being both a risk and at risk. The framing of men as a risk to women, whilst women are presented at the same time as responsible for patrolling sexual encounters, organising contraception and preventing sexual ill health, reinforces gender stereotypes and undermines efforts to promote a

  20. Rigor or Reliability and Validity in Qualitative Research: Perspectives, Strategies, Reconceptualization, and Recommendations.

    Science.gov (United States)

    Cypress, Brigitte S

    The persistent concern with achieving rigor in qualitative research still raises issues in the 21st century. There is also a continuing debate about the analogous terms reliability and validity in naturalistic inquiries as opposed to quantitative investigations. This article presents the concept of rigor in qualitative research using a phenomenological study as an exemplar to illustrate the process. Elaborating on epistemological and theoretical conceptualizations by Lincoln and Guba, strategies congruent with the qualitative perspective for ensuring validity and establishing the credibility of a study are described. A synthesis of the historical development of validity criteria evident in the literature over the years is explored. Recommendations are made for use of the term rigor instead of trustworthiness and for the reconceptualization and renewed use of the concepts of reliability and validity in qualitative research; strategies for ensuring rigor must be built into the qualitative research process rather than evaluated only after the inquiry; and qualitative researchers and students alike must be proactive and take responsibility for ensuring the rigor of a research study. The insights garnered here will move novice researchers and doctoral students toward a better conceptual grasp of the complexity of reliability and validity and their ramifications for qualitative inquiry.

  1. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    Science.gov (United States)

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation.

    Science.gov (United States)

    Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A

    2015-01-08

    science. Despite numerous constructs having greater than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.

  3. The effect of temperature on the mechanical aspects of rigor mortis in a liquid paraffin model.

    Science.gov (United States)

    Ozawa, Masayoshi; Iwadate, Kimiharu; Matsumoto, Sari; Asakura, Kumiko; Ochiai, Eriko; Maebashi, Kyoko

    2013-11-01

    Rigor mortis is an important phenomenon to estimate the postmortem interval in forensic medicine. Rigor mortis is affected by temperature. We measured stiffness of rat muscles using a liquid paraffin model to monitor the mechanical aspects of rigor mortis at five temperatures (37, 25, 10, 5 and 0°C). At 37, 25 and 10°C, the progression of stiffness was slower in cooler conditions. At 5 and 0°C, the muscle stiffness increased immediately after the muscles were soaked in cooled liquid paraffin and then muscles gradually became rigid without going through a relaxed state. This phenomenon suggests that it is important to be careful when estimating the postmortem interval in cold seasons. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  4. Quality properties of pre- and post-rigor beef muscle after interventions with high frequency ultrasound.

    Science.gov (United States)

    Sikes, Anita L; Mawson, Raymond; Stark, Janet; Warner, Robyn

    2014-11-01

    The delivery of a consistent quality product to the consumer is vitally important for the food industry. The aim of this study was to investigate the effect of high frequency ultrasound applied to pre- and post-rigor beef muscle on metabolism and subsequent quality. High frequency ultrasound (600 kHz at 48 kPa and 65 kPa acoustic pressure) applied to post-rigor beef striploin steaks resulted in no significant effect on the texture (peak force value) of cooked steaks as measured by a Tenderometer. There was no added benefit of ultrasound treatment above that of the normal ageing process after ageing of the steaks for 7 days at 4 °C. Ultrasound treatment of post-rigor beef steaks resulted in a darkening of fresh steaks, but after ageing for 7 days at 4 °C the ultrasound-treated steaks were similar in colour to the aged, untreated steaks. High frequency ultrasound (2 MHz at 48 kPa acoustic pressure) applied to pre-rigor beef neck muscle had no effect on the pH, but the calculated exhaustion factor suggested that there was some effect on metabolism and actin-myosin interaction. However, the resultant texture of cooked, ultrasound-treated muscle was lower in tenderness compared to the control sample. After ageing for 3 weeks at 0 °C, the ultrasound-treated samples had the same peak force value as the control. High frequency ultrasound had no significant effect on the colour parameters of pre-rigor beef neck muscle. This proof-of-concept study showed no effect of ultrasound on quality but did indicate that the application of high frequency ultrasound to pre-rigor beef muscle shows potential for modifying ATP turnover, and further investigation is warranted. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  5. Electrocardiogram artifact caused by rigors mimicking narrow complex tachycardia: a case report.

    Science.gov (United States)

    Matthias, Anne Thushara; Indrakumar, Jegarajah

    2014-02-04

    The electrocardiogram (ECG) is useful in the diagnosis of cardiac and non-cardiac conditions. Rigors due to shivering can cause electrocardiogram artifacts mimicking various cardiac rhythm abnormalities. We describe an 80-year-old Sri Lankan man with an abnormal electrocardiogram mimicking narrow complex tachycardia during the immediate post-operative period. Electrocardiogram changes caused by muscle tremor during rigors could mimic a narrow complex tachycardia. Identification of muscle tremor as a cause of electrocardiogram artifact can avoid unnecessary pharmacological and non-pharmacological intervention to prevent arrhythmias.

  6. Increased scientific rigor will improve reliability of research and effectiveness of management

    Science.gov (United States)

    Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.

    2018-01-01

    Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and

  7. Causality as a Rigorous Notion and Quantitative Causality Analysis with Time Series

    Science.gov (United States)

    Liang, X. S.

    2017-12-01

    Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Here we show that this important and challenging question (one of the major challenges in the science of big data), which is of interest in a wide variety of disciplines, has a positive answer. Particularly, for linear systems, the maximal likelihood estimator of the causality from a series X2 to another series X1, written T2→1, turns out to be concise in form: T2→1 = (C11 C12 C2,d1 − C12² C1,d1) / (C11² C22 − C11 C12²), where Cij (i, j = 1, 2) is the sample covariance between Xi and Xj, and Ci,dj is the covariance between Xi and ΔXj/Δt, the difference approximation of dXj/dt using the Euler forward scheme. An immediate corollary is that causation implies correlation, but not vice versa, resolving the long-standing debate over causation versus correlation. The above formula has been validated with touchstone series purportedly generated with one-way causality that evades the classical approaches such as the Granger causality test and transfer entropy analysis. It has also been applied successfully to the investigation of many real problems. Through a simple analysis with the stock series of IBM and GE, an unusually strong one-way causality is identified from the former to the latter in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a "Giant" for the computer market. Another example presented here regards the cause-effect relation between the two climate modes, El Niño and the Indian Ocean Dipole (IOD). In general, these modes are mutually causal, but the causality is asymmetric. To El Niño, the information flowing from the IOD manifests itself as a propagation of uncertainty from the Indian Ocean. In the third example, an unambiguous one-way causality is found between CO2 and the global mean temperature anomaly. While it is confirmed that CO2 indeed drives the recent global warming
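
    Because the estimator is given in closed form in terms of sample covariances, it can be computed directly from two series. The sketch below is an illustrative implementation of the quoted formula only, not the author's code; the function name, the unit sampling interval and the toy autoregressive example are our own assumptions.

    ```python
    import numpy as np

    def liang_causality(x1, x2, dt=1.0):
        """Closed-form estimate of the information flow T(2 -> 1) from series x2 to
        series x1, following the covariance formula quoted in the abstract.
        Illustrative sketch only; assumes uniformly sampled series of equal length."""
        x1 = np.asarray(x1, dtype=float)
        x2 = np.asarray(x2, dtype=float)
        dx1 = (x1[1:] - x1[:-1]) / dt          # Euler-forward approximation of dX1/dt
        x1t, x2t = x1[:-1], x2[:-1]            # align the series with the difference

        def cov(a, b):
            return np.mean((a - a.mean()) * (b - b.mean()))

        c11, c22, c12 = cov(x1t, x1t), cov(x2t, x2t), cov(x1t, x2t)
        c1d1, c2d1 = cov(x1t, dx1), cov(x2t, dx1)
        num = c11 * c12 * c2d1 - c12**2 * c1d1
        den = c11**2 * c22 - c11 * c12**2
        return num / den

    # Toy check on a hypothetical autoregressive pair in which x2 drives x1 but not vice versa.
    rng = np.random.default_rng(0)
    n = 20000
    x1, x2 = np.zeros(n), np.zeros(n)
    for k in range(n - 1):
        x2[k + 1] = 0.9 * x2[k] + rng.normal()
        x1[k + 1] = 0.5 * x1[k] + 0.4 * x2[k] + rng.normal()

    print("T(2->1):", liang_causality(x1, x2))   # expected to be clearly non-zero
    print("T(1->2):", liang_causality(x2, x1))   # expected to be close to zero
    ```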

  8. Experimental evaluation of rigor mortis. VIII. Estimation of time since death by repeated measurements of the intensity of rigor mortis on rats.

    Science.gov (United States)

    Krompecher, T

    1994-10-21

    The development of the intensity of rigor mortis was monitored in nine groups of rats. The measurements were initiated after 2, 4, 5, 6, 8, 12, 15, 24, and 48 h post mortem (p.m.) and lasted 5-9 h, which ideally should correspond to the usual procedure after the discovery of a corpse. The experiments were carried out at an ambient temperature of 24 degrees C. Measurements initiated early after death resulted in curves with a rising portion, a plateau, and a descending slope. Delaying the initial measurement translated into shorter rising portions, and curves initiated 8 h p.m. or later are comprised of a plateau and/or a downward slope only. Three different phases were observed suggesting simple rules that can help estimate the time since death: (1) if an increase in intensity was found, the initial measurements were conducted not later than 5 h p.m.; (2) if only a decrease in intensity was observed, the initial measurements were conducted not earlier than 7 h p.m.; and (3) at 24 h p.m., the resolution is complete, and no further changes in intensity should occur. Our results clearly demonstrate that repeated measurements of the intensity of rigor mortis allow a more accurate estimation of the time since death of the experimental animals than the single measurement method used earlier. A critical review of the literature on the estimation of time since death on the basis of objective measurements of the intensity of rigor mortis is also presented.
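
    The three phases described above amount to a simple decision rule for repeated measurements on the same body. Purely as an illustration, a minimal sketch follows; the function name and return strings are invented, and the hour thresholds are those reported for rats at 24 °C, not values validated for human casework.

    ```python
    def interpret_rigor_series(intensities):
        """Apply the three rules quoted in the abstract to repeated rigor-intensity
        measurements taken on the same subject (earliest measurement first).
        Illustrative only: the hour thresholds come from rat experiments at 24 degrees C,
        not from validated human casework."""
        pairs = list(zip(intensities, intensities[1:]))
        increased = any(b > a for a, b in pairs)
        decreased = any(b < a for a, b in pairs)

        if increased:
            return "initial measurement likely made no later than ~5 h post mortem"
        if decreased:
            return "initial measurement likely made no earlier than ~7 h post mortem"
        return "no change in intensity: plateau, or resolution complete (~24 h post mortem or later)"

    print(interpret_rigor_series([2.0, 3.5, 3.6, 3.1]))   # rising portion present
    print(interpret_rigor_series([4.0, 3.2, 2.5]))        # decrease only
    ```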

  9. Standards, Assessments & Opting Out, Spring 2015

    Science.gov (United States)

    Advance Illinois, 2015

    2015-01-01

    In the spring, Illinois students will take new state assessments that reflect the rigor and relevance of the new Illinois Learning Standards. But some classmates will sit out and join the pushback against standardized testing. Opt-out advocates raise concerns about over-testing, and the resulting toll on students as well as the impact on classroom…

  10. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    Science.gov (United States)

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone meat was significantly more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  11. In rats fed high-energy diets, taste, rather than fat content, is the key factor increasing food intake: a comparison of a cafeteria and a lipid-supplemented standard diet

    Directory of Open Access Journals (Sweden)

    Laia Oliva

    2017-09-01

    Full Text Available Background: Food selection and ingestion, both in humans and rodents, is often a critical factor in determining excess energy intake and its related disorders. Methods: Two different concepts of high-fat diets were tested for their obesogenic effects in rats; in both cases, lipids constituted about 40% of energy intake. The main difference from controls fed standard lab chow was, precisely, the lipid content. Cafeteria diets (K) were self-selected diets devised to be desirable to the rats, mainly because of their diverse mix of tastes, particularly salty and sweet. This diet was compared with another, more classical high-fat (HF) diet, devised not to be as tasty as K and prepared by supplementing standard chow pellets with fat. We also analysed the influence of sex on the effects of the diets. Results: K rats grew faster because of a high lipid, sugar and protein intake, especially the males, while females showed lower weight but a higher proportion of body lipid. In contrast, the weights of the HF groups were not different from controls. Individual nutrient intakes were analysed, and we found that K rats ingested large amounts of both disaccharides and salt, with scant differences in the proportions of other nutrients between the three groups. The results suggest that the key differential factor of the diet eliciting excess energy intake was the massive presence of sweet- and salty-tasting food. Conclusions: The significant presence of sugar and salt appears to be a powerful inducer of excess food intake, more effective than a simple (albeit large) increase in the diet's lipid content. These effects appeared after a relatively short treatment. The differential effects of sex agree with the sexes' different hedonic and obesogenic responses to diet.

  12. Rigorous lower bound on the dynamic critical exponent of some multilevel Swendsen-Wang algorithms

    International Nuclear Information System (INIS)

    Li, X.; Sokal, A.D.

    1991-01-01

    We prove the rigorous lower bound z_exp ≥ α/ν for the dynamic critical exponent of a broad class of multilevel (or "multigrid") variants of the Swendsen-Wang algorithm. This proves that such algorithms do suffer from critical slowing down. We conjecture that such algorithms in fact lie in the same dynamic universality class as the standard Swendsen-Wang algorithm

  13. Rigorous approximation of stationary measures and convergence to equilibrium for iterated function systems

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Monge, Maurizio; Nisoli, Isaia

    2016-01-01

    We study the problem of the rigorous computation of the stationary measure and of the rate of convergence to equilibrium of an iterated function system described by a stochastic mixture of two or more dynamical systems that are either all uniformly expanding on the interval or all contracting. In the expanding case, the associated transfer operators satisfy a Lasota–Yorke inequality, and we show how to compute a rigorous approximation of the stationary measure in the L¹ norm together with an estimate for the rate of convergence. The rigorous computation requires a computer-aided proof of the contraction of the transfer operators for the maps, and we show that this property propagates to the transfer operators of the IFS. In the contracting case we perform a rigorous approximation of the stationary measure in the Wasserstein–Kantorovich distance, with a rate of convergence, using the same functional analytic approach. We show that a finite computation can produce a realistic estimate of all contraction rates over the whole parameter space. We conclude with a description of the implementation and numerical experiments. (paper)
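
    For intuition only, and as a non-rigorous counterpart to the certified functional-analytic computation described above, the stationary measure of a contracting IFS can be approximated empirically by random iteration. In the sketch below the two affine maps, their probabilities, the burn-in and the bin count are arbitrary illustrative choices, and no error bounds are produced.

    ```python
    import numpy as np

    # A stochastic mixture of two contracting affine maps on [0, 1]; the maps, their
    # probabilities, the burn-in and the bin count are illustrative choices, not the paper's.
    maps = [lambda x: 0.5 * x, lambda x: 0.5 * x + 0.5]
    probs = [0.3, 0.7]

    rng = np.random.default_rng(1)
    x = 0.5
    samples = []
    for k in range(200000):
        x = maps[rng.choice(2, p=probs)](x)   # pick one map at random and apply it
        if k > 1000:                          # discard a burn-in before recording
            samples.append(x)

    # Empirical (non-rigorous) histogram approximation of the stationary measure.
    hist, edges = np.histogram(samples, bins=64, range=(0.0, 1.0), density=True)
    print("empirical stationary density, first few bins:", hist[:5])
    ```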

  14. Double phosphorylation of the myosin regulatory light chain during rigor mortis of bovine Longissimus muscle.

    Science.gov (United States)

    Muroya, Susumu; Ohnishi-Kameyama, Mayumi; Oe, Mika; Nakajima, Ikuyo; Shibata, Masahiro; Chikuni, Koichi

    2007-05-16

    To investigate changes in myosin light chains (MyLCs) during postmortem aging of the bovine longissimus muscle, we performed two-dimensional gel electrophoresis followed by identification with matrix-assisted laser desorption ionization time-of-flight mass spectrometry. The results of fluorescent differential gel electrophoresis showed that two spots of the myosin regulatory light chain (MyLC2) at pI values of 4.6 and 4.7 shifted toward those at pI values of 4.5 and 4.6, respectively, by 24 h postmortem when rigor mortis was completed. Meanwhile, the MyLC1 and MyLC3 spots did not change during the 14 days postmortem. Phosphoprotein-specific staining of the gels demonstrated that the MyLC2 proteins at pI values of 4.5 and 4.6 were phosphorylated. Furthermore, possible N-terminal region peptides containing one and two phosphoserine residues were detected in each mass spectrum of the MyLC2 spots at pI values of 4.5 and 4.6, respectively. These results demonstrated that MyLC2 became doubly phosphorylated during rigor formation of the bovine longissimus, suggesting involvement of the MyLC2 phosphorylation in the progress of beef rigor mortis. Bovine; myosin regulatory light chain (RLC, MyLC2); phosphorylation; rigor mortis; skeletal muscle.

  15. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    Energy Technology Data Exchange (ETDEWEB)

    Botelho, Luiz C.L. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Dept. de Matematica Aplicada]. E-mail: botelho.luiz@superig.com.br

    2008-07-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R∞, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  16. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    International Nuclear Information System (INIS)

    Botelho, Luiz C.L.

    2008-01-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R∞, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  17. Beyond the RCT: Integrating Rigor and Relevance to Evaluate the Outcomes of Domestic Violence Programs

    Science.gov (United States)

    Goodman, Lisa A.; Epstein, Deborah; Sullivan, Cris M.

    2018-01-01

    Programs for domestic violence (DV) victims and their families have grown exponentially over the last four decades. The evidence demonstrating the extent of their effectiveness, however, often has been criticized as stemming from studies lacking scientific rigor. A core reason for this critique is the widespread belief that credible evidence can…

  18. A plea for rigorous conceptual analysis as central method in transnational law design

    NARCIS (Netherlands)

    Rijgersberg, R.; van der Kaaij, H.

    2013-01-01

    Although shared problems are generally easily identified in transnational law design, it is considerably more difficult to design frameworks that transcend the peculiarities of local law in a univocal fashion. The following exposition is a plea for giving more prominence to rigorous conceptual analysis.

  19. College Readiness in California: A Look at Rigorous High School Course-Taking

    Science.gov (United States)

    Gao, Niu

    2016-01-01

    Recognizing the educational and economic benefits of a college degree, education policymakers at the federal, state, and local levels have made college preparation a priority. There are many ways to measure college readiness, but one key component is rigorous high school coursework. California has not yet adopted a statewide college readiness…

  20. A comparative study of 129I content in environmental standard materials IAEA-375, NIST SRM 4354 and NIST SRM 4357 by Thermal Ionization Mass Spectrometry and Accelerator Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Olson, John; Adamic, Mary; Snyder, Darin; Brookhart, Jacob; Hahn, Paula; Watrous, Matthew

    2016-11-01

    Environmental iodine measurements have consistently been backed up in the literature by standard materials such as IAEA-375, Chernobyl Soil. There are few other sources of a certified reference material for 129I content for mass spectrometry measurements. Those found in the literature include NIST SRM 4354 and NIST SRM 4357, which are still available at the time of this writing but do not have certified 129I content or isotopic values. Some work in the literature shows that iodine is present, but not enough to establish consensus values. These materials have been analyzed at INL by two separate mass spectrometry techniques: combustion of the starting material in oxygen followed by TIMS analysis, and a leaching preparation analyzed by accelerator mass spectrometry. Combustion/TIMS preparation of NIST SRM 4354 resulted in a 129I/127I ratio of 1.92 × 10⁻⁶, which agrees with the AMS measurement of 1.93 × 10⁻⁶.

  1. Guidelines for data content standards for Africa.

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2008-04-01

    Full Text Available [Indexed excerpt: a fragment of the guideline's feature-type catalogue, listing water, wastewater and pipeline features such as wastewater and storm sewer pump points and stations, reservoirs, water towers, water tanks, septic tanks, water supply and meter points, industrial waste and wastewater treatment plants, pipeline/cable couplings, and fuel pump, source and tank sites.]

  2. Guidelines for data content standards for Africa

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2005-04-01

    Full Text Available [Indexed excerpt: a fragment of the guideline's feature-type catalogue, listing water, wastewater and pipeline features such as pump stations and ejector points, storm sewer and wastewater pump points, reservoirs, water towers, water tanks, septic tanks, water supply points, compressed air and overhead/submerged pipelines, pipeline/cable couplings, and fuel pump, source and tank sites.]

  3. Evaluation of physical dimension changes as nondestructive measurements for monitoring rigor mortis development in broiler muscles.

    Science.gov (United States)

    Cavitt, L C; Sams, A R

    2003-07-01

    Studies were conducted to develop a non-destructive method for monitoring the rate of rigor mortis development in poultry and to evaluate the effectiveness of electrical stimulation (ES). In the first study, 36 male broilers in each of two trials were processed at 7 wk of age. After being bled, half of the birds received electrical stimulation (400 to 450 V, 400 to 450 mA, for seven pulses of 2 s on and 1 s off), and the other half were designated as controls. At 0.25 and 1.5 h postmortem (PM), carcasses were evaluated for the angles of the shoulder, elbow, and wing tip and the distance between the elbows. Breast fillets were harvested at 1.5 h PM (after chilling) from all carcasses. Fillet samples were excised and frozen for later measurement of pH and R-value, and the remainder of each fillet was held on ice until 24 h postmortem. Shear value and pH means were significantly lower, and R-value means significantly higher, in stimulated carcasses, indicating acceleration of rigor mortis by ES. The physical dimensions of the shoulder and elbow changed significantly with rigor mortis development and with ES. These results indicate that physical measurements of the wings may be useful as a nondestructive indicator of rigor development and for monitoring the effectiveness of ES. In the second study, 60 male broilers in each of two trials were processed at 7 wk of age. At 0.25, 1.5, 3.0, and 6.0 h PM, carcasses were evaluated for the distance between the elbows. At each time point, breast fillets were harvested from each carcass. Fillet samples were excised and frozen for later measurement of pH and sarcomere length, whereas the remainder of each fillet was held on ice until 24 h PM. Shear value and pH means decreased significantly with rigor mortis development. Elbow distance also decreased significantly with rigor development and was correlated with rigor mortis development in broiler carcasses.

  4. 77 FR 53224 - Coastal and Marine Ecological Classification Standard

    Science.gov (United States)

    2012-08-31

    ... develop and test the standard. CMECS has been applied in projects in a variety of geographies. A rigorous... components allows users to apply CMECS to the scale and specificity that best suits their needs. Modifiers...

  5. NASA Goddard Space Flight Center presents Enhancing Standards Based Science Curriculum through NASA Content Relevancy: A Model for Sustainable Teaching-Research Integration Dr. Robert Gabrys, Raquel Marshall, Dr. Evelina Felicite-Maurice, Erin McKinley

    Science.gov (United States)

    Marshall, R. H.; Gabrys, R.

    2016-12-01

    NASA Goddard Space Flight Center has developed a systemic educator professional development model for the integration of NASA climate change resources into the K-12 classroom. The desired outcome of this model is to prepare teachers in STEM disciplines to be globally engaged and knowledgeable of current climate change research and its potential for content relevancy alignment to standard-based curriculum. The application and mapping of the model is based on the state education needs assessment, alignment to the Next Generation Science Standards (NGSS), and implementation framework developed by the consortium of district superintendents and their science supervisors. In this presentation, we will demonstrate best practices for extending the concept of inquiry-based and project-based learning through the integration of current NASA climate change research into curriculum unit lessons. This model includes a significant teacher development component focused on capacity development for teacher instruction and pedagogy aimed at aligning NASA climate change research to related NGSS student performance expectations and subsequent Crosscutting Concepts, Science and Engineering Practices, and Disciplinary Core Ideas, a need that was presented by the district steering committee as critical for ensuring sustainability and high-impact in the classroom. This model offers a collaborative and inclusive learning community that connects classroom teachers to NASA climate change researchers via an ongoing consultant/mentoring approach. As a result of the first year of implementation of this model, Maryland teachers are implementing NGSS unit lessons that guide students in open-ended research based on current NASA climate change research.

  6. The Challenge of Timely, Responsive and Rigorous Ethics Review of Disaster Research: Views of Research Ethics Committee Members.

    Directory of Open Access Journals (Sweden)

    Matthew Hunt

    Full Text Available Research conducted following natural disasters such as earthquakes, floods or hurricanes is crucial for improving relief interventions. Such research, however, poses ethical, methodological and logistical challenges for researchers. Oversight of disaster research also poses challenges for research ethics committees (RECs, in part due to the rapid turnaround needed to initiate research after a disaster. Currently, there is limited knowledge available about how RECs respond to and appraise disaster research. To address this knowledge gap, we investigated the experiences of REC members who had reviewed disaster research conducted in low- or middle-income countries.We used interpretive description methodology and conducted in-depth interviews with 15 respondents. Respondents were chairs, members, advisors, or coordinators from 13 RECs, including RECs affiliated with universities, governments, international organizations, a for-profit REC, and an ad hoc committee established during a disaster. Interviews were analyzed inductively using constant comparative techniques.Through this process, three elements were identified as characterizing effective and high-quality review: timeliness, responsiveness and rigorousness. To ensure timeliness, many RECs rely on adaptations of review procedures for urgent protocols. Respondents emphasized that responsive review requires awareness of and sensitivity to the particularities of disaster settings and disaster research. Rigorous review was linked with providing careful assessment of ethical considerations related to the research, as well as ensuring independence of the review process.Both the frequency of disasters and the conduct of disaster research are on the rise. Ensuring effective and high quality review of disaster research is crucial, yet challenges, including time pressures for urgent protocols, exist for achieving this goal. Adapting standard REC procedures may be necessary. However, steps should be

  7. [A new formula for the measurement of rigor mortis: the determination of the FRR-index (author's transl)].

    Science.gov (United States)

    Forster, B; Ropohl, D; Raule, P

    1977-07-05

    The manual examination of rigor mortis as currently used, with its often subjective evaluation, frequently produces highly incorrect deductions. It is therefore desirable that such inaccuracies be replaced by objective measurement of rigor mortis at the extremities. To that purpose, a method is described that can also be applied in on-the-spot investigations, and a new formula for the determination of rigor mortis indices (FRR) is introduced.

  8. Measurements of the degree of development of rigor mortis as an indicator of stress in slaughtered pigs.

    Science.gov (United States)

    Warriss, P D; Brown, S N; Knowles, T G

    2003-12-13

    The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.

  9. Local Content

    CSIR Research Space (South Africa)

    Gibberd, Jeremy

    2016-10-01

    Full Text Available Local content refers to materials and products made in a country as opposed those that are imported. There is an increasing interest in the concept of local content as a means of supporting local economies and providing jobs (Belderbos & Sleuwaegen...

  10. Use of the Rigor Mortis Process as a Tool for Better Understanding of Skeletal Muscle Physiology: Effect of the Ante-Mortem Stress on the Progression of Rigor Mortis in Brook Charr (Salvelinus fontinalis).

    Science.gov (United States)

    Diouf, Boucar; Rioux, Pierre

    1999-01-01

    Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…

  11. A rigorous pole representation of multilevel cross sections and its practical applications

    International Nuclear Information System (INIS)

    Hwang, R.N.

    1987-01-01

    In this article a rigorous method for representing the multilevel cross sections and its practical applications are described. It is a generalization of the rationale suggested by de Saussure and Perez for the s-wave resonances. A computer code WHOPPER has been developed to convert the Reich-Moore parameters into the pole and residue parameters in momentum space. Sample calculations have been carried out to illustrate that the proposed method preserves the rigor of the Reich-Moore cross sections exactly. An analytical method has been developed to evaluate the pertinent Doppler-broadened line shape functions. A discussion is presented on how to minimize the number of pole parameters so that the existing reactor codes can be best utilized

  12. Rigorous simulations of a helical core fiber by the use of transformation optics formalism.

    Science.gov (United States)

    Napiorkowski, Maciej; Urbanczyk, Waclaw

    2014-09-22

    We report for the first time on rigorous numerical simulations of a helical-core fiber by using a full vectorial method based on the transformation optics formalism. We modeled the dependence of circular birefringence of the fundamental mode on the helix pitch and analyzed the effect of a birefringence increase caused by the mode displacement induced by a core twist. Furthermore, we analyzed the complex field evolution versus the helix pitch in the first order modes, including polarization and intensity distribution. Finally, we show that the use of the rigorous vectorial method allows the confinement loss of the guided modes to be predicted more accurately than with approximate methods based on equivalent in-plane bending models.

  13. Estimation of the time since death--reconsidering the re-establishment of rigor mortis.

    Science.gov (United States)

    Anders, Sven; Kunz, Michaela; Gehl, Axel; Sehner, Susanne; Raupach, Tobias; Beck-Bornholdt, Hans-Peter

    2013-01-01

    In forensic medicine, there is an undefined data background for the phenomenon of re-establishment of rigor mortis after mechanical loosening, a method used in establishing time since death in forensic casework that is thought to occur up to 8 h post-mortem. Nevertheless, the method is widely described in textbooks on forensic medicine. We examined 314 joints (elbow and knee) of 79 deceased at defined time points up to 21 h post-mortem (hpm). Data were analysed using a random intercept model. Here, we show that re-establishment occurred in 38.5% of joints at 7.5 to 19 hpm. Therefore, the maximum time span for the re-establishment of rigor mortis appears to be 2.5-fold longer than thought so far. These findings have major impact on the estimation of time since death in forensic casework.

  14. Quality of nuchal translucency measurements correlates with broader aspects of program rigor and culture of excellence.

    Science.gov (United States)

    Evans, Mark I; Krantz, David A; Hallahan, Terrence; Sherwin, John; Britt, David W

    2013-01-01

    To determine if nuchal translucency (NT) quality correlates with the extent to which clinics vary in rigor and quality control. We correlated NT performance quality (bias and precision) of 246,000 patients with two alternative measures of clinic culture - % of cases for whom nasal bone (NB) measurements were performed and % of requisitions correctly filled for race-ethnicity and weight. When requisition errors occurred in 5% (33%), the curve lowered to 0.93 MoM (p 90%, MoM was 0.99 compared to those quality exists independent of individual variation in NT quality, and two divergent indices of program rigor are associated with NT quality. Quality control must be program wide, and to effect continued improvement in the quality of NT results across time, the cultures of clinics must become a target for intervention. Copyright © 2013 S. Karger AG, Basel.

  15. A Framework for Rigorously Identifying Research Gaps in Qualitative Literature Reviews

    DEFF Research Database (Denmark)

    Müller-Bloch, Christoph; Kranz, Johann

    2015-01-01

    Identifying research gaps is a fundamental goal of literature reviewing. While it is widely acknowledged that literature reviews should identify research gaps, there are no methodological guidelines for how to identify research gaps in qualitative literature reviews ensuring rigor and replicability. Our study addresses this gap and proposes a framework that should help scholars in this endeavor without stifling creativity. To develop the framework we thoroughly analyze the state-of-the-art procedure of identifying research gaps in 40 recent literature reviews using a grounded theory approach. Based on the data, we subsequently derive a framework for identifying research gaps in qualitative literature reviews and demonstrate its application with an example. Our results provide a modus operandi for identifying research gaps, thus enabling scholars to conduct literature reviews more rigorously...

  16. Derivation of basic equations for rigorous dynamic simulation of cryogenic distillation column for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Kinoshita, Masahiro; Naruse, Yuji

    1981-08-01

    The basic equations are derived for rigorous dynamic simulation of cryogenic distillation columns for hydrogen isotope separation. The model accounts for such factors as differences in latent heat of vaporization among the six isotopic species of molecular hydrogen, decay heat of tritium, heat transfer through the column wall and nonideality of the solutions. Provision is also made for simulation of columns with multiple feeds and multiple sidestreams. (author)

  17. Rigor mortis development in turkey breast muscle and the effect of electrical stunning.

    Science.gov (United States)

    Alvarado, C Z; Sams, A R

    2000-11-01

    Rigor mortis development in turkey breast muscle and the effect of electrical stunning on this process are not well characterized. Some electrical stunning procedures have been known to inhibit postmortem (PM) biochemical reactions, thereby delaying the onset of rigor mortis in broilers. Therefore, this study was designed to characterize rigor mortis development in stunned and unstunned turkeys. A total of 154 turkey toms in two trials were conventionally processed at 20 to 22 wk of age. Turkeys were either stunned with a pulsed direct current (500 Hz, 50% duty cycle) at 35 mA (40 V) in a saline bath for 12 seconds or left unstunned as controls. At 15 min and 1, 2, 4, 8, 12, and 24 h PM, pectoralis samples were collected to determine pH, R-value, L* value, sarcomere length, and shear value. In Trial 1, the samples obtained for pH, R-value, and sarcomere length were divided into surface and interior samples. There were no significant differences between the surface and interior samples among any parameters measured. Muscle pH significantly decreased over time in stunned and unstunned birds through 2 h PM. The R-values increased to 8 h PM in unstunned birds and 24 h PM in stunned birds. The L* values increased over time, with no significant differences after 1 h PM for the controls and 2 h PM for the stunned birds. Sarcomere length increased through 2 h PM in the controls and 12 h PM in the stunned fillets. Cooked meat shear values decreased through the 1 h PM deboning time in the control fillets and 2 h PM in the stunned fillets. These results suggest that stunning delayed the development of rigor mortis through 2 h PM, but had no significant effect on the measured parameters at later time points, and that deboning turkey breasts at 2 h PM or later will not significantly impair meat tenderness.

  18. Learning from Science and Sport - How we, Safety, "Engage with Rigor"

    Science.gov (United States)

    Herd, A.

    2012-01-01

    As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should remind ourselves to be open to the possibility that data, knowledge or experience from outside the spaceflight community may provide constructive alternative perspectives. This paper assesses aspects of two seemingly tangential fields, science and sport, and aligns them with the world of safety. In doing so it offers useful insights into the challenges we face and may suggest solutions relevant to our everyday work in safety engineering. Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two opposing teams: professional, accurately timed and positioned interaction for a desired outcome. These interactions, whilst an essential part of the game, are not without their constraints. The rugby scrum has constraints as to the formation and engagement of the two teams; the controlled engagement provides for an interaction between the two teams in a safe manner. The constraints arise from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in papers presented for publication are valid, legitimate and credible. The need for rigor may be illustrated by the example of achieving a statistically relevant sample size, n, in order to assure the validity of analyses of the data pool. A failure to apply rigor could place the entire study at risk of failing to have the respective paper published. This paper considers the merits of these two different aspects, scientific rigor and sports engagement, and offers a reflective look at how they may provide a "modus operandi" for safety engineers at any level, whether at their desks (creating or reviewing safety assessments) or in a

  19. Rigor force responses of permeabilized fibres from fast and slow skeletal muscles of aged rats.

    Science.gov (United States)

    Plant, D R; Lynch, G S

    2001-09-01

    1. Ageing is generally associated with a decline in skeletal muscle mass and strength and a slowing of muscle contraction, factors that impact upon the quality of life for the elderly. The mechanisms underlying this age-related muscle weakness have not been fully resolved. The purpose of the present study was to determine whether the decrease in muscle force as a consequence of age could be attributed partly to a decrease in the number of cross-bridges participating during contraction. 2. Given that the rigor force is proportional to the approximate total number of interacting sites between the actin and myosin filaments, we tested the null hypothesis that the rigor force of permeabilized muscle fibres from young and old rats would not be different. 3. Permeabilized fibres from the extensor digitorum longus (fast-twitch; EDL) and soleus (predominantly slow-twitch) muscles of young (6 months of age) and old (27 months of age) male F344 rats were activated in Ca2+-buffered solutions to determine force-pCa characteristics (where pCa = -log(10)[Ca2+]) and then in solutions lacking ATP and Ca2+ to determine rigor force levels. 4. The rigor forces for EDL and soleus muscle fibres were not different between young and old rats, indicating that the approximate total number of cross-bridges that can be formed between filaments did not decline with age. We conclude that the age-related decrease in force output is more likely attributed to a decrease in the force per cross-bridge and/or decreases in the efficiency of excitation-contraction coupling.

  20. Rigorous Integration of Non-Linear Ordinary Differential Equations in Chebyshev Basis

    Czech Academy of Sciences Publication Activity Database

    Dzetkulič, Tomáš

    2015-01-01

    Roč. 69, č. 1 (2015), s. 183-205 ISSN 1017-1398 R&D Projects: GA MŠk OC10048; GA ČR GD201/09/H057 Institutional research plan: CEZ:AV0Z10300504 Keywords : Initial value problem * Rigorous integration * Taylor model * Chebyshev basis Subject RIV: IN - Informatics, Computer Science Impact factor: 1.366, year: 2015

  1. A rigorous proof of the Landau-Peierls formula and much more

    DEFF Research Database (Denmark)

    Briet, Philippe; Cornean, Horia; Savoie, Baptiste

    2012-01-01

    We present a rigorous mathematical treatment of the zero-field orbital magnetic susceptibility of a non-interacting Bloch electron gas, at fixed temperature and density, for both metals and semiconductors/insulators. In particular, we obtain the Landau-Peierls formula in the low temperature and density limit as conjectured by Kjeldaas and Kohn (Phys Rev 105:806–813, 1957).

  2. Association Between Maximal Skin Dose and Breast Brachytherapy Outcome: A Proposal for More Rigorous Dosimetric Constraints

    International Nuclear Information System (INIS)

    Cuttino, Laurie W.; Heffernan, Jill; Vera, Robyn; Rosu, Mihaela; Ramakrishnan, V. Ramesh; Arthur, Douglas W.

    2011-01-01

    Purpose: Multiple investigations have used the skin distance as a surrogate for the skin dose and have shown that distances 4.05 Gy/fraction. Conclusion: The initial skin dose recommendations have been based on safe use and the avoidance of significant toxicity. The results from the present study have suggested that patients might further benefit if more rigorous constraints were applied and if the skin dose were limited to 120% of the prescription dose.

  3. Re-establishment of rigor mortis: evidence for a considerably longer post-mortem time span.

    Science.gov (United States)

    Crostack, Chiara; Sehner, Susanne; Raupach, Tobias; Anders, Sven

    2017-07-01

    Re-establishment of rigor mortis following mechanical loosening is used as part of the complex method for the forensic estimation of the time since death in human bodies and has formerly been reported to occur up to 8-12 h post-mortem (hpm). We recently described our observation of the phenomenon in up to 19 hpm in cases with in-hospital death. Due to the case selection (preceding illness, immobilisation), transfer of these results to forensic cases might be limited. We therefore examined 67 out-of-hospital cases of sudden death with known time points of death. Re-establishment of rigor mortis was positive in 52.2% of cases and was observed up to 20 hpm. In contrast to the current doctrine that a recurrence of rigor mortis is always of a lesser degree than its first manifestation in a given patient, muscular rigidity at re-establishment equalled or even exceeded the degree observed before dissolving in 21 joints. Furthermore, this is the first study to describe that the phenomenon appears to be independent of body or ambient temperature.

  4. The MIXED framework: A novel approach to evaluating mixed-methods rigor.

    Science.gov (United States)

    Eckhardt, Ann L; DeVon, Holli A

    2017-10-01

    Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.

  5. Application of the rigorous method to x-ray and neutron beam scattering on rough surfaces

    International Nuclear Information System (INIS)

    Goray, Leonid I.

    2010-01-01

    The paper presents a comprehensive numerical analysis of x-ray and neutron scattering from finite-conducting rough surfaces which is performed in the frame of the boundary integral equation method in a rigorous formulation for high ratios of characteristic dimension to wavelength. The single integral equation obtained involves boundary integrals of the single and double layer potentials. A more general treatment of the energy conservation law applicable to absorption gratings and rough mirrors is considered. In order to compute the scattering intensity of rough surfaces using the forward electromagnetic solver, Monte Carlo simulation is employed to average the deterministic diffraction grating efficiency due to individual surfaces over an ensemble of realizations. Some rules appropriate for numerical implementation of the theory at small wavelength-to-period ratios are presented. The difference between the rigorous approach and approximations can be clearly seen in specular reflectances of Au mirrors with different roughness parameters at wavelengths where grazing incidence occurs at close to or larger than the critical angle. This difference may give rise to wrong estimates of rms roughness and correlation length if they are obtained by comparing experimental data with calculations. Besides, the rigorous approach permits taking into account any known roughness statistics and allows exact computation of diffuse scattering.
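
    Monte Carlo averaging over an ensemble of realizations presupposes a way to draw random rough profiles with prescribed statistics. The sketch below generates such an ensemble for a 1-D surface with a given rms height and correlation length; the helper name, grid, parameter values and spectral scaling are our illustrative assumptions, and the deterministic efficiency solver itself is outside the scope of the sketch.

    ```python
    import numpy as np

    def rough_profile(n, dx, rms, corr_len, rng):
        """Draw one 1-D rough-surface height profile by low-pass filtering white noise
        in k-space and rescaling to the requested rms height (illustrative scaling)."""
        white = rng.normal(size=n)
        k = 2.0 * np.pi * np.fft.rfftfreq(n, d=dx)
        filt = np.exp(-(k * corr_len) ** 2 / 4.0)   # filter width sets the lateral correlation scale
        h = np.fft.irfft(np.fft.rfft(white) * filt, n=n)
        return h * (rms / np.std(h))

    rng = np.random.default_rng(42)
    ensemble = [rough_profile(n=4096, dx=1.0, rms=2.0, corr_len=30.0, rng=rng)
                for _ in range(50)]

    print("mean rms height over realizations:", np.mean([h.std() for h in ensemble]))
    # Each realization would then be passed to the deterministic efficiency solver and the
    # resulting intensities averaged to estimate the diffuse scattering.
    ```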

  6. A Draft Conceptual Framework of Relevant Theories to Inform Future Rigorous Research on Student Service-Learning Outcomes

    Science.gov (United States)

    Whitley, Meredith A.

    2014-01-01

    While the quality and quantity of research on service-learning has increased considerably over the past 20 years, researchers as well as governmental and funding agencies have called for more rigor in service-learning research. One key variable in improving rigor is using relevant existing theories to improve the research. The purpose of this…

  7. Feedback for relatedness and competence : Can feedback in blended learning contribute to optimal rigor, basic needs, and motivation?

    NARCIS (Netherlands)

    Bombaerts, G.; Nickel, P.J.

    2017-01-01

    We inquire how peer and tutor feedback influences students' optimal rigor, basic needs and motivation. We analyze questionnaires from two courses in two subsequent years. We conclude that feedback in blended learning can contribute to rigor and basic needs, but it is not clear from our data what

  8. A rigorous mechanistic model for predicting gas hydrate formation kinetics: The case of CO2 recovery and sequestration

    International Nuclear Information System (INIS)

    ZareNezhad, Bahman; Mottahedin, Mona

    2012-01-01

    Highlights: A mechanistic model for predicting gas hydrate formation kinetics is presented. A secondary nucleation rate model is proposed for the first time. Crystal–crystal collisions and crystal–impeller collisions are distinguished. Simultaneous determination of nucleation and growth kinetics is established. The model is important for the design of gas hydrate based energy storage and CO2 recovery systems. Abstract: A rigorous mechanistic model for predicting gas hydrate formation crystallization kinetics is presented, and the special case of CO2 gas hydrate formation in CO2 recovery and sequestration processes is investigated using the proposed model. A physical model for prediction of the secondary nucleation rate is proposed for the first time, and the formation rates of secondary nuclei by crystal–crystal collisions and crystal–impeller collisions are formulated. The objective functions for simultaneous determination of nucleation and growth kinetics are given, and a theoretical framework for predicting the dynamic behavior of gas hydrate formation is presented. Predicted time variations of CO2 content and of the total number and surface area of produced hydrate crystals are in good agreement with the available experimental data. The proposed approach has considerable application in the design of gas hydrate converters for energy storage and CO2 recovery processes.

  9. Optimal correction and design parameter search by modern methods of rigorous global optimization

    International Nuclear Information System (INIS)

    Makino, K.; Berz, M.

    2011-01-01

    Frequently the design of schemes for correction of aberrations or the determination of possible operating ranges for beamlines and cells in synchrotrons exhibit multitudes of possibilities for their correction, usually appearing in disconnected regions of parameter space which cannot be directly qualified by analytical means. In such cases, frequently an abundance of optimization runs are carried out, each of which determines a local minimum depending on the specific chosen initial conditions. Practical solutions are then obtained through an often extended interplay of experienced manual adjustment of certain suitable parameters and local searches by varying other parameters. However, in a formal sense this problem can be viewed as a global optimization problem, i.e. the determination of all solutions within a certain range of parameters that lead to a specific optimum. For example, it may be of interest to find all possible settings of multiple quadrupoles that can achieve imaging; or to find ahead of time all possible settings that achieve a particular tune; or to find all possible manners to adjust nonlinear parameters to achieve correction of high order aberrations. These tasks can easily be phrased in terms of such an optimization problem; but while mathematically this formulation is often straightforward, it has been common belief that it is of limited practical value since the resulting optimization problem cannot usually be solved. However, recent significant advances in modern methods of rigorous global optimization make these methods feasible for optics design for the first time. The key ideas of the method lie in an interplay of rigorous local underestimators of the objective functions, and by using the underestimators to rigorously iteratively eliminate regions that lie above already known upper bounds of the minima, in what is commonly known as a branch-and-bound approach. Recent enhancements of the Differential Algebraic methods used in particle
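
    The branch-and-bound idea sketched above can be illustrated with a minimal interval-arithmetic toy in Python. This is not the Differential Algebraic implementation referred to in the abstract: the objective function is arbitrary, the interval extension is the naive one, and outward rounding (needed for genuine floating-point rigor) is omitted for brevity.

        def iadd(a, b): return (a[0] + b[0], a[1] + b[1])
        def isub(a, b): return (a[0] - b[1], a[1] - b[0])
        def imul(a, b):
            p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
            return (min(p), max(p))

        def f_point(t):                      # arbitrary objective: f(t) = t^4 - 3t^2 + t
            return t**4 - 3 * t**2 + t

        def f_interval(x):                   # natural interval extension (a rigorous enclosure)
            x2 = imul(x, x)
            return iadd(isub(imul(x2, x2), imul((3.0, 3.0), x2)), x)

        def branch_and_bound(lo=-2.0, hi=2.0, tol=1e-6):
            best_upper = min(f_point(lo), f_point(hi), f_point(0.5 * (lo + hi)))
            boxes = [(lo, hi)]
            while boxes:
                a, b = boxes.pop()
                lower_bound, _ = f_interval((a, b))
                if lower_bound > best_upper:  # box rigorously cannot contain the minimum
                    continue
                mid = 0.5 * (a + b)
                best_upper = min(best_upper, f_point(mid))
                if b - a > tol:
                    boxes.extend([(a, mid), (mid, b)])
            return best_upper                 # upper bound on the global minimum

        print(branch_and_bound())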

  10. Testing the Standard Model

    CERN Document Server

    Riles, K

    1998-01-01

    The Large Electron-Positron (LEP) collider near Geneva, more than any other instrument, has rigorously tested the predictions of the Standard Model of elementary particles. LEP measurements have probed the theory from many different directions and, so far, the Standard Model has prevailed. The rigour of these tests has allowed LEP physicists to determine unequivocally the number of fundamental 'generations' of elementary particles. These tests also allowed physicists to ascertain the mass of the top quark in advance of its discovery. Recent increases in the accelerator's energy allow new measurements to be undertaken, measurements that may uncover directly or indirectly the long-sought Higgs particle, believed to impart mass to all other particles.

  11. [Preliminarily application of content analysis to qualitative nursing data].

    Science.gov (United States)

    Liang, Shu-Yuan; Chuang, Yeu-Hui; Wu, Shu-Fang

    2012-10-01

    Content analysis is a methodology for objectively and systematically studying the content of communication in various formats. Content analysis in nursing research and nursing education is called qualitative content analysis. Qualitative content analysis is frequently applied to nursing research, as it allows researchers to determine categories inductively and deductively. This article examines qualitative content analysis in nursing research from theoretical and practical perspectives. We first describe how content analysis concepts such as unit of analysis, meaning unit, code, category, and theme are used. Next, we describe the basic steps involved in using content analysis, including data preparation, data familiarization, analysis unit identification, creating tentative coding categories, category refinement, and establishing category integrity. Finally, this paper introduces the concept of content analysis rigor, including dependability, confirmability, credibility, and transferability. This article elucidates the content analysis method in order to help professionals conduct systematic research that generates data that are informative and useful in practical application.

  12. Unmet Need: Improving mHealth Evaluation Rigor to Build the Evidence Base.

    Science.gov (United States)

    Mookherji, Sangeeta; Mehl, Garrett; Kaonga, Nadi; Mechael, Patricia

    2015-01-01

    mHealth-the use of mobile technologies for health-is a growing element of health system activity globally, but evaluation of those activities remains quite scant and is an important knowledge gap for advancing mHealth activities. In 2010, the World Health Organization and Columbia University implemented a small-scale survey to generate preliminary data on evaluation activities used by mHealth initiatives. The authors describe self-reported data from 69 projects in 29 countries. The majority (74%) reported some sort of evaluation activity, primarily nonexperimental in design (62%). The authors developed a 6-point scale of evaluation rigor comprising information on use of comparison groups, sample size calculation, data collection timing, and randomization. The mean score was low (2.4); nearly half (47%) were conducting evaluations meeting a minimum threshold (4+) of rigor, indicating use of a comparison group, while less than 20% had randomized the mHealth intervention. The authors were unable to assess whether the rigor score was appropriate for the type of mHealth activity being evaluated. What was clear was that although most data came from mHealth pilot projects aimed at scale-up, few had designed evaluations that would support crucial decisions on whether to scale up and how. Whether the mHealth activity is a strategy to improve health or a tool for achieving intermediate outcomes that should lead to better health, mHealth evaluations must be improved to generate robust evidence for cost-effectiveness assessment and to allow for accurate identification of the contribution of mHealth initiatives to health systems strengthening and the impact on actual health outcomes.
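
    The 6-point rigor scale is described above only in outline (comparison group, sample size calculation, data collection timing, randomization), so the sketch below merely illustrates how such a composite score and the reported summary statistics could be computed; the item weights and the example projects are assumptions, not the survey data.

        from statistics import mean

        # Hypothetical rubric: points per reported design element (weights are illustrative).
        RUBRIC = {
            "comparison_group": 2,
            "sample_size_calculation": 1,
            "pre_post_data_collection": 2,
            "randomization": 1,
        }

        def rigor_score(project):
            """Sum the points for each design element the project reports using."""
            return sum(pts for item, pts in RUBRIC.items() if project.get(item))

        projects = [
            {"comparison_group": True, "pre_post_data_collection": True},
            {"pre_post_data_collection": True},
            {"comparison_group": True, "sample_size_calculation": True,
             "pre_post_data_collection": True, "randomization": True},
        ]

        scores = [rigor_score(p) for p in projects]
        print("mean rigor score:", round(mean(scores), 1))
        print("share meeting threshold (4+):", sum(s >= 4 for s in scores) / len(scores))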

  13. Effect of muscle restraint on sheep meat tenderness with rigor mortis at 18°C.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Wells, Robyn W

    2002-02-01

    The effect on shear force of skeletal restraint and removing muscles from lamb m. longissimus thoracis et lumborum (LT) immediately after slaughter and electrical stimulation was undertaken at a rigor temperature of 18°C (n=15). The temperature of 18°C was achieved through chilling of electrically stimulated sheep carcasses in air at 12°C, air flow 1-1.5 ms(-2). In other groups, the muscle was removed at 2.5 h post-mortem and either wrapped or left non-wrapped before being placed back on the carcass to follow carcass cooling regimes. Following rigor mortis, the meat was aged for 0, 16, 40 and 65 h at 15°C and frozen. For the non-stimulated samples, the meat was aged for 0, 12, 36 and 60 h before being frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values obtained from a 1 × 1 cm cross-section. Commencement of ageing was considered to take place at rigor mortis and this was taken as zero aged meat. There were no significant differences in the rate of tenderisation and initial shear force for all treatments. The 23% cook loss was similar for all wrapped and non-wrapped situations and the values decreased slightly with longer ageing durations. Wrapping was shown to mimic meat left intact on the carcass, as it prevented significant prerigor shortening. Such techniques allows muscles to be removed and placed in a controlled temperature environment to enable precise studies of ageing processes.

  14. Study Design Rigor in Animal-Experimental Research Published in Anesthesia Journals.

    Science.gov (United States)

    Hoerauf, Janine M; Moss, Angela F; Fernandez-Bustamante, Ana; Bartels, Karsten

    2018-01-01

    Lack of reproducibility of preclinical studies has been identified as an impediment for translation of basic mechanistic research into effective clinical therapies. Indeed, the National Institutes of Health has revised its grant application process to require more rigorous study design, including sample size calculations, blinding procedures, and randomization steps. We hypothesized that the reporting of such metrics of study design rigor has increased over time for animal-experimental research published in anesthesia journals. PubMed was searched for animal-experimental studies published in 2005, 2010, and 2015 in primarily English-language anesthesia journals. A total of 1466 publications were graded on the performance of sample size estimation, randomization, and blinding. Cochran-Armitage test was used to assess linear trends over time for the primary outcome of whether or not a metric was reported. Interrater agreement for each of the 3 metrics (power, randomization, and blinding) was assessed using the weighted κ coefficient in a 10% random sample of articles rerated by a second investigator blinded to the ratings of the first investigator. A total of 1466 manuscripts were analyzed. Reporting for all 3 metrics of experimental design rigor increased over time (2005 to 2010 to 2015): for power analysis, from 5% (27/516), to 12% (59/485), to 17% (77/465); for randomization, from 41% (213/516), to 50% (243/485), to 54% (253/465); and for blinding, from 26% (135/516), to 38% (186/485), to 47% (217/465). The weighted κ coefficients and 98.3% confidence interval indicate almost perfect agreement between the 2 raters beyond that which occurs by chance alone (power, 0.93 [0.85, 1.0], randomization, 0.91 [0.85, 0.98], and blinding, 0.90 [0.84, 0.96]). Our hypothesis that reported metrics of rigor in animal-experimental studies in anesthesia journals have increased during the past decade was confirmed. More consistent reporting, or explicit justification for absence
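
    The two statistics named above can be reproduced in outline with standard tools; the sketch below runs a Cochran-Armitage-style trend test on the reported power-analysis counts (27/516, 59/485, 77/465) and a linearly weighted Cohen's kappa on made-up rater labels. The kappa data are placeholders, and the hand-rolled trend test is a generic textbook form, not the authors' exact analysis code.

        import numpy as np
        from scipy.stats import norm
        from sklearn.metrics import cohen_kappa_score

        def cochran_armitage(successes, totals, scores=None):
            """Two-sided test for a linear trend in proportions across ordered groups."""
            r = np.asarray(successes, dtype=float)
            n = np.asarray(totals, dtype=float)
            t = np.arange(len(n), dtype=float) if scores is None else np.asarray(scores, dtype=float)
            p_bar = r.sum() / n.sum()
            u = np.sum(t * (r - n * p_bar))
            var = p_bar * (1 - p_bar) * (np.sum(n * t**2) - np.sum(n * t) ** 2 / n.sum())
            z = u / np.sqrt(var)
            return z, 2 * norm.sf(abs(z))

        z, p = cochran_armitage([27, 59, 77], [516, 485, 465])   # power analysis by year
        print(f"trend z = {z:.2f}, p = {p:.2g}")

        rater1 = [2, 1, 2, 0, 1, 2, 0, 1, 1, 2]                  # toy 10% re-rated sample
        rater2 = [2, 1, 2, 0, 0, 2, 0, 1, 1, 2]
        print("weighted kappa:", cohen_kappa_score(rater1, rater2, weights="linear"))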

  15. Rigorous spin-spin correlation function of Ising model on a special kind of Sierpinski Carpets

    International Nuclear Information System (INIS)

    Yang, Z.R.

    1993-10-01

    We have exactly calculated the rigorous spin-spin correlation function of the Ising model on a special kind of Sierpinski Carpets (SC's) by means of graph expansion and a combinatorial approach, and investigated the asymptotic behaviour in the limit of long distance. The results show that there is no long-range correlation between spins at any finite temperature, which indicates the absence of a phase transition and thus finally confirms the conclusion obtained by the renormalization group method and other physical arguments. (author). 7 refs, 6 figs

  16. An efficient and rigorous thermodynamic library and optimal-control of a cryogenic air separation unit

    DEFF Research Database (Denmark)

    Gaspar, Jozsef; Ritschel, Tobias Kasper Skovborg; Jørgensen, John Bagterp

    2017-01-01

    -linear model based control to achieve optimal techno-economic performance. Accordingly, this work presents a computationally efficient and novel approach for solving a tray-by-tray equilibrium model and its implementation for open-loop optimal-control of a cryogenic distillation column. Here, the optimisation...... objective is to reduce the cost of compression in a volatile electricity market while meeting the production requirements, i.e. product flow rate and purity. This model is implemented in Matlab and uses the ThermoLib rigorous thermodynamic library. The present work represents a first step towards plant...

  17. A study into first-year engineering education success using a rigorous mixed methods approach

    DEFF Research Database (Denmark)

    van den Bogaard, M.E.D.; de Graaff, Erik; Verbraek, Alexander

    2015-01-01

    The aim of this paper is to combine qualitative and quantitative research methods into rigorous research into student success. Research methods have weaknesses that can be overcome by clever combinations. In this paper we use a situated study into student success as an example of how methods...... using statistical techniques. The main elements of the model were student behaviour and student disposition, which were influenced by the students’ perceptions of the education environment. The outcomes of the qualitative studies were useful in interpreting the outcomes of the structural equation...

  18. Supersymmetry and the Parisi-Sourlas dimensional reduction: A rigorous proof

    International Nuclear Information System (INIS)

    Klein, A.; Landau, L.J.; Perez, J.F.

    1984-01-01

    Functional integrals that are formally related to the average correlation functions of a classical field theory in the presence of random external sources are given a rigorous meaning. Their dimensional reduction to the Schwinger functions of the corresponding quantum field theory in two fewer dimensions is proven. This is done by reexpressing those functional integrals as expectations of a supersymmetric field theory. The Parisi-Sourlas dimensional reduction of a supersymmetric field theory to a usual quantum field theory in two fewer dimensions is proven. (orig.)

  19. A Rigorous Treatment of Energy Extraction from a Rotating Black Hole

    Science.gov (United States)

    Finster, F.; Kamran, N.; Smoller, J.; Yau, S.-T.

    2009-05-01

    The Cauchy problem is considered for the scalar wave equation in the Kerr geometry. We prove that by choosing a suitable wave packet as initial data, one can extract energy from the black hole, thereby putting superradiance, the wave analogue of the Penrose process, into a rigorous mathematical framework. We quantify the maximal energy gain. We also compute the infinitesimal change of mass and angular momentum of the black hole, in agreement with Christodoulou's result for the Penrose process. The main mathematical tool is our previously derived integral representation of the wave propagator.

  20. Pre-rigor temperature and the relationship between lamb tenderisation, free water production, bound water and dry matter.

    Science.gov (United States)

    Devine, Carrick; Wells, Robyn; Lowe, Tim; Waller, John

    2014-01-01

    The M. longissimus muscles from lambs electrically stimulated at 15 min post-mortem were removed after grading, wrapped in polythene film and held at 4 (n=6), 7 (n=6), 15 (n=6, n=8) and 35°C (n=6) until rigor mortis, then aged at 15°C for 0, 4, 24 and 72 h post-rigor. Centrifuged free water increased exponentially, and bound water, dry matter and shear force decreased exponentially over time. Decreases in shear force and increases in free water were closely related (r² = 0.52) and were unaffected by pre-rigor temperatures. © 2013.
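
    The exponential time courses reported above are the kind of relationship that can be fitted with a three-parameter decay model; the sketch below shows such a fit on synthetic shear-force values (the data points are placeholders, not the study's measurements).

        import numpy as np
        from scipy.optimize import curve_fit

        def exp_decay(t, plateau, amplitude, rate):
            """y(t) = plateau + amplitude * exp(-rate * t); a rising analogue
            (plateau - amplitude * exp(-rate * t)) would describe free water."""
            return plateau + amplitude * np.exp(-rate * t)

        ageing_h = np.array([0.0, 4.0, 24.0, 72.0])
        shear_n = np.array([95.0, 80.0, 55.0, 42.0])     # illustrative values only

        (plateau, amplitude, rate), _ = curve_fit(
            exp_decay, ageing_h, shear_n, p0=(40.0, 55.0, 0.05))
        print(f"plateau {plateau:.1f} N, amplitude {amplitude:.1f} N, rate {rate:.3f} per h")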

  1. State Skill Standards: Photography

    Science.gov (United States)

    Howell, Frederick; Reed, Loretta; Jensen, Capra; Robison, Gary; Taylor, Susan; Pavesich, Christine

    2007-01-01

    The Department of Education has undertaken an ambitious effort to develop statewide skill standards for all content areas in career and technical education. The standards in this document are for photography programs and are designed to clearly state what the student should know and be able to do upon completion of an advanced high-school program.…

  2. Parent Management Training-Oregon Model: Adapting Intervention with Rigorous Research.

    Science.gov (United States)

    Forgatch, Marion S; Kjøbli, John

    2016-09-01

    Parent Management Training-Oregon Model (PMTO(®) ) is a set of theory-based parenting programs with status as evidence-based treatments. PMTO has been rigorously tested in efficacy and effectiveness trials in different contexts, cultures, and formats. Parents, the presumed agents of change, learn core parenting practices, specifically skill encouragement, limit setting, monitoring/supervision, interpersonal problem solving, and positive involvement. The intervention effectively prevents and ameliorates children's behavior problems by replacing coercive interactions with positive parenting practices. Delivery format includes sessions with individual families in agencies or families' homes, parent groups, and web-based and telehealth communication. Mediational models have tested parenting practices as mechanisms of change for children's behavior and found support for the theory underlying PMTO programs. Moderating effects include children's age, maternal depression, and social disadvantage. The Norwegian PMTO implementation is presented as an example of how PMTO has been tailored to reach diverse populations as delivered by multiple systems of care throughout the nation. An implementation and research center in Oslo provides infrastructure and promotes collaboration between practitioners and researchers to conduct rigorous intervention research. Although evidence-based and tested within a wide array of contexts and populations, PMTO must continue to adapt to an ever-changing world. © 2016 Family Process Institute.

  3. Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound

    Science.gov (United States)

    Shiraishi, Naoto; Tajima, Hiroyasu

    2017-08-01

    The long-standing open problem of whether a heat engine with finite power can achieve the Carnot efficiency is investigated. We rigorously prove a general trade-off inequality between thermodynamic efficiency and the time interval of a cyclic process for quantum heat engines. As a first step, employing the Lieb-Robinson bound we establish an inequality on the change in a local observable caused by an operation far from the support of that observable. This inequality provides a rigorous characterization of the intuitive picture that most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this description, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving with non-Markovian dynamics, which almost all previous studies on this topic fail to address.

  4. Rigorous RG Algorithms and Area Laws for Low Energy Eigenstates in 1D

    Science.gov (United States)

    Arad, Itai; Landau, Zeph; Vazirani, Umesh; Vidick, Thomas

    2017-11-01

    One of the central challenges in the study of quantum many-body systems is the complexity of simulating them on a classical computer. A recent advance (Landau et al. in Nat Phys, 2015) gave a polynomial time algorithm to compute a succinct classical description for unique ground states of gapped 1D quantum systems. Despite this progress, many questions remained unsolved, including whether there exist efficient algorithms when the ground space is degenerate (and of polynomial dimension in the system size), or for the polynomially many lowest energy states, or even whether such states admit succinct classical descriptions or area laws. In this paper we give a new algorithm, based on a rigorously justified RG type transformation, for finding low energy states for 1D Hamiltonians acting on a chain of n particles. In the process we resolve some of the aforementioned open questions, including giving a polynomial time algorithm for poly(n) degenerate ground spaces and an n^{O(log n)} algorithm for the poly(n) lowest energy states (under a mild density condition). For these classes of systems the existence of a succinct classical description and area laws were not rigorously proved before this work. The algorithms are natural and efficient, and for the case of finding unique ground states for frustration-free Hamiltonians the running time is Õ(n M(n)), where M(n) is the time required to multiply two n × n matrices.

  5. Upgrading geometry conceptual understanding and strategic competence through implementing rigorous mathematical thinking (RMT)

    Science.gov (United States)

    Nugraheni, Z.; Budiyono, B.; Slamet, I.

    2018-03-01

    To reach higher-order thinking skills, conceptual understanding and strategic competence must first be mastered, as they are two basic components of higher-order thinking skills (HOTS). RMT is a unique realization of the cognitive conceptual construction approach based on Feuerstein's theory of Mediated Learning Experience (MLE) and Vygotsky's sociocultural theory. This was quasi-experimental research comparing an experimental class taught with Rigorous Mathematical Thinking (RMT) as the learning method and a control class taught with Direct Learning (DL) as the conventional learning activity. The study examined whether the two learning models had different effects on the conceptual understanding and strategic competence of junior high school students. The data were analyzed using Multivariate Analysis of Variance (MANOVA), which showed a significant difference between the experimental and control classes when mathematics conceptual understanding and strategic competence were considered jointly (Wilks' Λ = 0.84). Further, independent t-tests showed significant differences between the two classes on both mathematical conceptual understanding and strategic competence. These results indicate that Rigorous Mathematical Thinking (RMT) had a positive impact on mathematics conceptual understanding and strategic competence.
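
    A hedged sketch of the analysis pipeline named above (a joint MANOVA with Wilks' lambda followed by per-outcome independent t-tests) is given below, using randomly generated scores in place of the study's data; the group sizes and score distributions are assumptions.

        import numpy as np
        import pandas as pd
        from scipy.stats import ttest_ind
        from statsmodels.multivariate.manova import MANOVA

        rng = np.random.default_rng(1)
        n = 40                                        # per-class sample size (assumed)
        df = pd.DataFrame({
            "group": ["RMT"] * n + ["DL"] * n,
            "conceptual": np.r_[rng.normal(75, 8, n), rng.normal(68, 8, n)],
            "strategic": np.r_[rng.normal(70, 9, n), rng.normal(64, 9, n)],
        })

        # Joint test on both outcomes; Wilks' lambda appears in the output table.
        print(MANOVA.from_formula("conceptual + strategic ~ group", data=df).mv_test())

        # Follow-up univariate comparisons for each outcome separately.
        for outcome in ("conceptual", "strategic"):
            rmt = df.loc[df.group == "RMT", outcome]
            dl = df.loc[df.group == "DL", outcome]
            print(outcome, ttest_ind(rmt, dl))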

  6. "Snow White" Coating Protects SpaceX Dragon's Trunk Against Rigors of Space

    Science.gov (United States)

    McMahan, Tracy

    2013-01-01

    He described it as "snow white." But NASA astronaut Don Pettit was not referring to the popular children's fairy tale. Rather, he was talking about the white coating of the Space Exploration Technologies Corp. (SpaceX) Dragon spacecraft that reflected from the International Space Station's light. As it approached the station for the first time in May 2012, the Dragon's trunk might have been described as the "fairest of them all," for its pristine coating, allowing Pettit to clearly see to maneuver the robotic arm to grab the Dragon for a successful nighttime berthing. This protective thermal control coating, developed by Alion Science and Technology Corp., based in McLean, Va., made its bright appearance again with the March 1 launch of SpaceX's second commercial resupply mission. Named Z-93C55, the coating was applied to the cargo portion of the Dragon to protect it from the rigors of space. "For decades, Alion has produced coatings to protect against the rigors of space," said Michael Kenny, senior chemist with Alion. "As space missions evolved, there was a growing need to dissipate electrical charges that build up on the exteriors of spacecraft, or there could be damage to the spacecraft's electronics. Alion's research led us to develop materials that would meet this goal while also providing thermal controls. The outcome of this research was Alion's proprietary Z-93C55 coating."

  7. Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II

    Energy Technology Data Exchange (ETDEWEB)

    George J. Koperna Jr.; Vello A. Kuuskraa; David E. Riestenberg; Aiysha Sultana; Tyler Van Leeuwen

    2009-06-01

    This report serves as the final technical report and users' manual for the 'Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II' SBIR project. Advanced Resources International has developed a screening tool by which users can technically screen, assess the storage capacity and quantify the costs of CO2 storage in four types of CO2 storage reservoirs. These include CO2-enhanced oil recovery reservoirs, depleted oil and gas fields (non-enhanced oil recovery candidates), deep coal seams that are amenable to CO2-enhanced methane recovery, and saline reservoirs. The screening function assessed whether the reservoir could likely serve as a safe, long-term CO2 storage reservoir. The storage capacity assessment uses rigorous reservoir simulation models to determine the timing, ultimate storage capacity, and potential for enhanced hydrocarbon recovery. Finally, the economic assessment function determines both the field-level and pipeline (transportation) costs for CO2 sequestration in a given reservoir. The screening tool was peer reviewed at an Electric Power Research Institute (EPRI) technical meeting in March 2009. A number of useful observations and recommendations emerged from the Workshop on the costs of CO2 transport and storage that could be readily incorporated into a commercial version of the Screening Tool in a Phase III SBIR.

  8. Differential algebras with remainder and rigorous proofs of long-term stability

    International Nuclear Information System (INIS)

    Berz, Martin

    1997-01-01

    It is shown how in addition to determining Taylor maps of general optical systems, it is possible to obtain rigorous interval bounds for the remainder term of the n-th order Taylor expansion. To this end, the three elementary operations of addition, multiplication, and differentiation in the Differential Algebraic approach are augmented by suitable interval operations in such a way that a remainder bound of the sum, product, and derivative is obtained from the Taylor polynomial and remainder bound of the operands. The method can be used to obtain bounds for the accuracy with which a Taylor map represents the true map of the particle optical system. In a more general sense, it is also useful for a variety of other numerical problems, including rigorous global optimization of highly complex functions. Combined with methods to obtain pseudo-invariants of repetitive motion and extensions of the Lyapunov- and Nekhoroshev stability theory, the latter can be used to guarantee stability for storage rings and other weakly nonlinear systems

  9. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    Science.gov (United States)

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org ). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  10. Rigorous Combination of GNSS and VLBI: How it Improves Earth Orientation and Reference Frames

    Science.gov (United States)

    Lambert, S. B.; Richard, J. Y.; Bizouard, C.; Becker, O.

    2017-12-01

    Current reference series (C04) of the International Earth Rotation and Reference Systems Service (IERS) are produced by a weighted combination of Earth orientation parameter (EOP) time series built up by the combination centers of each technique (VLBI, GNSS, laser ranging, DORIS). In the future, we plan to derive EOP from a rigorous combination of the normal equation systems of the four techniques. We present here the results of a rigorous combination of VLBI and GNSS pre-reduced, constraint-free normal equations with the DYNAMO geodetic analysis software package developed and maintained by the French GRGS (Groupe de Recherche en Géodésie Spatiale). The normal equations used are those produced separately by the IVS and IGS combination centers, to which we apply our own minimal constraints. We address the usefulness of such a method with respect to the classical, a posteriori, combination method, and we show whether EOP determinations are improved. In particular, we implement external validations of the EOP series based on comparison with geophysical excitation and examination of the covariance matrices. Finally, we address the potential of the technique for the next generation of celestial reference frames, which are currently determined by VLBI only.
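
    A rigorous combination at the normal-equation level amounts to stacking the constraint-free systems of the individual techniques before inversion, rather than averaging separately estimated parameter series. A minimal numerical sketch of that stacking step is shown below, with random matrices standing in for the real IVS and IGS normal equations.

        import numpy as np

        rng = np.random.default_rng(42)

        def toy_normal_equation(n_params, n_obs):
            """Build N = A^T A and b = A^T y for a toy least-squares problem."""
            A = rng.standard_normal((n_obs, n_params))
            y = rng.standard_normal(n_obs)
            return A.T @ A, A.T @ y

        # Stand-ins for the pre-reduced VLBI and GNSS systems over common parameters.
        N_vlbi, b_vlbi = toy_normal_equation(5, 200)
        N_gnss, b_gnss = toy_normal_equation(5, 800)

        # Rigorous combination: add normal matrices and right-hand sides, then solve once.
        N_combined = N_vlbi + N_gnss
        b_combined = b_vlbi + b_gnss
        x_combined = np.linalg.solve(N_combined, b_combined)
        covariance = np.linalg.inv(N_combined)    # joint covariance of the combined estimates
        print(x_combined)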

  11. The International Standards Organisation offshore structures standard

    International Nuclear Information System (INIS)

    Snell, R.O.

    1994-01-01

    The International Standards Organisation has initiated a program to develop a suite of ISO Codes and Standards for the Oil Industry. The Offshore Structures Standard is one of seven topics being addressed. The scope of the standard will encompass fixed steel and concrete structures, floating structures, Arctic structures and the site specific assessment of mobile drilling and accommodation units. The standard will use as base documents the existing recommended practices and standards most frequently used for each type of structure, and will develop them to incorporate best published and recognized practice and knowledge where it provides a significant improvement on the base document. Work on the Code has commenced under the direction of an internationally constituted sub-committee comprising representatives from most of the countries with a substantial offshore oil and gas industry. This paper outlines the background to the code and the format, content and work program

  12. Advertising Content

    OpenAIRE

    Simon P. Anderson; Régis Renault

    2002-01-01

    Empirical evidence suggests that most advertisements contain little direct information. Many do not mention prices. We analyze a firm's choice of advertising content and the information disclosed to consumers. A firm advertises only product information, price information, or both, and prefers to convey only limited product information if possible. Extending the "persuasion" game, we show that quality information takes precedence over price information and horizontal product information…

  13. Technical note - the incorporation of ash content into gas content

    Energy Technology Data Exchange (ETDEWEB)

    Creech, M.; Mahoney, M. [Powercoal Pty. Ltd., Budgewoi, NSW (Australia)

    1995-12-31

    For gas content analysis in recent years, it has been standard procedure to report results on a 'dry ash free' (daf) basis, under the assumption that gas only adsorbs onto coaly material. In order to test the relationship between ash and gas content, samples of various rock types were taken from two drillholes in the Newcastle Coalfield. The results of this study confirmed the correlation between gas content and ash, providing an accurate means of relating gas contents for all carbonaceous rock types. 4 refs., 5 figs., 1 tab.
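
    Reporting on a dry-ash-free basis divides the measured gas content by the combustible fraction of the sample; a small sketch of that standard conversion is shown below (the numbers are illustrative, not taken from the note).

        def gas_content_daf(gas_measured, ash_fraction, moisture_fraction=0.0):
            """Convert an as-analysed gas content (e.g. m3/t) to a dry-ash-free basis."""
            combustible = 1.0 - ash_fraction - moisture_fraction
            if combustible <= 0.0:
                raise ValueError("ash plus moisture fraction must be below 1")
            return gas_measured / combustible

        # Example: 6.0 m3/t measured in a sample with 25% ash and 3% moisture.
        print(round(gas_content_daf(6.0, 0.25, 0.03), 2), "m3/t (daf)")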

  14. Applicability of the ''review of content'' under the law on standard terms and conditions to power availability and power supply contracts between producers and resellers; Die Anwendbarkeit der AGB-rechtlichen Inhaltskontrolle auf Leistungsvorhaltungs- und Stromliefervertraege zwischen Erzeuger und Weiterverkaeufer

    Energy Technology Data Exchange (ETDEWEB)

    Putzka, Florian

    2009-07-01

    This study deals with the applicability under the law on standard terms and conditions of the ''review of content'' to power availability and power supply contracts between producers and resellers. It addresses fundamental issues concerning the encroachment on the private autonomy of those involved as well as the hazard to the contract certainty of their contracts that is posed by the instrument of judicial review of content under the law on standard terms and conditions. The purpose of the study is to determine to what extent power availability and power supply contracts between producers and resellers are subject to a review of content by a civil judge, be it on the strength of the law on standard terms and conditions or through what is known as an extended review of content. Power supply contracts between power suppliers and final customers are left out of consideration here, even though the authors are aware of the fact that as a result of the amendment to the Energy Economy Law in 2005 and the ordinances following in its wake it is becoming more and more common for such contracts to be amenable to the law on standard terms and conditions.

  15. Dosimetric effects of edema in permanent prostate seed implants: a rigorous solution

    International Nuclear Information System (INIS)

    Chen Zhe; Yue Ning; Wang Xiaohong; Roberts, Kenneth B.; Peschel, Richard; Nath, Ravinder

    2000-01-01

    Purpose: To derive a rigorous analytic solution to the dosimetric effects of prostate edema so that its impact on conventional pre-implant and post-implant dosimetry can be studied for any given radioactive isotope and edema characteristics. Methods and Materials: The edema characteristics observed by Waterman et al. (Int. J. Rad. Onc. Biol. Phys, 41:1069-1077; 1998) were used to model the time evolution of the prostate and the seed locations. The total dose to any part of the prostate tissue from a seed implant was calculated analytically by parameterizing the dose fall-off from a radioactive seed as a single inverse power function of distance, with proper account of the edema-induced time evolution. The dosimetric impact of prostate edema was determined by comparing the dose calculated with full consideration of prostate edema to that calculated with the conventional dosimetry approach, where the seed locations and the target volume are assumed to be stationary. Results: A rigorous analytic solution for the relative dosimetric effects of prostate edema was obtained. This solution proved explicitly that the relative dosimetric effects of edema, as found in the previous numerical studies by Yue et al. (Int. J. Radiat. Oncol. Biol. Phys. 43, 447-454, 1999), are independent of the size and the shape of the implant target volume and are independent of the number and the locations of the seeds implanted. It also showed that the magnitude of the relative dosimetric effects is independent of the location of the dose evaluation point within the edematous target volume. This implies that the relative dosimetric effects of prostate edema are universal with respect to a given isotope and edema characteristic. A set of master tables of the relative dosimetric effects of edema was obtained for a wide range of edema characteristics for both 125I and 103Pd prostate seed implants. Conclusions: A rigorous analytic solution of the relative dosimetric effects of prostate edema has been derived.
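
    The calculation described above can be caricatured numerically: integrate an inverse-power dose-rate kernel over time while the seed-to-point distance relaxes as the edema resolves, and compare with the static (no-edema) case. In the sketch below the exponential form of the edema resolution, its magnitude and half-life, and the isotope parameters are all assumptions standing in for the published characterization, not values from the paper.

        import numpy as np
        from scipy.integrate import quad

        LAMBDA_I125 = np.log(2) / (59.4 * 24.0)      # 125I decay constant, per hour

        def seed_distance(r0, t, edema_magnitude=0.2, edema_half_life=240.0):
            """Distance from seed to dose point at time t (h); edema is assumed to
            inflate distances by a factor that resolves exponentially."""
            k = np.log(2) / edema_half_life
            return r0 * (1.0 + edema_magnitude * np.exp(-k * t))

        def dose_rate(r, t, k0=1.0, power=2.0):
            """Single inverse power law in distance, with radioactive decay in time."""
            return k0 * np.exp(-LAMBDA_I125 * t) / r**power

        def total_dose(r0, with_edema=True):
            if with_edema:
                integrand = lambda t: dose_rate(seed_distance(r0, t), t)
            else:
                integrand = lambda t: dose_rate(r0, t)
            value, _ = quad(integrand, 0.0, np.inf)
            return value

        print("relative dosimetric effect:", total_dose(1.0) / total_dose(1.0, False))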

  16. Study of the quality characteristics in cold-smoked salmon (Salmo salar) originating from pre- or post-rigor raw material.

    Science.gov (United States)

    Birkeland, S; Akse, L

    2010-01-01

    Improved slaughtering procedures in the salmon industry have delayed the onset of rigor mortis and thus created a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at the time of processing on quality traits (color, texture, sensory, and microbiological) in injection-salted, cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets produced significant (P<0.05) differences relative to post-rigor processed fillets; however, post-rigor fillets (1477 ± 38 g) had a higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets also differed significantly (P<0.05) from post-rigor fillets (37.8 ± 0.8) and had significantly lower values than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection salting protocols and smoking techniques. © 2010 Institute of Food Technologists®

  17. Paulo Leminski : um estudo sobre o rigor e o relaxo em suas poesias

    OpenAIRE

    Dhynarte de Borba e Albuquerque

    2005-01-01

    This work examines the trajectory of Paulo Leminski's poetry, seeking to establish the terms of its humor, its metalinguistic inquiry and its lyrical self, a poetry that also exhibits traces of the 1970s marginal poetry. He was an author who pursued concretist rigor through the procedures of more or less relaxed everyday speech. The poetic effort of the Curitiba-born Leminski is a "line that never ends": he wrote poems, novels, advertising pieces and song lyrics, and produced translations…

  18. Rigorous decoupling between edge states in frustrated spin chains and ladders

    Science.gov (United States)

    Chepiga, Natalia; Mila, Frédéric

    2018-05-01

    We investigate the occurrence of exact zero modes in one-dimensional quantum magnets of finite length that possess edge states. Building on conclusions first reached in the context of the spin-1/2 XY chain in a field and then for the spin-1 J1-J2 Heisenberg model, we show that the development of incommensurate correlations in the bulk invariably leads to oscillations in the sign of the coupling between edge states, and hence to exact zero energy modes at the crossing points where the coupling between the edge states rigorously vanishes. This is true regardless of the origin of the frustration (e.g., next-nearest-neighbor coupling or biquadratic coupling for the spin-1 chain), of the value of the bulk spin (we report on spin-1/2, spin-1, and spin-2 examples), and of the value of the edge-state emergent spin (spin-1/2 or spin-1).

  19. Using Project Complexity Determinations to Establish Required Levels of Project Rigor

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Thomas D.

    2015-10-01

    This presentation discusses the project complexity determination process that was developed by National Security Technologies, LLC, for the U.S. Department of Energy, National Nuclear Security Administration Nevada Field Office for implementation at the Nevada National Security Site (NNSS). The complexity determination process was developed to address the diversity of NNSS project types, size, and complexity; to fill the need for one procedure but with provision for tailoring the level of rigor to the project type, size, and complexity; and to provide consistent, repeatable, effective application of project management processes across the enterprise; and to achieve higher levels of efficiency in project delivery. These needs are illustrated by the wide diversity of NNSS projects: Defense Experimentation, Global Security, weapons tests, military training areas, sensor development and testing, training in realistic environments, intelligence community support, sensor development, environmental restoration/waste management, and disposal of radioactive waste, among others.

  20. Guidelines for conducting rigorous health care psychosocial cross-cultural/language qualitative research.

    Science.gov (United States)

    Arriaza, Pablo; Nedjat-Haiem, Frances; Lee, Hee Yun; Martin, Shadi S

    2015-01-01

    The purpose of this article is to synthesize and chronicle the authors' experiences as four bilingual and bicultural researchers, each experienced in conducting cross-cultural/cross-language qualitative research. Through narrative descriptions of experiences with Latinos, Iranians, and Hmong refugees, the authors discuss their rewards, challenges, and methods of enhancing rigor, trustworthiness, and transparency when conducting cross-cultural/cross-language research. The authors discuss and explore how to effectively manage cross-cultural qualitative data, how to effectively use interpreters and translators, how to identify best methods of transcribing data, and the role of creating strong community relationships. The authors provide guidelines for health care professionals to consider when engaging in cross-cultural qualitative research.

  1. Release of major ions during rigor mortis development in kid Longissimus dorsi muscle.

    Science.gov (United States)

    Feidt, C; Brun-Bellut, J

    1999-01-01

    Ionic strength plays an important role in post mortem muscle changes. Its increase is due to ion release during the development of rigor mortis. Twelve alpine kids were used to study the effects of chilling and meat pH on ion release. Free ions were measured in Longissimus dorsi muscle by capillary electrophoresis after water extraction. All free ion concentrations increased after death, but there were differences between ions. Temperature was not a factor affecting ion release in contrast to ultimate pH value. Three release mechanisms are believed to coexist: a passive binding to proteins, which stops as pH decreases, an active segregation which stops as ATP disappears and the production of metabolites due to anaerobic glycolysis.

  2. Revisiting the scientific method to improve rigor and reproducibility of immunohistochemistry in reproductive science.

    Science.gov (United States)

    Manuel, Sharrón L; Johnson, Brian W; Frevert, Charles W; Duncan, Francesca E

    2018-04-21

    Immunohistochemistry (IHC) is a robust scientific tool whereby cellular components are visualized within a tissue, and this method has been and continues to be a mainstay for many reproductive biologists. IHC is highly informative if performed and interpreted correctly, but studies have shown that the general use and reporting of appropriate controls in IHC experiments is low. This omission of the scientific method can result in data that lacks rigor and reproducibility. In this editorial, we highlight key concepts in IHC controls and describe an opportunity for our field to partner with the Histochemical Society to adopt their IHC guidelines broadly as researchers, authors, ad hoc reviewers, editorial board members, and editors-in-chief. Such cross-professional society interactions will ensure that we produce the highest quality data as new technologies emerge that still rely upon the foundations of classic histological and immunohistochemical principles.

  3. Rigorous approach to the comparison between experiment and theory in Casimir force measurements

    International Nuclear Information System (INIS)

    Klimchitskaya, G L; Chen, F; Decca, R S; Fischbach, E; Krause, D E; Lopez, D; Mohideen, U; Mostepanenko, V M

    2006-01-01

    In most experiments on the Casimir force the comparison between measurement data and theory was done using the concept of the root-mean-square deviation, a procedure that has been criticized in the literature. Here we propose a special statistical analysis which should be performed separately for the experimental data and for the results of the theoretical computations. In so doing, the random, systematic and total experimental errors are found as functions of separation, taking into account the distribution laws for each error at 95% confidence. Independently, all theoretical errors are combined to obtain the total theoretical error at the same confidence. Finally, the confidence interval for the differences between theoretical and experimental values is obtained as a function of separation. This rigorous approach is applied to two recent experiments on the Casimir effect

  4. Direct integration of the S-matrix applied to rigorous diffraction

    International Nuclear Information System (INIS)

    Iff, W; Lindlein, N; Tishchenko, A V

    2014-01-01

    A novel Fourier method for rigorous diffraction computation at periodic structures is presented. The procedure is based on a differential equation for the S-matrix, which allows direct integration of the S-matrix blocks. This results in a new method in Fourier space, which can be considered as a numerically stable and well-parallelizable alternative to the conventional differential method based on T-matrix integration and subsequent conversions from the T-matrices to S-matrix blocks. Integration of the novel differential equation in an implicit manner is expounded. The applicability of the new method is shown on the basis of 1D periodic structures. It is clear, however, that the new technique can also be applied to arbitrary 2D periodic or periodized structures. The complexity of the new method is O(N³), similar to the conventional differential method, with N being the number of diffraction orders. (fast track communication)

  5. Rigorous description of holograms of particles illuminated by an astigmatic elliptical Gaussian beam

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, Y J; Ren, K F; Coetmellec, S; Lebrun, D, E-mail: fang.ren@coria.f [UMR 6614/CORIA, CNRS and Universite et INSA de Rouen Avenue de l' Universite BP 12, 76801 Saint Etienne du Rouvray (France)

    2009-02-01

    Digital holography is a non-intrusive optical metrology technique well adapted to measuring the size and velocity field of particles in the spray of a fluid. The simplified model of an opaque disk is often used in the treatment of the diagrams, and therefore the refraction and the third-dimension diffraction of the particle are not taken into account. We present in this paper a rigorous description of the holographic diagrams and evaluate the effects of refraction and third-dimension diffraction by comparison with the opaque disk model. It is found that these effects are important when the real part of the refractive index is near unity or when the imaginary part is non-zero but small.

  6. A new method for deriving rigorous results on ππ scattering

    International Nuclear Information System (INIS)

    Caprini, I.; Dita, P.

    1979-06-01

    We develop a new approach to the problem of constraining the ππ scattering amplitudes by means of the axiomatically proved properties of unitarity, analyticity and crossing symmetry. The method is based on the solution of an extremal problem on a convex set of analytic functions and provides a global description of the domain of values taken by any finite number of partial waves at an arbitrary set of unphysical energies, compatible with unitarity, the bounds at complex energies derived from generalized dispersion relations and the crossing integral relations. From this domain we obtain new absolute bounds for the amplitudes as well as rigorous correlations between the values of various partial waves. (author)

  7. Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation

    Science.gov (United States)

    Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe

    2018-04-01

    In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.

  8. Study design elements for rigorous quasi-experimental comparative effectiveness research.

    Science.gov (United States)

    Maciejewski, Matthew L; Curtis, Lesley H; Dowd, Bryan

    2013-03-01

    Quasi-experiments are likely to be the workhorse study design used to generate evidence about the comparative effectiveness of alternative treatments, because of their feasibility, timeliness, affordability and external validity compared with randomized trials. In this review, we outline potential sources of discordance in results between quasi-experiments and experiments, review study design choices that can improve the internal validity of quasi-experiments, and outline innovative data linkage strategies that may be particularly useful in quasi-experimental comparative effectiveness research. There is an urgent need to resolve the debate about the evidentiary value of quasi-experiments since equal consideration of rigorous quasi-experiments will broaden the base of evidence that can be brought to bear in clinical decision-making and governmental policy-making.

  9. Increasing rigor in NMR-based metabolomics through validated and open source tools.

    Science.gov (United States)

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2017-02-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.

  10. A rigorous phenomenological analysis of the ππ scattering lengths

    International Nuclear Information System (INIS)

    Caprini, I.; Dita, P.; Sararu, M.

    1979-11-01

    The constraining power of the present experimental data, combined with the general theoretical knowledge about ππ scattering, upon the scattering lengths of this process, is investigated by means of a rigorous functional method. We take as input the experimental phase shifts and make no hypotheses about the high energy behaviour of the amplitudes, using only absolute bounds derived from axiomatic field theory and exact consequences of crossing symmetry. In the simplest application of the method, involving only the π 0 π 0 S-wave, we explored numerically a number of values proposed by various authors for the scattering lengths a 0 and a 2 and found that no one appears to be especially favoured. (author)

  11. Early rigorous control interventions can largely reduce dengue outbreak magnitude: experience from Chaozhou, China.

    Science.gov (United States)

    Liu, Tao; Zhu, Guanghu; He, Jianfeng; Song, Tie; Zhang, Meng; Lin, Hualiang; Xiao, Jianpeng; Zeng, Weilin; Li, Xing; Li, Zhihao; Xie, Runsheng; Zhong, Haojie; Wu, Xiaocheng; Hu, Wenbiao; Zhang, Yonghui; Ma, Wenjun

    2017-08-02

    Dengue fever is a severe public health challenge in south China. A dengue outbreak was reported in Chaozhou city, China in 2015, and intensified interventions were implemented by the government to control the epidemic. However, it remains unknown to what degree the intensified control measures reduced the size of the epidemic, and when such measures should be initiated to reduce the risk of large dengue outbreaks developing. We selected Xiangqiao district as the study setting because the majority of the indigenous cases (90.6%) in Chaozhou city were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI) data were reported by the local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed) model to assess the effectiveness of the control interventions and to evaluate the effect of intervention timing on the dengue epidemic. A total of 1250 indigenous dengue cases was reported from Xiangqiao district. The SEIR model, using BI as an indicator of the actual control interventions, yielded a total of 1255 dengue cases, close to the reported number (n = 1250). The size and duration of the outbreak were highly sensitive to the intensity and timing of the interventions: the more rigorous and the earlier the control interventions were implemented, the more effective they were. Even when the interventions were initiated several weeks after the onset of the dengue outbreak, they still had a substantial impact on the prevalence and duration of the outbreak. This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.
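
    A minimal SEIR sketch of the kind referred to above is given below; it tracks human compartments only (the study's model additionally uses the Breteau Index to represent vector control), and the population size, transmission, incubation and recovery parameters are placeholders rather than the fitted values.

        import numpy as np
        from scipy.integrate import odeint

        def seir(y, t, beta, sigma, gamma, n):
            s, e, i, r = y
            ds = -beta * s * i / n
            de = beta * s * i / n - sigma * e
            di = sigma * e - gamma * i
            dr = gamma * i
            return ds, de, di, dr

        N = 500_000                                  # district population (assumed)
        y0 = (N - 10, 0.0, 10.0, 0.0)                # seed the outbreak with 10 cases
        t = np.linspace(0, 120, 121)                 # days

        def final_size(beta, incubation=5.9, infectious=5.0):
            sol = odeint(seir, y0, t, args=(beta, 1 / incubation, 1 / infectious, N))
            return sol[-1, 3]                        # cumulative recovered ~ outbreak size

        # Earlier or more rigorous control is mimicked here by a lower effective beta.
        for beta in (0.40, 0.30, 0.22):
            print(f"beta = {beta:.2f}: final size ~ {final_size(beta):,.0f}")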

  12. Bringing scientific rigor to community-developed programs in Hong Kong

    Directory of Open Access Journals (Sweden)

    Fabrizio Cecilia S

    2012-12-01

    Background: This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). Methods: The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Results: Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long-term process. Conclusions: The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait-list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.

  13. Rigorous Multicomponent Reactive Separations Modelling: Complete Consideration of Reaction-Diffusion Phenomena

    International Nuclear Information System (INIS)

    Ahmadi, A.; Meyer, M.; Rouzineau, D.; Prevost, M.; Alix, P.; Laloue, N.

    2010-01-01

    This paper gives the first step of the development of a rigorous multicomponent reactive separation model. Such a model is essential for further optimization of acid gas removal plants (CO2 capture, gas treating, etc.) in terms of size and energy consumption, since chemical solvents are conventionally used. Firstly, two main modelling approaches are presented: the equilibrium-based and the rate-based approaches. Secondly, an extended rate-based model with a rigorous modelling methodology for diffusion-reaction phenomena is proposed. The film theory and the generalized Maxwell-Stefan equations are used in order to characterize multicomponent interactions. The complete chain of chemical reactions is taken into account. The reactions can be kinetically controlled or at chemical equilibrium, and they are considered for both the liquid film and the liquid bulk. Thirdly, the method of numerical resolution is described. Coupling the generalized Maxwell-Stefan equations with chemical equilibrium equations leads to a highly non-linear Differential-Algebraic Equations system known as DAE index 3. The set of equations is discretized with finite differences, as its integration by the Gear method is complex. The resulting algebraic system is solved by the Newton-Raphson method. Finally, the present model and the associated methods of numerical resolution are validated for the example of esterification of methanol. This archetype non-electrolytic system permits an interesting analysis of the reaction impact on mass transfer, especially near the phase interface. The numerical resolution of the model by the Newton-Raphson method gives good results in terms of calculation time and convergence. The simulations show that the impact of reactions at chemical equilibrium and that of kinetically controlled reactions with high kinetics on mass transfer is relatively similar. Moreover, Fick's law is less adapted for multicomponent mixtures where some abnormalities such as counter
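
    The resolution step described above (discretized Maxwell-Stefan and chemical-equilibrium equations solved by Newton-Raphson) can be illustrated in miniature. The sketch below applies a Newton-Raphson iteration to a small arbitrary nonlinear algebraic system; the equations, starting point, and tolerance are placeholders, not the model from the paper.

```python
import numpy as np

def newton_raphson(F, J, x0, tol=1e-10, max_iter=50):
    """Solve F(x) = 0 with an analytical Jacobian J."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = F(x)
        if np.linalg.norm(r) < tol:
            return x
        x = x - np.linalg.solve(J(x), r)   # Newton step: J * dx = -F(x)
    raise RuntimeError("Newton-Raphson did not converge")

# Toy 2x2 system standing in for the discretized film equations.
F = lambda x: np.array([x[0]**2 + x[1] - 3.0,
                        x[0] + x[1]**3 - 5.0])
J = lambda x: np.array([[2*x[0], 1.0],
                        [1.0, 3*x[1]**2]])
print(newton_raphson(F, J, x0=[1.0, 1.0]))
```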

  14. Rigorous upper bounds for transport due to passive advection by inhomogeneous turbulence

    International Nuclear Information System (INIS)

    Krommes, J.A.; Smith, R.A.

    1987-05-01

    A variational procedure, due originally to Howard and explored by Busse and others for self-consistent turbulence problems, is employed to determine rigorous upper bounds for the advection of a passive scalar through an inhomogeneous turbulent slab with arbitrary generalized Reynolds number R and Kubo number K. In the basic version of the method, the steady-state energy balance is used as a constraint; the resulting bound, though rigorous, is independent of K. A pedagogical reference model (one dimension, K = ∞) is described in detail; the bound compares favorably with the exact solution. The direct-interaction approximation is also worked out for this model; it is somewhat more accurate than the bound, but requires considerably more labor to solve. For the basic bound, a general formalism is presented for several dimensions, finite correlation length, and reasonably general boundary conditions. Part of the general method, in which a Green's function technique is employed, applies to self-consistent as well as to passive problems, and thereby generalizes previous results in the fluid literature. The formalism is extended for the first time to include time-dependent constraints, and a bound is deduced which explicitly depends on K and has the correct physical scalings in all regimes of R and K. Two applications from the theory of turbulent plasmas are described: flux in velocity space, and test particle transport in stochastic magnetic fields. For the velocity space problem the simplest bound reproduces Dupree's original scaling for the strong turbulence diffusion coefficient. For the case of stochastic magnetic fields, the scaling of the bounds is described for the magnetic diffusion coefficient as well as for the particle diffusion coefficient in the so-called collisionless, fluid, and double-streaming regimes.

  15. Bringing scientific rigor to community-developed programs in Hong Kong.

    Science.gov (United States)

    Fabrizio, Cecilia S; Hirschmann, Malia R; Lam, Tai Hing; Cheung, Teresa; Pang, Irene; Chan, Sophia; Stewart, Sunita M

    2012-12-31

    This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long term process. The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.

  16. Rigor mortis development at elevated temperatures induces pale exudative turkey meat characteristics.

    Science.gov (United States)

    McKee, S R; Sams, A R

    1998-01-01

    Development of rigor mortis at elevated post-mortem temperatures may contribute to turkey meat characteristics that are similar to those found in pale, soft, exudative pork. To evaluate this effect, 36 Nicholas tom turkeys were processed at 19 wk of age and placed in water at 40, 20, and 0 C immediately after evisceration. Pectoralis muscle samples were taken at 15 min, 30 min, 1 h, 2 h, and 4 h post-mortem and analyzed for R-value (an indirect measure of adenosine triphosphate), glycogen, pH, color, and sarcomere length. At 4 h, the remaining intact Pectoralis muscle was harvested, aged on ice for 23 h, and analyzed for drip loss, cook loss, shear values, and sarcomere length. By 15 min post-mortem, the 40 C treatment had higher R-values, which persisted through 4 h. By 1 h, the 40 C treatment pH and glycogen levels were lower than the 0 C treatment; however, they did not differ from those of the 20 C treatment. Increased L* values indicated that color became more pale by 2 h post-mortem in the 40 C treatment when compared to the 20 and 0 C treatments. Drip loss, cook loss, and shear value were increased whereas sarcomere lengths were decreased as a result of the 40 C treatment. These findings suggested that elevated post-mortem temperatures during processing resulted in acceleration of rigor mortis and biochemical changes in the muscle that produced pale, exudative meat characteristics in turkey.

  17. Early rigorous control interventions can largely reduce dengue outbreak magnitude: experience from Chaozhou, China

    Directory of Open Access Journals (Sweden)

    Tao Liu

    2017-08-01

    Background: Dengue fever is a severe public health challenge in south China. A dengue outbreak was reported in Chaozhou city, China in 2015. Intensified interventions were implemented by the government to control the epidemic. However, it remains unknown to what degree these intensified control measures reduced the size of the epidemic, and when such measures should be initiated to reduce the risk of large dengue outbreaks developing. Methods: We selected Xiangqiao district as the study setting because the majority of the indigenous cases (90.6%) in Chaozhou city were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI) data were reported by the local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed) model to assess the effectiveness of control interventions and to evaluate the effect of intervention timing on the dengue epidemic. Results: A total of 1250 indigenous dengue cases were reported from Xiangqiao district. SEIR modeling using BI as an indicator of the actual control interventions yielded a total of 1255 dengue cases, close to the reported number (n = 1250). The size and duration of the outbreak were highly sensitive to the intensity and timing of interventions: the more rigorous the control interventions and the earlier they were implemented, the more effective they were. Even when the interventions were initiated several weeks after the onset of the outbreak, they still substantially reduced the prevalence and duration of the outbreak. Conclusions: This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.
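
    As a rough illustration of the kind of compartmental model described above, the following is a minimal SEIR sketch in Python with a step reduction in transmission at a chosen intervention day. All parameter values (population size, incubation and recovery rates, baseline transmission rate, and the control factor) are hypothetical placeholders, not values fitted in the Chaozhou study.

```python
import numpy as np

def seir_outbreak(beta0=0.6, sigma=1/5.9, gamma=1/5.0, N=500_000,
                  t_intervention=30, control_factor=0.4, days=180):
    """Forward-Euler SEIR run (1-day steps) with a step reduction in transmission.

    beta0            baseline transmission rate per day (hypothetical)
    sigma            1 / incubation period
    gamma            1 / infectious period
    t_intervention   day on which control measures start
    control_factor   multiplier applied to beta after that day
    """
    S, E, I, R = N - 1.0, 0.0, 1.0, 0.0
    cumulative = 1.0
    for day in range(days):
        beta = beta0 * (control_factor if day >= t_intervention else 1.0)
        new_inf = beta * S * I / N          # S -> E
        new_onset = sigma * E               # E -> I
        new_recov = gamma * I               # I -> R
        S -= new_inf
        E += new_inf - new_onset
        I += new_onset - new_recov
        R += new_recov
        cumulative += new_onset
    return cumulative

# Earlier control -> smaller final size, in line with the study's qualitative conclusion.
for start in (20, 40, 60):
    print(f"intervention on day {start}: ~{seir_outbreak(t_intervention=start):,.0f} cumulative cases")
```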

  18. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical

  19. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Directory of Open Access Journals (Sweden)

    David C Pavlacky

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous

  20. Guest Editorial: The "NGSS" Case Studies: All Standards, All Students

    Science.gov (United States)

    Miller, Emily; Januszyk, Rita

    2014-01-01

    To teachers of diverse classrooms, more rigorous standards in science may seem intimidating, as the past years of rigid accountability have failed to close the achievement gap. However, the "Next Generation Science Standards" ("NGSS") were written with all students in mind, with input and full review by a diversity and equity…

  1. The Standard Model course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    Suggested Readings: Aspects of Quantum Chromodynamics/A Pich, arXiv:hep-ph/0001118. - The Standard Model of Electroweak Interactions/A Pich, arXiv:hep-ph/0502010. - The Standard Model of Particle Physics/A Pich The Standard Model of Elementary Particle Physics will be described. A detailed discussion of the particle content, structure and symmetries of the theory will be given, together with an overview of the most important experimental facts which have established this theoretical framework as the Standard Theory of particle interactions.

  2. The effect of rigor mortis on the passage of erythrocytes and fluid through the myocardium of isolated dog hearts.

    Science.gov (United States)

    Nevalainen, T J; Gavin, J B; Seelye, R N; Whitehouse, S; Donnell, M

    1978-07-01

    The effect of normal and artificially induced rigor mortis on the vascular passage of erythrocytes and fluid through isolated dog hearts was studied. Increased rigidity of 6-mm thick transmural sections through the centre of the posterior papillary muscle was used as an indication of rigor. The perfusibility of the myocardium was tested by injecting 10 ml of 1% sodium fluorescein in Hanks solution into the circumflex branch of the left coronary artery. In prerigor hearts (20 minute incubation) fluorescein perfused the myocardium evenly whether or not it was preceded by an injection of 10 ml of heparinized dog blood. Rigor mortis developed in all hearts after 90 minutes incubation or within 20 minutes of perfusing the heart with 50 ml of 5 mM iodoacetate in Hanks solution. Fluorescein injected into hearts in rigor did not enter the posterior papillary muscle and adjacent subendocardium whether or not it was preceded by heparinized blood. Thus the vascular occlusion caused by rigor in the dog heart appears to be so effective that it prevents flow into the subendocardium of small soluble ions such as fluorescein.

  3. Layout optimization of DRAM cells using rigorous simulation model for NTD

    Science.gov (United States)

    Jeon, Jinhyuck; Kim, Shinyoung; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Kuechler, Bernd; Zimmermann, Rainer; Muelders, Thomas; Klostermann, Ulrich; Schmoeller, Thomas; Do, Mun-hoe; Choi, Jung-Hoe

    2014-03-01

    scanning electron microscope (SEM) measurements. High resist impact and difficult model data acquisition demand a simulation model that is capable of extrapolating reliably beyond its calibration dataset. We use rigorous simulation models to provide that predictive performance. We have discussed the need for a rigorous mask optimization process for DRAM contact cell layouts, yielding mask layouts that are optimal in process performance, mask manufacturability and accuracy. In this paper, we have shown the step-by-step process from analytical illumination source derivation, through NTD- and application-tailored model calibration, to layout optimization such as OPC and SRAF placement. Finally, the work has been verified with simulation and experimental results on wafer.

  4. A CUMULATIVE MIGRATION METHOD FOR COMPUTING RIGOROUS TRANSPORT CROSS SECTIONS AND DIFFUSION COEFFICIENTS FOR LWR LATTICES WITH MONTE CARLO

    Energy Technology Data Exchange (ETDEWEB)

    Zhaoyuan Liu; Kord Smith; Benoit Forget; Javier Ortensi

    2016-05-01

    A new method for computing homogenized assembly neutron transport cross sections and diffusion coefficients that is both rigorous and computationally efficient is proposed in this paper. In the limit of a homogeneous hydrogen slab, the new method is equivalent to the long-used, and only-recently-published CASMO transport method. The rigorous method is used to demonstrate the sources of inaccuracy in the commonly applied “out-scatter” transport correction. It is also demonstrated that the newly developed method is directly applicable to lattice calculations performed by Monte Carlo and is capable of computing rigorous homogenized transport cross sections for arbitrarily heterogeneous lattices. Comparisons of several common transport cross section approximations are presented for a simple problem of infinite medium hydrogen. The new method has also been applied in computing 2-group diffusion data for an actual PWR lattice from the BEAVRS benchmark.
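
    For orientation, the conventional “out-scatter” transport correction mentioned above amounts to Σ_tr = Σ_t − μ̄·Σ_s with D = 1/(3Σ_tr). The snippet below simply evaluates that textbook approximation for made-up one-group cross sections; it is not the cumulative migration method developed in the paper.

```python
def outscatter_transport(sigma_t, sigma_s, mu_bar):
    """Textbook out-scatter transport correction (one-group).

    sigma_t  total macroscopic cross section (1/cm)
    sigma_s  scattering macroscopic cross section (1/cm)
    mu_bar   average scattering cosine
    """
    sigma_tr = sigma_t - mu_bar * sigma_s      # transport-corrected cross section
    diffusion_coeff = 1.0 / (3.0 * sigma_tr)   # diffusion coefficient (cm)
    return sigma_tr, diffusion_coeff

# Illustrative, made-up hydrogen-like numbers, not BEAVRS data.
sigma_tr, D = outscatter_transport(sigma_t=1.10, sigma_s=1.05, mu_bar=0.55)
print(f"Sigma_tr = {sigma_tr:.3f} 1/cm, D = {D:.3f} cm")
```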

  5. Building an Evidence Base to Inform Interventions for Pregnant and Parenting Adolescents: A Call for Rigorous Evaluation

    Science.gov (United States)

    Burrus, Barri B.; Scott, Alicia Richmond

    2012-01-01

    Adolescent parents and their children are at increased risk for adverse short- and long-term health and social outcomes. Effective interventions are needed to support these young families. We studied the evidence base and found a dearth of rigorously evaluated programs. Strategies from successful interventions are needed to inform both intervention design and policies affecting these adolescents. The lack of rigorous evaluations may be attributable to inadequate emphasis on and sufficient funding for evaluation, as well as to challenges encountered by program evaluators working with this population. More rigorous program evaluations are urgently needed to provide scientifically sound guidance for programming and policy decisions. Evaluation lessons learned have implications for other vulnerable populations. PMID:22897541

  6. Revisiting the constant growth angle: Estimation and verification via rigorous thermal modeling

    Science.gov (United States)

    Virozub, Alexander; Rasin, Igal G.; Brandon, Simon

    2008-12-01

    Methods for estimating growth angle (θgr) values, based on the a posteriori analysis of directionally solidified material (e.g. drops), often involve assumptions of negligible gravitational effects as well as a planar solid/liquid interface during solidification. We relax both of these assumptions when using experimental drop shapes from the literature to estimate the relevant growth angles at the initial stages of solidification. Assumed to be constant, we use these values as input into a rigorous heat transfer and solidification model of the growth process. This model, which is shown to reproduce the experimental shape of a solidified sessile water drop using the literature value of θgr = 0°, yields excellent agreement with experimental profiles using our estimated values for silicon (θgr = 10°) and germanium (θgr = 14.3°) solidifying on an isotropic crystalline surface. The effect of gravity on the solidified drop shape is found to be significant in the case of germanium, suggesting that gravity should either be included in the analysis or that care should be taken that the relevant Bond number is truly small enough in each measurement. The planar solidification interface assumption is found to be unjustified. Although this issue is important when simulating the inflection point in the profile of the solidified water drop, there are indications that solidified drop shapes (at least in the case of silicon) may be fairly insensitive to the shape of this interface.

  7. A Rigorous Investigation on the Ground State of the Penson-Kolb Model

    Science.gov (United States)

    Yang, Kai-Hua; Tian, Guang-Shan; Han, Ru-Qi

    2003-05-01

    By using either numerical calculations or analytical methods, such as the bosonization technique, the ground state of the Penson-Kolb model has been previously studied by several groups. Some physicists argued that, as far as the existence of superconductivity in this model is concerned, it is canonically equivalent to the negative-U Hubbard model. However, others did not agree. In the present paper, we shall investigate this model by an independent and rigorous approach. We show that the ground state of the Penson-Kolb model is nondegenerate and has a nonvanishing overlap with the ground state of the negative-U Hubbard model. Furthermore, we also show that the ground states of both the models have the same good quantum numbers and may have superconducting long-range order at the same momentum q = 0. Our results support the equivalence between these models. The project was partially supported by the Special Funds for Major State Basic Research Projects (G20000365) and the National Natural Science Foundation of China under Grant No. 10174002.

  8. Complexities and Controversies in Himalayan Research: A Call for Collaboration and Rigor for Better Data

    Directory of Open Access Journals (Sweden)

    Surendra P. Singh

    2015-11-01

    The Himalaya range encompasses enormous variation in elevation, precipitation, biodiversity, and patterns of human livelihoods. These mountains modify the regional climate in complex ways; the ecosystem services they provide influence the lives of almost 1 billion people in 8 countries. However, our understanding of these ecosystems remains rudimentary. The 2007 Intergovernmental Panel on Climate Change report that erroneously predicted a date for widespread glacier loss exposed how little was known of Himalayan glaciers. Recent research shows how variably glaciers respond to climate change in different Himalayan regions. Alarmist theories are not new. In the 1980s, the Theory of Himalayan Degradation warned of complete forest loss and devastation of downstream areas, an eventuality that never occurred. More recently, the debate on hydroelectric construction appears driven by passions rather than science. Poor data, hasty conclusions, and bad science plague Himalayan research. Rigorous sampling, involvement of civil society in data collection, and long-term collaborative research involving institutions from across the Himalaya are essential to improve knowledge of this region.

  9. Rigorous Mathematical Thinking Approach to Enhance Students’ Mathematical Creative and Critical Thinking Abilities

    Science.gov (United States)

    Hidayat, D.; Nurlaelah, E.; Dahlan, J. A.

    2017-09-01

    Mathematical creative thinking and critical thinking are two abilities that need to be developed in the learning of mathematics. Therefore, efforts need to be made in the design of learning that is capable of developing both capabilities. The purpose of this research is to examine the mathematical creative and critical thinking abilities of students who received the rigorous mathematical thinking (RMT) approach and students who received an expository approach. This research was a quasi-experiment with a pretest-posttest control group design. The population was all 11th-grade students in one of the senior high schools in Bandung. The results showed that the achievement of mathematical creative and critical thinking abilities of students who received RMT was better than that of students who received the expository approach. The use of psychological tools and mediation, with the criteria of intentionality, reciprocity, and mediation of meaning, in RMT helps students develop the conditions for critical and creative processes. This achievement contributes to the development of integrated learning design for students’ critical and creative thinking processes.

  10. A Rigorous Theory of Many-Body Prethermalization for Periodically Driven and Closed Quantum Systems

    Science.gov (United States)

    Abanin, Dmitry; De Roeck, Wojciech; Ho, Wen Wei; Huveneers, François

    2017-09-01

    Prethermalization refers to the transient phenomenon where a system thermalizes according to a Hamiltonian that is not the generator of its evolution. We provide here a rigorous framework for quantum spin systems where prethermalization is exhibited for very long times. First, we consider quantum spin systems under periodic driving at high frequency ν. We prove that up to a quasi-exponential time τ* ~ exp(cν/log³ν), the system barely absorbs energy. Instead, there is an effective local Hamiltonian D̂ that governs the time evolution up to τ*, and hence this effective Hamiltonian is a conserved quantity up to τ*. Next, we consider systems without driving, but with a separation of energy scales in the Hamiltonian. A prime example is the Fermi-Hubbard model where the interaction U is much larger than the hopping J. Also here we prove the emergence of an effective conserved quantity, different from the Hamiltonian, up to a time τ* that is (almost) exponential in U/J.

  11. Coupling of Rigor Mortis and Intestinal Necrosis during C. elegans Organismal Death

    Directory of Open Access Journals (Sweden)

    Evgeniy R. Galimov

    2018-03-01

    Organismal death is a process of systemic collapse whose mechanisms are less well understood than those of cell death. We previously reported that death in C. elegans is accompanied by a calcium-propagated wave of intestinal necrosis, marked by a wave of blue autofluorescence (death fluorescence). Here, we describe another feature of organismal death, a wave of body wall muscle contraction, or death contraction (DC). This phenomenon is accompanied by a wave of intramuscular Ca2+ release and, subsequently, of intestinal necrosis. Correlation of directions of the DC and intestinal necrosis waves implies coupling of these death processes. Long-lived insulin/IGF-1-signaling mutants show reduced DC and delayed intestinal necrosis, suggesting possible resistance to organismal death. DC resembles mammalian rigor mortis, a postmortem necrosis-related process in which Ca2+ influx promotes muscle hyper-contraction. In contrast to mammals, DC is an early rather than a late event in C. elegans organismal death.

  12. Inosine-5'-monophosphate is a candidate agent to resolve rigor mortis of skeletal muscle.

    Science.gov (United States)

    Matsuishi, Masanori; Tsuji, Mariko; Yamaguchi, Megumi; Kitamura, Natsumi; Tanaka, Sachi; Nakamura, Yukinobu; Okitani, Akihiro

    2016-11-01

    The object of the present study was to reveal the action of inosine-5'-monophosphate (IMP) toward myofibrils in postmortem muscles. IMP solubilized isolated actomyosin within a narrow range of KCl concentration, 0.19-0.20 mol/L, because of the dissociation of actomyosin into actin and myosin, but it did not solubilize the proteins in myofibrils with 0.2 mol/L KCl. However, IMP could solubilize both proteins in myofibrils with 0.2 mol/L KCl in the presence of 1 mmol/L pyrophosphate or 1.0-3.3 mmol/L adenosine-5'-diphosphate (ADP). Thus, we presumed that pyrophosphate and ADP released thin filaments composed of actin, and thick filaments composed of myosin from restraints of myofibrils, and then both filaments were solubilized through the IMP-induced dissociation of actomyosin. Thus, we concluded that IMP is a candidate agent to resolve rigor mortis because of its ability to break the association between thick and thin filaments. © 2016 Japanese Society of Animal Science.

  13. Alternative pre-rigor foreshank positioning can improve beef shoulder muscle tenderness.

    Science.gov (United States)

    Grayson, A L; Lawrence, T E

    2013-09-01

    Thirty beef carcasses were harvested and the foreshank of each side was independently positioned (cranial, natural, parallel, or caudal) 1 h post-mortem to determine the effect of foreshank angle at rigor mortis on the sarcomere length and tenderness of six beef shoulder muscles. The infraspinatus (IS), pectoralis profundus (PP), serratus ventralis (SV), supraspinatus (SS), teres major (TM) and triceps brachii (TB) were excised 48 h post-mortem for Warner-Bratzler shear force (WBSF) and sarcomere length evaluations. All muscles except the SS had altered (P<0.05) sarcomere lengths between positions; the cranial position resulted in the longest sarcomeres for the SV and TB muscles whilst the natural position had longer sarcomeres for the PP and TM muscles. The SV from the cranial position had lower (P<0.05) shear than the caudal position and TB from the natural position had lower (P<0.05) shear than the parallel or caudal positions. Sarcomere length was moderately correlated (r=-0.63; P<0.01) to shear force. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Coupling of Rigor Mortis and Intestinal Necrosis during C. elegans Organismal Death.

    Science.gov (United States)

    Galimov, Evgeniy R; Pryor, Rosina E; Poole, Sarah E; Benedetto, Alexandre; Pincus, Zachary; Gems, David

    2018-03-06

    Organismal death is a process of systemic collapse whose mechanisms are less well understood than those of cell death. We previously reported that death in C. elegans is accompanied by a calcium-propagated wave of intestinal necrosis, marked by a wave of blue autofluorescence (death fluorescence). Here, we describe another feature of organismal death, a wave of body wall muscle contraction, or death contraction (DC). This phenomenon is accompanied by a wave of intramuscular Ca2+ release and, subsequently, of intestinal necrosis. Correlation of directions of the DC and intestinal necrosis waves implies coupling of these death processes. Long-lived insulin/IGF-1-signaling mutants show reduced DC and delayed intestinal necrosis, suggesting possible resistance to organismal death. DC resembles mammalian rigor mortis, a postmortem necrosis-related process in which Ca2+ influx promotes muscle hyper-contraction. In contrast to mammals, DC is an early rather than a late event in C. elegans organismal death. VIDEO ABSTRACT. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  15. Rigor mortis and the epileptology of Charles Bland Radcliffe (1822-1889).

    Science.gov (United States)

    Eadie, M J

    2007-03-01

    Charles Bland Radcliffe (1822-1889) was one of the physicians who made major contributions to the literature on epilepsy in the mid-19th century, when the modern understanding of the disorder was beginning to emerge, particularly in England. His experimental work was concerned with the electrical properties of frog muscle and nerve. Early in his career he related his experimental findings to the phenomenon of rigor mortis and concluded that, contrary to the general belief of the time, muscle contraction depended on the cessation of nerve input, and muscle relaxation on its presence. He adhered to this counter-intuitive interpretation throughout his life and, based on it, produced an epileptology that was very different from those of his contemporaries and successors. His interpretations were ultimately without any direct influence on the advance of knowledge. However, his idea that withdrawal of an inhibitory process released previously suppressed muscular contractile powers, when applied to the brain rather than the periphery of the nervous system, permitted Hughlings Jackson to explain certain psychological phenomena that accompany or follow some epileptic events. As well, Radcliffe was one of the chief early advocates for potassium bromide, the first effective anticonvulsant.

  16. Improving students’ mathematical critical thinking through rigorous teaching and learning model with informal argument

    Science.gov (United States)

    Hamid, H.

    2018-01-01

    The purpose of this study is to analyze the improvement of students’ mathematical critical thinking (CT) ability in a Real Analysis course using the Rigorous Teaching and Learning (RTL) model with informal argument. In addition, this research also attempted to understand students’ CT in relation to their initial mathematical ability (IMA). This study was conducted at a private university in academic year 2015/2016. The study employed the quasi-experimental method with a pretest-posttest control group design. The participants were 83 students, of whom 43 were in the experimental group and 40 in the control group. The findings showed that students in the experimental group outperformed students in the control group on mathematical CT ability across IMA levels (high, medium, low) in learning Real Analysis. In addition, for students with medium IMA, the improvement in mathematical CT ability of those exposed to the RTL model with informal argument was greater than that of those exposed to conventional instruction (CI). There was also no interaction effect between the learning model (RTL or CI) and IMA level (high, medium, or low) on the improvement of mathematical CT ability. Finally, at all IMA levels (high, medium, and low), students exposed to the RTL model with informal argument showed significantly greater achievement on all indicators of mathematical CT ability than students exposed to CI.

  17. Control group design: enhancing rigor in research of mind-body therapies for depression.

    Science.gov (United States)

    Kinser, Patricia Anne; Robins, Jo Lynne

    2013-01-01

    Although a growing body of research suggests that mind-body therapies may be appropriate to integrate into the treatment of depression, studies consistently lack methodological sophistication particularly in the area of control groups. In order to better understand the relationship between control group selection and methodological rigor, we provide a brief review of the literature on control group design in yoga and tai chi studies for depression, and we discuss challenges we have faced in the design of control groups for our recent clinical trials of these mind-body complementary therapies for women with depression. To address the multiple challenges of research about mind-body therapies, we suggest that researchers should consider 4 key questions: whether the study design matches the research question; whether the control group addresses performance, expectation, and detection bias; whether the control group is ethical, feasible, and attractive; and whether the control group is designed to adequately control for nonspecific intervention effects. Based on these questions, we provide specific recommendations about control group design with the goal of minimizing bias and maximizing validity in future research.

  18. Methodological Challenges in Sustainability Science: A Call for Method Plurality, Procedural Rigor and Longitudinal Research

    Directory of Open Access Journals (Sweden)

    Henrik von Wehrden

    2017-02-01

    Sustainability science encompasses a unique field that is defined through its purpose, the problem it addresses, and its solution-oriented agenda. However, this orientation creates significant methodological challenges. In this discussion paper, we conceptualize sustainability problems as wicked problems to tease out the key challenges that sustainability science is facing if scientists intend to deliver on its solution-oriented agenda. Building on the available literature, we discuss three aspects that demand increased attention for advancing sustainability science: 1) methods with higher diversity and complementarity are needed to increase the chance of deriving solutions to the unique aspects of wicked problems; for instance, mixed methods approaches are potentially better suited to allow for an approximation of solutions, since they cover wider arrays of knowledge; 2) methodologies capable of dealing with wicked problems demand strict procedural and ethical guidelines, in order to ensure their integration potential; for example, learning from solution implementation in different contexts requires increased comparability between research approaches while carefully addressing issues of legitimacy and credibility; and 3) approaches are needed that allow for longitudinal research, since wicked problems are continuous and solutions can only be diagnosed in retrospect; for example, complex dynamics of wicked problems play out across temporal patterns that are not necessarily aligned with the common timeframe of participatory sustainability research. Taken together, we call for plurality in methodologies, emphasizing procedural rigor and the necessity of continuous research to effectively address wicked problems as well as methodological challenges in sustainability science.

  19. A TRADITIONAL FALSE PROBLEM: THE RIGORISM OF KANTIAN MORAL AND POLITICAL PHILOSOPHY. THE CASE OF VERACITY

    Directory of Open Access Journals (Sweden)

    MIHAI NOVAC

    2012-05-01

    According to many of its traditional critics, the main weakness in Kantian moral-political philosophy resides in its impossibility of admitting exceptions. In nuce, all these critical positions have converged, despite their reciprocal heterogeneity, in the so-called accusation of moral rigorism (unjustly, I would say) directed against Kant’s moral and political perspective. As such, basically, I will seek to defend Kant against this type of criticism, by showing that any perspective attempting to evaluate Kant’s ethics on the grounds of its capacity or incapacity to admit exceptions is a priori doomed to lack of sense, in its two logical alternatives, i.e. either as nonsense (predicating about empty notions) or as tautology (formulating ad hoc definitions and criteria with respect to Kant’s system and then claiming that it does not hold with respect to them). Essentially, I will try to show that Kantian ethics can organically immunize itself epistemologically against any such so-called antirigorist criticism.

  20. Rigorous Training of Dogs Leads to High Accuracy in Human Scent Matching-To-Sample Performance.

    Directory of Open Access Journals (Sweden)

    Sophie Marchal

    Human scent identification is based on a matching-to-sample task in which trained dogs are required to compare a scent sample collected from an object found at a crime scene to that of a suspect. Based on dogs' greater olfactory ability to detect and process odours, this method has been used in forensic investigations to identify the odour of a suspect at a crime scene. The excellent reliability and reproducibility of the method largely depend on rigor in dog training. The present study describes the various steps of training that lead to high sensitivity scores, with dogs matching samples with 90% efficiency when the complexity of the scents presented during the task in the sample is similar to that presented in the lineups, and specificity reaching a ceiling, with no false alarms in human scent matching-to-sample tasks. This high level of accuracy ensures reliable results in judicial human scent identification tests. Also, our data should convince law enforcement authorities to use these results as official forensic evidence when dogs are trained appropriately.

  1. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.

    1992-01-01

    This paper reports that the intent of this study was to rigorously look at all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2500) to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in time and manpower required to complete the study were also realized.
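
    As a toy illustration of the probabilistic decision-tree evaluation described above, the sketch below rolls expected net present value back through a small nested tree. The tree structure, probabilities, and NPV figures are invented for the example and have nothing to do with the Rangely Weber Sand Unit study.

```python
def expected_value(node):
    """Roll back a decision tree given as nested dicts.

    Chance nodes:   {"type": "chance", "branches": [(prob, subtree), ...]}
    Decision nodes: {"type": "decision", "options": {name: subtree, ...}}
    Leaves:         a plain number (NPV in $MM).
    """
    if isinstance(node, (int, float)):
        return node
    if node["type"] == "chance":
        return sum(p * expected_value(child) for p, child in node["branches"])
    if node["type"] == "decision":
        # A rational decision maker picks the best option at each decision node.
        return max(expected_value(child) for child in node["options"].values())
    raise ValueError("unknown node type")

# Hypothetical expansion decision with an uncertain outcome on each branch.
tree = {"type": "decision", "options": {
    "expand_now": {"type": "chance", "branches": [(0.6, 120.0), (0.4, -30.0)]},
    "defer":      {"type": "chance", "branches": [(0.6,  70.0), (0.4,  10.0)]},
}}
print(f"best-strategy expected NPV: {expected_value(tree):.1f} $MM")
```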

  2. Rigorous numerical study of strong microwave photon-magnon coupling in all-dielectric magnetic multilayers

    Energy Technology Data Exchange (ETDEWEB)

    Maksymov, Ivan S., E-mail: ivan.maksymov@uwa.edu.au [School of Physics M013, The University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia); ARC Centre of Excellence for Nanoscale BioPhotonics, School of Applied Sciences, RMIT University, Melbourne, VIC 3001 (Australia); Hutomo, Jessica; Nam, Donghee; Kostylev, Mikhail [School of Physics M013, The University of Western Australia, 35 Stirling Highway, Crawley, WA 6009 (Australia)

    2015-05-21

    We demonstrate theoretically a ∼350-fold local enhancement of the intensity of the in-plane microwave magnetic field in multilayered structures made from a magneto-insulating yttrium iron garnet (YIG) layer sandwiched between two non-magnetic layers with a high dielectric constant matching that of YIG. The enhancement is predicted for the excitation regime when the microwave magnetic field is induced inside the multilayer by the transducer of a stripline Broadband Ferromagnetic Resonance (BFMR) setup. By means of a rigorous numerical solution of the Landau-Lifshitz-Gilbert equation consistently with Maxwell's equations, we investigate the magnetisation dynamics in the multilayer. We reveal a strong photon-magnon coupling, which manifests itself as anti-crossing of the ferromagnetic resonance magnon mode supported by the YIG layer and the electromagnetic resonance mode supported by the whole multilayered structure. The frequency of the magnon mode depends on the external static magnetic field, which in our case is applied tangentially to the multilayer in the direction perpendicular to the microwave magnetic field induced by the stripline of the BFMR setup. The frequency of the electromagnetic mode is independent of the static magnetic field. Consequently, the predicted photon-magnon coupling is sensitive to the applied magnetic field and thus can be used in magnetically tuneable metamaterials based on simultaneously negative permittivity and permeability achievable thanks to the YIG layer. We also suggest that the predicted photon-magnon coupling may find applications in microwave quantum information systems.

  3. Rigorous numerical modeling of scattering-type scanning near-field optical microscopy and spectroscopy

    Science.gov (United States)

    Chen, Xinzhong; Lo, Chiu Fan Bowen; Zheng, William; Hu, Hai; Dai, Qing; Liu, Mengkun

    2017-11-01

    Over the last decade, scattering-type scanning near-field optical microscopy and spectroscopy have been widely used in nano-photonics and material research due to their fine spatial resolution and broad spectral range. A number of simplified analytical models have been proposed to quantitatively understand the tip-scattered near-field signal. However, a rigorous interpretation of the experimental results is still lacking at this stage. Numerical modeling, on the other hand, is mostly done by simulating the local electric field slightly above the sample surface, which only qualitatively represents the near-field signal rendered by the tip-sample interaction. In this work, we performed a more comprehensive numerical simulation which is based on realistic experimental parameters and signal extraction procedures. By directly comparing to the experiments as well as other simulation efforts, our methods offer a more accurate quantitative description of the near-field signal, paving the way for future studies of complex systems at the nanoscale.
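
    One concrete piece of the "signal extraction procedure" mentioned above is lock-in demodulation of the tip-scattered signal at harmonics of the tapping frequency. The sketch below demodulates a synthetic, made-up nonlinear near-field response; it only shows the harmonic-extraction step and is not the full electromagnetic simulation of the paper.

```python
import numpy as np

def demodulate(signal, t, omega, n):
    """Extract the n-th harmonic amplitude of `signal` at tapping frequency omega."""
    ref_c = np.cos(n * omega * t)
    ref_s = np.sin(n * omega * t)
    # Lock-in style projection onto the n-th harmonic.
    a = 2.0 * np.mean(signal * ref_c)
    b = 2.0 * np.mean(signal * ref_s)
    return np.hypot(a, b)

# Synthetic tip height h(t) and an invented nonlinear near-field response s(h).
omega = 2 * np.pi * 250e3                # 250 kHz tapping (illustrative)
t = np.linspace(0, 40 / 250e3, 20000)    # 40 tapping cycles
h = 60e-9 + 50e-9 * np.cos(omega * t)    # tip height above sample (m)
s = 1.0 / (h + 20e-9)                    # placeholder nonlinear distance dependence

for n in (1, 2, 3, 4):
    print(f"S_{n} = {demodulate(s, t, omega, n):.3e}")
```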

  4. Rigorous constraints on the matrix elements of the energy–momentum tensor

    Directory of Open Access Journals (Sweden)

    Peter Lowdon

    2017-11-01

    The structure of the matrix elements of the energy–momentum tensor plays an important role in determining the properties of the form factors A(q²), B(q²) and C(q²) which appear in the Lorentz covariant decomposition of the matrix elements. In this paper we apply a rigorous frame-independent distributional-matching approach to the matrix elements of the Poincaré generators in order to derive constraints on these form factors as q→0. In contrast to the literature, we explicitly demonstrate that the vanishing of the anomalous gravitomagnetic moment B(0) and the condition A(0) = 1 are independent of one another, and that these constraints are not related to the specific properties or conservation of the individual Poincaré generators themselves, but are in fact a consequence of the physical on-shell requirement of the states in the matrix elements and the manner in which these states transform under Poincaré transformations.

  5. Rigorous, robust and systematic: Qualitative research and its contribution to burn care. An integrative review.

    Science.gov (United States)

    Kornhaber, Rachel Anne; de Jong, A E E; McLean, L

    2015-12-01

    Qualitative methods are progressively being implemented by researchers for exploration within healthcare. However, there has been a longstanding and wide-ranging debate concerning the relative merits of qualitative research within the health care literature. This integrative review aimed to examine the contribution of qualitative research to burns care and subsequent rehabilitation. Studies were identified using an electronic search strategy using the databases PubMed, Cumulative Index of Nursing and Allied Health Literature (CINAHL), Excerpta Medica database (EMBASE) and Scopus for peer-reviewed primary research published in English between 2009 and April 2014, using Whittemore and Knafl's integrative review method as a guide for analysis. From the 298 papers identified, 26 research papers met the inclusion criteria. Across all studies there was an average of 22 participants involved in each study, with a range of 6-53 participants, conducted across 12 nations and focussed on burns prevention, paediatric burns, appropriate acquisition and delivery of burns care, pain and psychosocial implications of burns trauma. Careful and rigorous application of qualitative methodologies promotes and enriches the development of burns knowledge. In particular, the key elements in the qualitative methodological process and its publication are critical in disseminating credible and methodologically sound qualitative research. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.

  6. Rigorous analysis of image force barrier lowering in bounded geometries: application to semiconducting nanowires

    International Nuclear Information System (INIS)

    Calahorra, Yonatan; Mendels, Dan; Epstein, Ariel

    2014-01-01

    Bounded geometries introduce a fundamental problem in calculating the image force barrier lowering of metal-wrapped semiconductor systems. In bounded geometries, the derivation of the barrier lowering requires calculating the reference energy of the system, when the charge is at the geometry center. In the following, we formulate and rigorously solve this problem; this allows combining the image force electrostatic potential with the band diagram of the bounded geometry. The suggested approach is applied to spheres as well as cylinders. Furthermore, although the expressions governing cylindrical systems are complex and can only be evaluated numerically, we present analytical approximations for the solution, which allow easy implementation in calculated band diagrams. The results are further used to calculate the image force barrier lowering of metal-wrapped cylindrical nanowires; calculations show that although the image force potential is stronger than that of planar systems, taking the complete band-structure into account results in a weaker effect of barrier lowering. Moreover, when considering small diameter nanowires, we find that the electrostatic effects of the image force exceed the barrier region, and influence the electronic properties of the nanowire core. This study is of interest to the nanowire community, and in particular for the analysis of nanowire I−V measurements where wrapped or omega-shaped metallic contacts are used.
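
    For context, the familiar planar Schottky image-force barrier lowering, which the paper generalizes to bounded (sphere and cylinder) geometries, is ΔΦ = sqrt(qE / 4πε). The snippet below evaluates only that planar textbook expression for an assumed field and permittivity; it is not the bounded-geometry solution developed in the paper.

```python
import math

Q = 1.602176634e-19      # elementary charge (C)
EPS0 = 8.8541878128e-12  # vacuum permittivity (F/m)

def planar_image_force_lowering(E_field, eps_r):
    """Planar Schottky image-force barrier lowering in eV for field E_field (V/m)."""
    delta_phi_joule = math.sqrt(Q**3 * E_field / (4 * math.pi * EPS0 * eps_r))
    return delta_phi_joule / Q   # convert J -> eV

# Assumed illustrative numbers: 1e7 V/m field, eps_r = 11.7 (silicon-like).
print(f"Delta Phi ≈ {planar_image_force_lowering(1e7, 11.7):.3f} eV")
```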

  7. Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows

    Science.gov (United States)

    Qi, Di; Majda, Andrew J.

    2018-04-01

    Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.

  8. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors

    Directory of Open Access Journals (Sweden)

    Spiros Pagiatakis

    2009-10-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models, are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at −40 °C, −20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.

  9. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors.

    Science.gov (United States)

    El-Diasty, Mohammed; Pagiatakis, Spiros

    2009-01-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at -40 °C, -20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.
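
    A minimal sketch of the first-order Gauss-Markov (AR(1)) error process underlying the models above is given below, assuming an illustrative correlation time and steady-state standard deviation; these are placeholder values, not the ADIS16364 parameters identified in the paper.

```python
import numpy as np

def simulate_gauss_markov(tau, sigma, dt, n, seed=0):
    """First-order Gauss-Markov (AR(1)) sensor-error process.

    x[k+1] = exp(-dt/tau) * x[k] + w[k],  w ~ N(0, sigma^2 * (1 - exp(-2*dt/tau))),
    so the process has steady-state standard deviation sigma.
    """
    rng = np.random.default_rng(seed)
    phi = np.exp(-dt / tau)
    q = sigma**2 * (1.0 - phi**2)        # driving-noise variance
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = phi * x[k - 1] + rng.normal(0.0, np.sqrt(q))
    return x

# Illustrative values only: 100 s correlation time, 0.01 deg/s error level,
# sampled at 100 Hz for ten minutes.
errors = simulate_gauss_markov(tau=100.0, sigma=0.01, dt=0.01, n=60_000)
print("sample std:", errors.std())
```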

  10. Diffraction-based overlay measurement on dedicated mark using rigorous modeling method

    Science.gov (United States)

    Lu, Hailiang; Wang, Fan; Zhang, Qingyun; Chen, Yonghui; Zhou, Chang

    2012-03-01

    Diffraction Based Overlay (DBO) has been widely evaluated by numerous authors, and results show that DBO can provide better performance than Imaging Based Overlay (IBO). However, DBO has its own problems. As is well known, modeling-based DBO (mDBO) faces challenges of low measurement sensitivity and crosstalk between various structure parameters, which may result in poor accuracy and precision. Meanwhile, the main obstacle encountered by empirical DBO (eDBO) is that a few pads must be employed to gain sufficient information on overlay-induced diffraction signature variations, which consumes more wafer space and measuring time. Also, eDBO may suffer from mark profile asymmetry caused by processes. In this paper, we propose an alternative DBO technology that employs a dedicated overlay mark and takes a rigorous modeling approach. This technology needs only two or three pads for each direction, which is economical and time saving. While reducing the overlay measurement error induced by mark profile asymmetry, this technology is expected to be as accurate and precise as scatterometry technologies.

  11. Estimation of the convergence order of rigorous coupled-wave analysis for OCD metrology

    Science.gov (United States)

    Ma, Yuan; Liu, Shiyuan; Chen, Xiuguo; Zhang, Chuanwei

    2011-12-01

    In most cases of optical critical dimension (OCD) metrology, when applying rigorous coupled-wave analysis (RCWA) to optical modeling, a high order of Fourier harmonics is usually set to guarantee the convergence of the final results. However, the total number of floating point operations grows dramatically as the truncation order increases. Therefore, it is critical to choose an appropriate order to obtain high computational efficiency without losing much accuracy. In this paper, the convergence order associated with the structural and optical parameters has been estimated through simulation. The results indicate that the convergence order is linear with the period of the sample when the other parameters are fixed, both for planar diffraction and conical diffraction. The illumination wavelength also affects the convergence of the final result. Further investigation of the ratio of illumination wavelength to period shows that the convergence order decreases as the ratio grows, and when the ratio is fixed, the convergence order varies only slightly, especially in a specific range of wavelengths. This characteristic can be applied to estimate the optimum convergence order for given samples and thereby obtain high computational efficiency.
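
    The convergence-order scan described above can be sketched as a simple loop that raises the truncation order until the quantity of interest stops changing. The sketch is illustrative only: solve_rcwa stands for whatever RCWA solver is available and is not a function from the paper.

    def converged_order(solve_rcwa, orders, tol=1e-3):
        """Return the smallest truncation order whose result changes by less than
        a relative tolerance tol when the next order in the sequence is used.
        solve_rcwa(order) is a user-supplied callable (hypothetical placeholder)
        returning, e.g., a zeroth-order diffraction efficiency."""
        orders = list(orders)
        prev = solve_rcwa(orders[0])
        for n_prev, n in zip(orders, orders[1:]):
            curr = solve_rcwa(n)
            if abs(curr - prev) <= tol * max(abs(curr), 1e-30):
                return n_prev
            prev = curr
        return orders[-1]

    # Toy stand-in for an RCWA solver: the efficiency settles as the order grows.
    toy_solver = lambda n: 0.42 + 0.3 / (n + 1) ** 2
    print(converged_order(toy_solver, range(1, 61)))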

  12. Conformational distributions and proximity relationships in the rigor complex of actin and myosin subfragment-1.

    Science.gov (United States)

    Nyitrai, M; Hild, G; Lukács, A; Bódis, E; Somogyi, B

    2000-01-28

    Cyclic conformational changes in the myosin head are considered essential for muscle contraction. We hereby show that the extension of the fluorescence resonance energy transfer method described originally by Taylor et al. (Taylor, D. L., Reidler, J., Spudich, J. A., and Stryer, L. (1981) J. Cell Biol. 89, 362-367) allows determination of the position of a labeled point outside the actin filament in supramolecular complexes and also characterization of the conformational heterogeneity of an actin-binding protein while considering donor-acceptor distance distributions. Using this method we analyzed proximity relationships between two labeled points of S1 and the actin filament in the acto-S1 rigor complex. The donor (N-[[(iodoacetyl)amino]ethyl]-5-naphthylamine-1-sulfonate) was attached to either the catalytic domain (Cys-707) or the essential light chain (Cys-177) of S1, whereas the acceptor (5-(iodoacetamido)fluorescein) was attached to the actin filament (Cys-374). In contrast to the narrow positional distribution (assumed as being Gaussian) of Cys-707 (5 +/- 3 A), the positional distribution of Cys-177 was found to be broad (102 +/- 4 A). Such a broad positional distribution of the label on the essential light chain of S1 may be important in accommodating the helically arranged acto-myosin binding relative to the filament axis.

  13. RIGOROUS PHOTOGRAMMETRIC PROCESSING OF CHANG'E-1 AND CHANG'E-2 STEREO IMAGERY FOR LUNAR TOPOGRAPHIC MAPPING

    OpenAIRE

    K. Di; Y. Liu; B. Liu; M. Peng

    2012-01-01

    Chang'E-1(CE-1) and Chang'E-2(CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of CE-1 and CE-2 CCD cameras based on push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D c...

  14. Dynamics of the standard model

    CERN Document Server

    Donoghue, John F; Holstein, Barry R

    2014-01-01

    Describing the fundamental theory of particle physics and its applications, this book provides a detailed account of the Standard Model, focusing on techniques that can produce information about real observed phenomena. The book begins with a pedagogic account of the Standard Model, introducing essential techniques such as effective field theory and path integral methods. It then focuses on the use of the Standard Model in the calculation of physical properties of particles. Rigorous methods are emphasized, but other useful models are also described. This second edition has been updated to include recent theoretical and experimental advances, such as the discovery of the Higgs boson. A new chapter is devoted to the theoretical and experimental understanding of neutrinos, and major advances in CP violation and electroweak physics have been given a modern treatment. This book is valuable to graduate students and researchers in particle physics, nuclear physics and related fields.

  15. MUSiC - Model-independent search for deviations from Standard Model predictions in CMS

    Science.gov (United States)

    Pieta, Holger

    2010-02-01

    We present an approach for a model independent search in CMS. Systematically scanning the data for deviations from the standard model Monte Carlo expectations, such an analysis can help to understand the detector and tune event generators. By minimizing the theoretical bias the analysis is furthermore sensitive to a wide range of models for new physics, including the uncounted number of models not-yet-thought-of. After sorting the events into classes defined by their particle content (leptons, photons, jets and missing transverse energy), a minimally prejudiced scan is performed on a number of distributions. Advanced statistical methods are used to determine the significance of the deviating regions, rigorously taking systematic uncertainties into account. A number of benchmark scenarios, including common models of new physics and possible detector effects, have been used to gauge the power of such a method.

  16. Premise for Standardized Sepsis Models.

    Science.gov (United States)

    Remick, Daniel G; Ayala, Alfred; Chaudry, Irshad; Coopersmith, Craig M; Deutschman, Clifford; Hellman, Judith; Moldawer, Lyle; Osuchowski, Marcin

    2018-06-05

    Sepsis morbidity and mortality exact a toll on patients and contribute significantly to healthcare costs. Preclinical models of sepsis have been used to study disease pathogenesis and test new therapies, but divergent outcomes have been observed with the same treatment even when using the same sepsis model. Other disorders such as diabetes, cancer, malaria, obesity and cardiovascular diseases have used standardized, preclinical models that allow laboratories to compare results. Standardized models accelerate the pace of research and such models have been used to test new therapies or changes in treatment guidelines. The National Institutes of Health (NIH) mandated that investigators increase data reproducibility and the rigor of scientific experiments and has also issued research funding announcements about the development and refinement of standardized models. Our premise is that refinement and standardization of preclinical sepsis models may accelerate the development and testing of potential therapeutics for human sepsis, as has been the case with preclinical models for other disorders. As a first step towards creating standardized models, we suggest 1) standardizing the technical standards of the widely used cecal ligation and puncture model and 2) creating a list of appropriate organ injury and immune dysfunction parameters. Standardized sepsis models could enhance reproducibility and allow comparison of results between laboratories and may accelerate our understanding of the pathogenesis of sepsis.

  17. Moisture content measurement in paddy

    Science.gov (United States)

    Klomklao, P.; Kuntinugunetanon, S.; Wongkokua, W.

    2017-09-01

    Moisture content is an important quantity for agricultural products, especially paddy. In principle, moisture content can be measured by a gravimetric method, which is a direct method. However, the gravimetric method is time-consuming. There are indirect methods such as resistance and capacitance methods. In this work, we developed an indirect method based on a 555 integrated-circuit timer. The moisture sensor consisted of capacitive parallel plates exploiting the dielectric property of the moisture. The instrument generated an output frequency that depended on the capacitance of the sensor. We fitted a linear relation between the period and the moisture content. The measurement results have a standard uncertainty of 1.23 % of the moisture content in the range of 14 % to 20 %.
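
    A minimal sketch of such a period-to-moisture calibration follows; the calibration data, the linear form, and the helper name predict_moisture are illustrative assumptions, not values from the record.

    import numpy as np

    # Hypothetical calibration data: 555-timer output period (microseconds)
    # measured for paddy samples whose moisture content (%) is known from the
    # gravimetric (direct) method.
    period_us = np.array([112.0, 118.5, 124.0, 130.2, 136.8, 142.1])
    moisture_pct = np.array([14.0, 15.2, 16.4, 17.6, 18.8, 20.0])

    # Least-squares straight line: moisture = a * period + b
    a, b = np.polyfit(period_us, moisture_pct, deg=1)

    def predict_moisture(period):
        """Convert a measured timer period (microseconds) to moisture content (%)."""
        return a * period + b

    print(f"moisture at a period of 127 us: {predict_moisture(127.0):.1f} %")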

  18. ['Gold standard', not 'golden standard']

    NARCIS (Netherlands)

    Claassen, J.A.H.R.

    2005-01-01

    In medical literature, both 'gold standard' and 'golden standard' are employed to describe a reference test used for comparison with a novel method. The term 'gold standard' in its current sense in medical research was coined by Rudd in 1979, in reference to the monetary gold standard. In the same

  19. Robust Trypsin Coating on Electrospun Polymer Nanofibers in Rigorous Conditions and Its Uses for Protein Digestion

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Hye-Kyung; Kim, Byoung Chan; Jun, Seung-Hyun; Chang, Mun Seock; Lopez-Ferrer, Daniel; Smith, Richard D.; Gu, Man Bock; Lee, Sang-Won; Kim, Beom S.; Kim, Jungbae

    2010-12-15

    An efficient protein digestion in proteomic analysis requires the stabilization of proteases such as trypsin. In the present work, trypsin was stabilized in the form of an enzyme coating on electrospun polymer nanofibers (EC-TR), which crosslinks additional trypsin molecules onto covalently attached trypsin (CA-TR). EC-TR showed better stability than CA-TR under rigorous conditions, such as high temperatures of 40 °C and 50 °C, the presence of organic co-solvents, and various pH values. For example, the half-lives of CA-TR and EC-TR were 0.24 and 163.20 hours at 40 °C, respectively. The improved stability of EC-TR can be explained by covalent linkages on the surface of trypsin molecules, which effectively inhibit the denaturation, autolysis, and leaching of trypsin. Protein digestion was performed at 40 °C using both CA-TR and EC-TR to digest a model protein, enolase. EC-TR showed better performance and stability than CA-TR, maintaining good enolase digestion performance through repeated use over a period of one week. Under the same conditions, CA-TR showed poor performance from the beginning and could not be used for digestion at all after a few uses. The enzyme coating approach is anticipated to be successfully employed not only for protein digestion in proteomic analysis, but also in various other fields where poor enzyme stability presently hampers the practical applications of enzymes.

  20. Rigorous construction and Hadamard property of the Unruh state in Schwarzschild spacetime

    International Nuclear Information System (INIS)

    Dappiaggi, Claudio; Pinamonti, Nicola

    2009-07-01

    The discovery of the radiation properties of black holes prompted the search for a natural candidate quantum ground state for a massless scalar field theory on Schwarzschild spacetime, here considered in the Eddington-Finkelstein representation. Among the several available proposals in the literature, an important physical role is played by the so-called Unruh state which is supposed to be appropriate to capture the physics of a black hole formed by spherically symmetric collapsing matter. Within this respect, we shall consider a massless Klein-Gordon field and we shall rigorously and globally construct such state, that is on the algebra of Weyl observables localised in the union of the static external region, the future event horizon and the non-static black hole region. Eventually, out of a careful use of microlocal techniques, we prove that the built state fulfils, where defined, the so-called Hadamard condition; hence, it is perturbatively stable, in other words realizing the natural candidate with which one could study purely quantum phenomena such as the role of the back reaction of Hawking's radiation. From a geometrical point of view, we shall make a profitable use of a bulk-to-boundary reconstruction technique which carefully exploits the Killing horizon structure as well as the conformal asymptotic behaviour of the underlying background. From an analytical point of view, our tools will range from Hoermander's theorem on propagation of singularities, results on the role of passive states, and a detailed use of the recently discovered peeling behaviour of the solutions of the wave equation in Schwarzschild spacetime. (orig.)

  1. Rigor mortis at the myocardium investigated by post-mortem magnetic resonance imaging.

    Science.gov (United States)

    Bonzon, Jérôme; Schön, Corinna A; Schwendener, Nicole; Zech, Wolf-Dieter; Kara, Levent; Persson, Anders; Jackowski, Christian

    2015-12-01

    Post-mortem cardiac MR exams present with different contraction appearances of the left ventricle in cardiac short axis images. It was hypothesized that the grade of post-mortem contraction may be related to the post-mortem interval (PMI) or cause of death and a phenomenon caused by internal rigor mortis that may give further insights in the circumstances of death. The cardiac contraction grade was investigated in 71 post-mortem cardiac MR exams (mean age at death 52 y, range 12-89 y; 48 males, 23 females). In cardiac short axis images the left ventricular lumen volume as well as the left ventricular myocardial volume were assessed by manual segmentation. The quotient of both (LVQ) represents the grade of myocardial contraction. LVQ was correlated to the PMI, sex, age, cardiac weight, body mass and height, cause of death and pericardial tamponade when present. In cardiac causes of death a separate correlation was investigated for acute myocardial infarction cases and arrhythmic deaths. LVQ values ranged from 1.99 (maximum dilatation) to 42.91 (maximum contraction) with a mean of 15.13. LVQ decreased slightly with increasing PMI, however without significant correlation. Pericardial tamponade positively correlated with higher LVQ values. Variables such as sex, age, body mass and height, cardiac weight and cause of death did not correlate with LVQ values. There was no difference in LVQ values for myocardial infarction without tamponade and arrhythmic deaths. Based on the observation in our investigated cases, the phenomenon of post-mortem myocardial contraction cannot be explained by the influence of the investigated variables, except for pericardial tamponade cases. Further research addressing post-mortem myocardial contraction has to focus on other, less obvious factors, which may influence the early post-mortem phase too. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. A Generic Model for Relative Adjustment Between Optical Sensors Using Rigorous Orbit Mechanics

    Directory of Open Access Journals (Sweden)

    B. Islam

    2008-06-01

    Full Text Available The classical calibration or space resection is the fundamental task in photogrammetry. The lack of sufficient knowledge of interior and exterior orientation parameters leads to unreliable results in the photogrammetric process. One of the earliest approaches used in photogrammetry was the plumb line calibration method. This method is suitable for recovering the radial and decentering lens distortion coefficients, while the remaining interior (focal length and principal point coordinates) and exterior orientation parameters have to be determined by a complementary method. As the lens distortion remains very small, it is not considered among the interior orientation parameters in the present rigorous sensor model. There are several other available methods based on the photogrammetric collinearity equations, which consider the determination of exterior orientation parameters with no mention of the simultaneous determination of interior orientation parameters. Normal space resection methods solve the problem using control points whose coordinates are known in both the image and object reference systems. The non-linearity of the model, the problems of point location in digital images, and the identification of the maximum number of GPS-measured control points are the main drawbacks of the classical approaches. This paper addresses a mathematical model based on the fundamental assumption of collinearity of three points of two along-track stereo imagery sensors and an independent object point. Assuming this condition, it is possible to extract the exterior orientation (EO) parameters for a long strip and a single image together, with and without using control points. Moreover, after extracting the EO parameters, the accuracy of the satellite data products is compared with and without the use of control points.

  3. Rigorous Performance Evaluation of Smartphone GNSS/IMU Sensors for ITS Applications

    Directory of Open Access Journals (Sweden)

    Vassilis Gikas

    2016-08-01

    Full Text Available With the rapid growth in smartphone technologies and improvement in their navigation sensors, an increasing amount of location information is now available, opening the road to the provision of new Intelligent Transportation System (ITS) services. Current smartphone devices embody miniaturized Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU) and other sensors capable of providing user position, velocity and attitude. However, it is hard to characterize their actual positioning and navigation performance capabilities due to the disparate sensor and software technologies adopted among manufacturers and the high influence of environmental conditions, and therefore, a unified certification process is missing. This paper presents the analysis results obtained from the assessment of two modern smartphones regarding their positioning accuracy (i.e., precision and trueness) capabilities (i.e., potential and limitations) based on a practical but rigorous methodological approach. Our investigation relies on the results of several vehicle tracking (i.e., cruising and maneuvering) tests realized through comparing smartphone obtained trajectories and kinematic parameters to those derived using a high-end GNSS/IMU system and advanced filtering techniques. Performance testing is undertaken for the HTC One S (Android) and iPhone 5s (iOS). Our findings indicate that the deviation of the smartphone locations from ground truth (trueness) deteriorates by a factor of two in obscured environments compared to those derived in open sky conditions. Moreover, it appears that iPhone 5s produces relatively smaller and less dispersed error values compared to those computed for HTC One S. Also, the navigation solution of the HTC One S appears to adapt faster to changes in environmental conditions, suggesting a somewhat different data filtering approach for the iPhone 5s. Testing the accuracy of the accelerometer and gyroscope sensors for a number of

  4. Analysis of specular resonance in dielectric bispheres using rigorous and geometrical-optics theories.

    Science.gov (United States)

    Miyazaki, Hideki T; Miyazaki, Hiroshi; Miyano, Kenjiro

    2003-09-01

    We have recently identified the resonant scattering from dielectric bispheres in the specular direction, which has long been known as the specular resonance, to be a type of rainbow (a caustic) and a general phenomenon for bispheres. We discuss the details of the specular resonance on the basis of systematic calculations. In addition to the rigorous theory, which precisely describes the scattering even in the resonance regime, the ray-tracing method, which gives the scattering in the geometrical-optics limit, is used. Specular resonance is explicitly defined as strong scattering in the direction of the specular reflection from the symmetrical axis of the bisphere whose intensity exceeds that of the scattering from noninteracting bispheres. Then the range of parameters for computing a particular specular resonance is specified. This resonance becomes prominent in a wide range of refractive indices (from 1.2 to 2.2) in a wide range of size parameters (from five to infinity) and for an arbitrarily polarized light incident within an angle of 40 degrees to the symmetrical axis. This particular scattering can stay evident even when the spheres are not in contact or the sizes of the spheres are different. Thus specular resonance is a common and robust phenomenon in dielectric bispheres. Furthermore, we demonstrate that various characteristic features in the scattering from bispheres can be explained successfully by using intuitive and simple representations. Most of the significant scatterings other than the specular resonance are also understandable as caustics in geometrical-optics theory. The specular resonance becomes striking at the smallest size parameter among these caustics because its optical trajectory is composed of only the refractions at the surfaces and has an exceptionally large intensity. However, some characteristics are not accounted for by geometrical optics. In particular, the oscillatory behaviors of their scattering intensity are well described by

  5. Unforgivable Sinners? Epistemological and Psychological Naturalism in Husserl’s Philosophy as a Rigorous Science

    Directory of Open Access Journals (Sweden)

    Andrea Sebastiano Staiti

    2012-01-01

    Full Text Available In this paper I present and assess Husserl's arguments against epistemological and psychological naturalism in his essay Philosophy as a Rigorous Science. I show that his critique is directed against positions that are generally more extreme than most currently debated variants of naturalism. However, Husserl has interesting thoughts to contribute to philosophy today. First, he shows that there is an important connection between naturalism in epistemology (which in his view amounts to the position that the validity of logic can be reduced to the validity of natural laws of thinking) and naturalism in psychology (which in his view amounts to the position that all psychic occurrences are merely parallel accompaniments of physiological occurrences). Second, he shows that a strong version of epistemological naturalism is self-undermining and fails to translate the cogency of logic into psychological terms. Third, and most importantly for current debates, he attacks Cartesianism as a form of psychological naturalism because of its construal of the psyche as a substance. Against this position, Husserl defends the necessity of formulating new epistemic aims for the investigation of consciousness. He contends that what is most interesting about consciousness is not its empirical fact but its transcendental function of granting cognitive access to all kinds of objects (both empirical and ideal). The study of this function requires a specific method (eidetics) that cannot be conflated with empirical methods. I conclude that Husserl's analyses offer much-needed insight into the fabric of consciousness and compelling arguments against unwarranted metaphysical speculations about the relationship between mind and body.

  6. A Development of Advanced Rigorous 2 Step System for the High Resolution Residual Dose Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Hyun; Kim, Jong Woo; Kim, Jea Hyun; Lee, Jae Yong; Shin, Chang Ho [Hanyang Univ., Seoul (Korea, Republic of); Kim, Song Hyun [Kyoto University, Sennan (Japan)

    2016-10-15

    Activation problems such as residual radiation are currently among the important issues in nuclear facilities. Activated devices and structures can emit residual radiation; therefore, activation should be properly analyzed to plan the design, operation, and decontamination of nuclear facilities. For activation calculations, the Rigorous 2-Step (R2S) method uses the following strategy: (1) a particle transport calculation is performed for the object geometry to obtain particle spectra and total fluxes; (2) the inventories of each cell are calculated using the flux information according to the irradiation and decay history; (3) the residual gamma distribution is evaluated by a transport code, if needed. This scheme is based on cell-wise calculation of the geometry used. In this method, the particle spectra and total fluxes are obtained by mesh tally for the activation calculation, which is useful for reducing the effects of flux gradients. Nevertheless, several limitations are known: first, high relative errors of the spectra when many meshes are used; second, flux information that differs from the void spectrum in the mesh tally. To calculate high-resolution residual dose, several methods have been developed, such as R2Smesh and MCR2S with unstructured mesh. The R2Smesh method yields better efficiency for obtaining neutron spectra by using fine/coarse meshes, and MCR2S with unstructured mesh can effectively separate the void spectrum. In this study, the AR2S system was developed to combine the features of these mesh-based R2S methods. To verify the AR2S system, a simple activation problem was evaluated and compared with the R2S method using the same division; the results agree within 0.83 %. It is therefore expected that the AR2S system can properly estimate activation problems.

  7. Rigorous construction and Hadamard property of the Unruh state in Schwarzschild spacetime

    Energy Technology Data Exchange (ETDEWEB)

    Dappiaggi, Claudio; Pinamonti, Nicola [Hamburg Univ. (Germany). II. Inst. fuer Theoretische Physik]; Moretti, Valter [Trento Univ., Povo (Italy). Dipt. di Matematica; Istituto Nazionale di Fisica Nucleare, Povo (Italy); Istituto Nazionale di Alta Matematica 'F. Severi', GNFM, Sesto Fiorentino (Italy)]

    2009-07-15

    The discovery of the radiation properties of black holes prompted the search for a natural candidate quantum ground state for a massless scalar field theory on Schwarzschild spacetime, here considered in the Eddington-Finkelstein representation. Among the several available proposals in the literature, an important physical role is played by the so-called Unruh state which is supposed to be appropriate to capture the physics of a black hole formed by spherically symmetric collapsing matter. Within this respect, we shall consider a massless Klein-Gordon field and we shall rigorously and globally construct such state, that is on the algebra of Weyl observables localised in the union of the static external region, the future event horizon and the non-static black hole region. Eventually, out of a careful use of microlocal techniques, we prove that the built state fulfils, where defined, the so-called Hadamard condition; hence, it is perturbatively stable, in other words realizing the natural candidate with which one could study purely quantum phenomena such as the role of the back reaction of Hawking's radiation. From a geometrical point of view, we shall make a profitable use of a bulk-to-boundary reconstruction technique which carefully exploits the Killing horizon structure as well as the conformal asymptotic behaviour of the underlying background. From an analytical point of view, our tools will range from Hoermander's theorem on propagation of singularities, results on the role of passive states, and a detailed use of the recently discovered peeling behaviour of the solutions of the wave equation in Schwarzschild spacetime. (orig.)

  8. Atlantic salmon skin and fillet color changes effected by perimortem handling stress, rigor mortis, and ice storage.

    Science.gov (United States)

    Erikson, U; Misimi, E

    2008-03-01

    The changes in skin and fillet color of anesthetized and exhausted Atlantic salmon were determined immediately after killing, during rigor mortis, and after ice storage for 7 d. Skin color (CIE L*, a*, b*, and related values) was determined by a Minolta Chroma Meter. Roche SalmoFan Lineal and Roche Color Card values were determined by a computer vision method and a sensory panel. Before color assessment, the stress levels of the 2 fish groups were characterized in terms of white muscle parameters (pH, rigor mortis, and core temperature). The results showed that perimortem handling stress initially significantly affected several color parameters of skin and fillets. Significant transient fillet color changes also occurred in the prerigor phase and during the development of rigor mortis. Our results suggested that fillet color was affected by postmortem glycolysis (pH drop, particularly in anesthetized fillets), then by onset and development of rigor mortis. The color change patterns during storage were different for the 2 groups of fish. The computer vision method was considered suitable for automated (online) quality control and grading of salmonid fillets according to color.

  9. Rigorous Line-Based Transformation Model Using the Generalized Point Strategy for the Rectification of High Resolution Satellite Imagery

    Directory of Open Access Journals (Sweden)

    Kun Hu

    2016-09-01

    Full Text Available High precision geometric rectification of High Resolution Satellite Imagery (HRSI) is the basis of digital mapping and Three-Dimensional (3D) modeling. Taking advantage of line features as basic geometric control conditions instead of control points, the Line-Based Transformation Model (LBTM) provides a practical and efficient way of image rectification. It is competent to build the mathematical relationship between image space and the corresponding object space accurately, while it reduces the workloads of ground control and feature recognition dramatically. Based on generalization and the analysis of existing LBTMs, a novel rigorous LBTM is proposed in this paper, which can further eliminate the geometric deformation caused by sensor inclination and terrain variation. This improved nonlinear LBTM is constructed based on a generalized point strategy and resolved by least squares overall adjustment. Geo-positioning accuracy experiments with IKONOS, GeoEye-1 and ZiYuan-3 satellite imagery are performed to compare rigorous LBTM with other relevant line-based and point-based transformation models. Both theoretic analysis and experimental results demonstrate that the rigorous LBTM is more accurate and reliable without adding extra ground control. The geo-positioning accuracy of satellite imagery rectified by rigorous LBTM can reach about one pixel with eight control lines and can be further improved by optimizing the horizontal and vertical distribution of control lines.

  10. Useful, Used, and Peer Approved: The Importance of Rigor and Accessibility in Postsecondary Research and Evaluation. WISCAPE Viewpoints

    Science.gov (United States)

    Vaade, Elizabeth; McCready, Bo

    2012-01-01

    Traditionally, researchers, policymakers, and practitioners have perceived a tension between rigor and accessibility in quantitative research and evaluation in postsecondary education. However, this study indicates that both producers and consumers of these studies value high-quality work and clear findings that can reach multiple audiences. The…

  11. Decommissioning standards

    International Nuclear Information System (INIS)

    Crofford, W.N.

    1980-01-01

    EPA has agreed to establish a series of environmental standards for the safe disposal of radioactive waste through participation in the Interagency Review Group on Nuclear Waste Management (IRG). One of the standards required under the IRG is the standard for decommissioning of radioactive contaminated sites, facilities, and materials. This standard is to be proposed by December 1980 and promulgated by December 1981. Several considerations are important in establishing these standards. This study includes discussions of some of these considerations and attempts to evaluate their relative importance. Items covered include: the form of the standards, timing for decommissioning, occupational radiation protection, costs and financial provisions. 4 refs

  12. Glycolysis and ATP degradation in cod ( Gadus morhua ) at subzero temperatures in relation to thaw rigor

    DEFF Research Database (Denmark)

    Cappeln, Gertrud; Jessen, Flemming

    2001-01-01

    Glycolysis was shown to occur during freezing of cod, as indicated by a decrease in glycogen and an increase in lactate. In addition, the ATP content decreased during freezing. Synthesis of ATP was measured as degradation of glycogen. During storage at -9 and -12 °C it was found that degradation of ATP

  13. Rigor or Restriction: Examining Close Reading with High School English Language Learners

    Science.gov (United States)

    Thomason, Betty; Brown, Clara Lee; Ward, Natalia

    2017-01-01

    English language learners (ELLs) are the fastest growing student subgroup in the United States, and public schools have the challenging task of teaching ELLs both English language and academic content. In spite of the attention given to improving outcomes for ELLs, the achievement gap between ELLs and native English speakers persists, especially…

  14. Accounting standards

    NARCIS (Netherlands)

    Stellinga, B.; Mügge, D.

    2014-01-01

    The European and global regulation of accounting standards have witnessed remarkable changes over the past twenty years. In the early 1990s, EU accounting practices were fragmented along national lines and US accounting standards were the de facto global standards. Since 2005, all EU listed

  15. Standardization Documents

    Science.gov (United States)

    2011-08-01

    Specifications and Standards; Guide Specifications; CIDs; and NGSs. Federal Specifications; Commercial ... national or international standardization document developed by a private sector association, organization, or technical society that plans ... Maintain lessons learned. Examples: guidance for application of a technology; lists of options. Defense Handbook.

  16. Evaluation of Acid Digestion Procedures to Estimate Mineral Contents in Materials from Animal Trials

    Directory of Open Access Journals (Sweden)

    M. N. N. Palma

    2015-11-01

    Full Text Available Rigorously standardized laboratory protocols are essential for meaningful comparison of data from multiple sites. Considering that interactions of minerals with organic matrices may vary depending on the nature of the material, each material may place particular demands on the digestion procedure. Acid digestion procedures were evaluated using different nitric to perchloric acid ratios and one- or two-step digestion to estimate the concentration of calcium, phosphorus, magnesium, and zinc in samples of carcass, bone, excreta, concentrate, forage, and feces. Six procedures were evaluated: ratios of nitric to perchloric acid of 2:1, 3:1, and 4:1 v/v in a one- or two-step digestion. There were no direct or interaction effects (p>0.01) of the nitric to perchloric acid ratio or the number of digestion steps on magnesium and zinc contents. Calcium and phosphorus contents presented a significant (p<0.01) effect of the acid ratio in bone samples, whereas the ratio did not affect (p>0.01) calcium or phosphorus contents in carcass, excreta, concentrate, forage, and feces. The number of digestion steps did not affect mineral content (p>0.01). Estimation of the concentration of calcium, phosphorus, magnesium, and zinc in carcass, excreta, concentrate, forage, and feces samples can be performed using a digestion solution of nitric to perchloric acid 4:1 v/v in a one-step digestion. However, bone samples demand a stronger digestion solution for analysis of mineral contents, represented by an increased proportion of perchloric acid; a digestion solution of nitric to perchloric acid 2:1 v/v in a one-step digestion is recommended.

  17. Rigorous study of the gap equation for an inhomogeneous superconducting state near T/sub c/

    International Nuclear Information System (INIS)

    Hu, C.

    1975-01-01

    A rigorous analytic study of the self-consistent gap equation (symbolically Δ = F_T[Δ]), for an inhomogeneous superconducting state, is presented in the Bogoliubov formulation. The gap function Δ(r) is taken to simulate a planar normal-superconducting phase boundary: Δ(r) = Δ∞ tanh(αΔ∞z/v_F) Θ(z), where Δ∞(T) is the equilibrium gap, v_F is the Fermi velocity, and Θ(z) is a unit step function. First a special space integral of the gap equation, proportional to the integral of (F_T − Δ)(dΔ/dz) dz from z = 0+ to ∞, is evaluated essentially exactly, except for a nonperturbative WKBJ approximation used in solving the Bogoliubov-de Gennes equations. It is then expanded near the transition temperature T_c in powers of Δ∞ ∝ (1 − T/T_c)^(1/2), demonstrating an exact cancellation of a subseries of 'anomalous-order' terms. The leading surviving term is found to agree in order, but not in magnitude, with the Ginzburg-Landau-Gor'kov (GLG) approximation. The discrepancy is found to be linked to the slope discontinuity in our chosen Δ. A contour-integral technique in a complex-energy plane is then devised to evaluate the local value of F_T − Δ exactly. Our result reveals that near T_c this method can reproduce the GLG result essentially everywhere, except within about a BCS coherence length ξ(T) from a singularity in Δ, where F_T − Δ can have a singular contribution with an 'anomalous' local magnitude not expected from the GLG approach. This anomalous term precisely accounts for the discrepancy found in the special integral of the gap equation mentioned above, and likely explains the ultimate origin of the anomalous terms found in the free energy of an isolated vortex line by Cleary

  18. Ar-Ar_Redux: rigorous error propagation of 40Ar/39Ar data, including covariances

    Science.gov (United States)

    Vermeesch, P.

    2015-12-01

    Rigorous data reduction and error propagation algorithms are needed to realise Earthtime's objective to improve the interlaboratory accuracy of 40Ar/39Ar dating to better than 1% and thereby facilitate the comparison and combination of the K-Ar and U-Pb chronometers. Ar-Ar_Redux is a new data reduction protocol and software program for 40Ar/39Ar geochronology which takes into account two previously underappreciated aspects of the method: 1. 40Ar/39Ar measurements are compositional data. In its simplest form, the 40Ar/39Ar age equation can be written as: t = log(1 + J[40Ar/39Ar - 298.56 × 36Ar/39Ar])/λ = log(1 + JR)/λ, where λ is the 40K decay constant and J is the irradiation parameter. The age t does not depend on the absolute abundances of the three argon isotopes but only on their relative ratios. Thus, the 36Ar, 39Ar and 40Ar abundances can be normalised to unity and plotted on a ternary diagram or 'simplex'. Argon isotopic data are therefore subject to the peculiar mathematics of 'compositional data', sensu Aitchison (1986, The Statistical Analysis of Compositional Data, Chapman & Hall). 2. Correlated errors are pervasive throughout the 40Ar/39Ar method. Current data reduction protocols for 40Ar/39Ar geochronology propagate the age uncertainty as follows: σ²(t) = [J² σ²(R) + R² σ²(J)] / [λ² (1 + RJ)²], which implies zero covariance between R and J. In reality, however, significant error correlations are found in every step of the 40Ar/39Ar data acquisition and processing, in both single- and multi-collector instruments, during blank, interference and decay corrections, age calculation, etc. Ar-Ar_Redux revisits every aspect of the 40Ar/39Ar method by casting the raw mass spectrometer data into a contingency table of logratios, which automatically keeps track of all covariances in a compositional context. Application of the method to real data reveals strong correlations (r² of up to 0.9) between age measurements within a single irradiation batch. Properly taking
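
    As a worked illustration of the two formulas quoted in this record, the sketch below evaluates the simple age equation and its conventional (covariance-free) error propagation. The decay constant value and all numerical inputs are illustrative assumptions, not data or code from Ar-Ar_Redux.

    import math

    LAMBDA_40K = 5.543e-10  # total 40K decay constant in 1/yr (a commonly used value)

    def arar_age(R, J, lam=LAMBDA_40K):
        """t = log(1 + J*R) / lambda, with R the radiogenic 40Ar/39Ar ratio."""
        return math.log(1.0 + J * R) / lam

    def arar_age_sigma(R, J, sR, sJ, lam=LAMBDA_40K):
        """Conventional error propagation, assuming zero covariance between R and J."""
        dtdR = J / (lam * (1.0 + J * R))
        dtdJ = R / (lam * (1.0 + J * R))
        return math.sqrt((dtdR * sR) ** 2 + (dtdJ * sJ) ** 2)

    # Illustrative numbers only
    R, J = 12.3, 0.0071
    print(f"age = {arar_age(R, J) / 1e6:.2f} Ma "
          f"+/- {arar_age_sigma(R, J, sR=0.05, sJ=3e-5) / 1e6:.2f} Ma")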

  19. Peridynamics as a rigorous coarse-graining of atomistics for multiscale materials design

    International Nuclear Information System (INIS)

    Lehoucq, Richard B.; Aidun, John Bahram; Silling, Stewart Andrew; Sears, Mark P.; Kamm, James R.; Parks, Michael L.

    2010-01-01

    This report summarizes activities undertaken during FY08-FY10 for the LDRD Peridynamics as a Rigorous Coarse-Graining of Atomistics for Multiscale Materials Design. The goal of our project was to develop a coarse-graining of finite temperature molecular dynamics (MD) that successfully transitions from statistical mechanics to continuum mechanics. Our coarse-graining overcomes the intrinsic limitation of coupling atomistics with classical continuum mechanics via the FEM (finite element method), SPH (smoothed particle hydrodynamics), or MPM (material point method); namely, that classical continuum mechanics assumes a local force interaction that is incompatible with the nonlocal force model of atomistic methods. Therefore FEM, SPH, and MPM inherit this limitation. This seemingly innocuous dichotomy has far-reaching consequences; for example, classical continuum mechanics cannot resolve the short-wavelength behavior associated with atomistics. Other consequences include spurious forces, invalid phonon dispersion relationships, and irreconcilable descriptions/treatments of temperature. We propose a statistically based coarse-graining of atomistics via peridynamics and so develop a first-of-its-kind mesoscopic capability to enable consistent, thermodynamically sound, atomistic-to-continuum (AtC) multiscale material simulation. Peridynamics (PD) is a microcontinuum theory that assumes nonlocal forces for describing long-range material interaction. The force interactions occurring at finite distances are naturally accounted for in PD. Moreover, PD's nonlocal force model is entirely consistent with those used by atomistic methods, in stark contrast to classical continuum mechanics. Hence, PD can be employed for mesoscopic phenomena that are beyond the realms of classical continuum mechanics and

  20. Updated Design Standards and Guidance from the What Works Clearinghouse: Regression Discontinuity Designs and Cluster Designs

    Science.gov (United States)

    Cole, Russell; Deke, John; Seftor, Neil

    2016-01-01

    The What Works Clearinghouse (WWC) maintains design standards to identify rigorous, internally valid education research. As education researchers advance new methodologies, the WWC must revise its standards to include an assessment of the new designs. Recently, the WWC has revised standards for two emerging study designs: regression discontinuity…

  1. Leading the Transition from the Alternate Assessment Based on Modified Achievement Standards to the General Assessment

    Science.gov (United States)

    Lazarus, Sheryl S.; Rieke, Rebekah

    2013-01-01

    Schools are facing many changes in the ways that teaching, learning, and assessment take place. Most states are moving from individual state standards to the new Common Core State Standards, which will be fewer, higher, and more rigorous than most current state standards. As the next generation of assessments used for accountability are rolled…

  2. Effects of cut flower shape and content of sugars in flower organs on the longevity of vase life in standard type carnation [Dianthus] and relationships between the longevity and temperature and solar radiation during the growing period

    International Nuclear Information System (INIS)

    Miura, Y.; Ozawa, Y.; Takahashi, M.; Igarashi, D.; Inoue, T.; Uematsu, H.; Imai, K.; Matsuyama, A.; Soga, A.; Yoshida, M.

    2002-01-01

    We obtained carnation cut flowers 'Fransisco' from a greenhouse on the first Monday of every month from February to June, and investigated the relationship between vase life and the content of sugars in the cut flowers. Correlations between the longevity of vase life and temperatures in the greenhouse and the amount of solar radiation during the growing period were also investigated. 1. Fresh weight and stem diameter of the cut flowers were highest in February and March, and decreased from April to June. Mean vase life was shortest in March (4.5 days) and longest in May (5.9 days). 2. Fructose and glucose contents in petals were highest in May (10.0 and 6.5 mg per 100 mg DW) and lowest in March (6.0 and 4.8 mg per 100 mg DW). 3. The vase life of carnation was highly correlated with day-time or night-time mean temperatures in the 20 days before harvest, and it was estimated that the optimum growing temperature for long vase life of the cut flower was around 22 °C in day-time and 14 °C in night-time. (author)

  3. Content of system design descriptions

    International Nuclear Information System (INIS)

    1998-10-01

    A System Design Description (SDD) describes the requirements and features of a system. This standard provides guidance on the expected technical content of SDDs. The need for such a standard was recognized during efforts to develop SDDs for safety systems at DOE Hazard Category 2 nonreactor nuclear facilities. Existing guidance related to the corresponding documents in other industries is generally not suitable to meet the needs of DOE nuclear facilities. Across the DOE complex, different contractors have guidance documents, but they vary widely from site to site. While such guidance documents are valuable, no single guidance document has all the attributes that DOE considers important, including a reasonable degree of consistency or standardization. This standard is a consolidation of the best of the existing guidance. This standard has been developed with a technical content and level of detail intended to be most applicable to safety systems at DOE Hazard Category 2 nonreactor nuclear facilities. Notwithstanding that primary intent, this standard is recommended for other systems at such facilities, especially those that are important to achieving the programmatic mission of the facility. In addition, application of this standard should be considered for systems at other facilities, including non-nuclear facilities, on the basis that SDDs may be beneficial and cost-effective

  4. A Rigorous, Compositional, and Extensible Framework for Dynamic Fault Tree Analysis

    NARCIS (Netherlands)

    Boudali, H.; Sandhu, R.; Crouzen, Pepijn; Stoelinga, Mariëlle Ida Antoinette

    Fault trees (FTs) are among the most prominent formalisms for reliability analysis of technical systems. Dynamic fault trees (DFTs) extend FTs with support for expressing dynamic dependencies among components. The standard analysis vehicle for DFTs is state-based, and treats the model as a CTMC, a continuous-time

  5. Critical Thinking and Formative Assessments: Increasing the Rigor in Your Classroom

    Science.gov (United States)

    Moore, Betsy; Stanley, Todd

    2010-01-01

    Develop your students' critical thinking skills and prepare them to perform competitively in the classroom, on state tests, and beyond. In this book, Moore and Stanley show you how to effectively instruct your students to think on higher levels, and how to assess their progress. As states move toward common achievement standards, teachers have…

  6. Effect of pre-rigor stretch and various constant temperatures on the rate of post-mortem pH fall, rigor mortis and some quality traits of excised porcine biceps femoris muscle strips.

    Science.gov (United States)

    Vada-Kovács, M

    1996-01-01

    Porcine biceps femoris strips of 10 cm original length were stretched by 50% and fixed within 1 hr post mortem, then subjected to temperatures of 4, 15 or 36 °C until they attained their ultimate pH. Unrestrained control muscle strips, which were left to shorten freely, were similarly treated. Post-mortem metabolism (pH, R-value) and shortening were recorded; thereafter ultimate meat quality traits (pH, lightness, extraction and swelling of myofibrils) were determined. The rate of pH fall at 36 °C, as well as ATP breakdown at 36 and 4 °C, was significantly reduced by pre-rigor stretch. The relationship between R-value and pH indicated cold shortening at 4 °C. Myofibrils isolated from pre-rigor stretched muscle strips kept at 36 °C showed the most severe reduction of hydration capacity, while paleness remained below extreme values. However, pre-rigor stretched myofibrils - when stored at 4 °C - proved to be superior to shortened ones in their extractability and swelling.

  7. Comparison of 3D two-point Dixon and standard 2D dual-echo breath-hold sequences for detection and quantification of fat content in renal angiomyolipoma

    International Nuclear Information System (INIS)

    Rosenkrantz, Andrew B.; Raj, Sean; Babb, James S.; Chandarana, Hersh

    2012-01-01

    Purpose: To assess the utility of a 3D two-point Dixon sequence with water-fat decomposition for quantification of the fat content of renal angiomyolipoma (AML). Methods: 84 patients underwent renal MRI including a 2D in-and-opposed-phase (IP and OP) sequence and a 3D two-point Dixon sequence that generates four image sets [IP, OP, water-only (WO), and fat-only (FO)] within one breath-hold. Two radiologists reviewed 2D and 3D images during separate sessions to identify fat-containing renal masses measuring at least 1 cm. For identified lesions subsequently confirmed to represent AML, ROIs were placed at matching locations on 2D and 3D images and used to calculate the 2D and 3D signal-intensity (SI) index [(SI_IP − SI_OP)/SI_IP] and the 3D fat fraction (FF) [SI_FO/(SI_FO + SI_WO)]. The 2D and 3D SI index were compared with the 3D FF using Pearson correlation coefficients. Results: 41 AMLs were identified in 6 patients. While all were identified using the 3D sequence, 39 were identified using the 2D sequence, with the remaining 2 AMLs retrospectively visible on 2D images but measuring under 1 cm. Among 32 AMLs with a 3D FF of over 50%, both the 2D and 3D SI index showed a statistically significant inverse correlation with the 3D FF (2D SI index: r = −0.63, p = 0.0010; 3D SI index: r = −0.97, p ...). The 3D FF, unlike the SI index, is not limited by ambiguity of water or fat dominance. This may assist clinical management of AML given evidence that fat content predicts embolization response.
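
    The two ratios used in this record reduce to simple arithmetic on ROI mean signal intensities; a minimal sketch follows, with invented ROI values for illustration (the function names are not from the paper).

    def si_index(si_in_phase, si_opposed_phase):
        """Signal-intensity index: (SI_IP - SI_OP) / SI_IP."""
        return (si_in_phase - si_opposed_phase) / si_in_phase

    def fat_fraction(si_fat_only, si_water_only):
        """Two-point Dixon fat fraction: SI_FO / (SI_FO + SI_WO)."""
        return si_fat_only / (si_fat_only + si_water_only)

    # Invented ROI means for a fat-rich angiomyolipoma
    print(f"SI index:     {si_index(420.0, 150.0):.2f}")
    print(f"fat fraction: {fat_fraction(310.0, 95.0):.2f}")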

  8. An exploration of administrators' perceptions of elementary science: A case study of the role of science in two elementary schools based on the interactions of administrators with colleagues, science content and state standards

    Science.gov (United States)

    Brogdon, Lori-Anne Stelmark

    This research is a case study on the perceptions and attitudes of administrators in the area of elementary science and how their responses reflect agreement or dissonance with the perceptions of elementary teachers on the subject of science within the same district. The study used Likert-type surveys and interviews from both administrators and teachers on five key areas: 1) Attitudes towards science and teaching 2) Attitudes towards teaching science 3) Attitudes towards administrators 4) Time teaching science and 5) Attitudes about policy and standards. Survey data was analyzed within and across areas to identify similarity and difference within each group. The medians from the administrative and teacher surveys were then crossed referenced through the use of a Mann Whitney test to identify areas of similarity. Interview data was coded around three major themes: 1) Standards 2) Classroom Instruction and 3) Conversations. The findings show that even though administrators' perceptions favor the inclusion of science in the elementary classroom, both administrators and teachers in this study reported limited involvement from, and conversation with, each other on the topic of science education. Heavy reliance by the administrators was placed on the use of consultants to provide professional development in the area of science instruction and to review the use of state standards, resulting in limited conversation between administrators and teachers about science. Teachers reported a heavy reliance upon their colleagues in the area of science instruction and curriculum planning. In addition, both administrators and teachers reported a greater focus on math and English for classroom instruction. Findings in this research support implications that more focus should be placed on the role of administrators in the implementation of science instruction. Administrators can play a crucial role in the success of science programs at the building, district and state levels

  9. Multimedia content classification metrics for content adaptation

    OpenAIRE

    Fernandes, Rui; Andrade, M.T.

    2015-01-01

    Multimedia content consumption is very popular nowadays. However, not every content can be consumed in its original format: the combination of content, transport and access networks, consumption device and usage environment characteristics may all pose restrictions to that purpose. One way to provide the best possible quality to the user is to adapt the content according to these restrictions as well as user preferences. This adaptation stage can be best executed if knowledge about the conten...

  10. Multimedia content classification metrics for content adaptation

    OpenAIRE

    Fernandes, Rui; Andrade, M.T.

    2016-01-01

    Multimedia content consumption is very popular nowadays. However, not every content can be consumed in its original format: the combination of content, transport and access networks, consumption device and usage environment characteristics may all pose restrictions to that purpose. One way to provide the best possible quality to the user is to adapt the content according to these restrictions as well as user preferences. This adaptation stage can be best executed if knowledge about the conten...

  11. Communications standards

    CERN Document Server

    Stokes, A V

    1986-01-01

    Communications Standards deals with the standardization of computer communication networks. This book examines the types of local area networks (LANs) that have been developed and looks at some of the relevant protocols in more detail. The work of Project 802 is briefly discussed, along with a protocol which has developed from one of the LAN standards and is now a de facto standard in one particular area, namely the Manufacturing Automation Protocol (MAP). Factors that affect the usage of networks, such as network management and security, are also considered. This book is divided into three se

  12. Reconsideration of the sequence of rigor mortis through postmortem changes in adenosine nucleotides and lactic acid in different rat muscles.

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Iwadate, K; Nakajima, M

    1996-10-25

    We examined the changes in adenosine triphosphate (ATP), lactic acid, adenosine diphosphate (ADP) and adenosine monophosphate (AMP) in five different rat muscles after death. Rigor mortis has been thought to occur simultaneously in dead muscles and hence to start in small muscles sooner than in large muscles. In this study we found that the rate of decrease in ATP was significantly different in each muscle. The greatest drop in ATP was observed in the masseter muscle. These findings contradict the conventional theory of rigor mortis. Similarly, the rates of change in ADP and lactic acid, which are thought to be related to the consumption or production of ATP, were different in each muscle. However, the rate of change of AMP was the same in each muscle.

  13. Authorization of Animal Experiments Is Based on Confidence Rather than Evidence of Scientific Rigor

    Science.gov (United States)

    Nathues, Christina; Würbel, Hanno

    2016-01-01

    animal experiments are lacking important information about experimental conduct that determines the scientific validity of the findings, which may be critical for the weight attributed to the benefit of the research in the harm–benefit analysis. Similar to manuscripts getting accepted for publication despite poor reporting of measures against bias, applications for animal experiments may often be approved based on implicit confidence rather than explicit evidence of scientific rigor. Our findings shed serious doubt on the current authorization procedure for animal experiments, as well as the peer-review process for scientific publications, which in the long run may undermine the credibility of research. Developing existing authorization procedures that are already in place in many countries towards a preregistration system for animal research is one promising way to reform the system. This would not only benefit the scientific validity of findings from animal experiments but also help to avoid unnecessary harm to animals for inconclusive research. PMID:27911892

  14. Addressing Three Common Myths about the Next Generation Science Standards

    Science.gov (United States)

    Huff, Kenneth L.

    2016-01-01

    Although the "Next Generation Science Standards" (NGSS Lead States 2013) were released over two years ago, misconceptions about what they are--and are not--persist. The "NGSS" provide for consistent science education opportunities for all students--regardless of demographics--with a level of rigor expected in every location and…

  15. The rigorous stochastic matrix multiplication scheme for the calculations of reduced equilibrium density matrices of open multilevel quantum systems

    International Nuclear Information System (INIS)

    Chen, Xin

    2014-01-01

    Understanding the roles of the temporal and spatial structures of quantum functional noise in open multilevel quantum molecular systems attracts a lot of theoretical interest. I want to establish a rigorous and general framework for functional quantum noises from the constructive and computational perspectives, i.e., how to generate the random trajectories to reproduce the kernel and path ordering of the influence functional with effective Monte Carlo methods for arbitrary spectral densities. This construction approach aims to unify the existing stochastic models to rigorously describe the temporal and spatial structure of Gaussian quantum noises. In this paper, I review the Euclidean imaginary time influence functional and propose the stochastic matrix multiplication scheme to calculate reduced equilibrium density matrices (REDM). In addition, I review and discuss the Feynman-Vernon influence functional according to the Gaussian quadratic integral, particularly its imaginary part which is critical to the rigorous description of the quantum detailed balance. As a result, I establish the conditions under which the influence functional can be interpreted as the average of exponential functional operator over real-valued Gaussian processes for open multilevel quantum systems. I also show the difference between the local and nonlocal phonons within this framework. With the stochastic matrix multiplication scheme, I compare the normalized REDM with the Boltzmann equilibrium distribution for open multilevel quantum systems

  16. Effects of well-boat transportation on the muscle pH and onset of rigor mortis in Atlantic salmon.

    Science.gov (United States)

    Gatica, M C; Monti, G; Gallo, C; Knowles, T G; Warriss, P D

    2008-07-26

    During the transport of salmon (Salmo salar) in a well-boat, 10 fish were sampled at each of six stages: in cages after crowding at the farm (stage 1), in the well-boat after loading (stage 2), in the well-boat after eight hours transport and before unloading (stage 3), in the resting cages immediately after finishing unloading (stage 4), after 24 hours resting in cages (stage 5), and in the processing plant after pumping from the resting cages (stage 6). The water in the well-boat was at ambient temperature with recirculation to the sea. At each stage the fish were stunned percussively and bled by gill cutting. Immediately after death, and then every three hours for 18 hours, the muscle pH and rigor index of the fish were measured. At successive stages the initial muscle pH of the fish decreased, except for a slight gain in stage 5, after they had been rested for 24 hours. The lowest initial muscle pH was observed at stage 6. The fishes' rigor index showed that rigor developed more quickly at each successive stage, except for a slight decrease in rate at stage 5, attributable to the recovery of muscle reserves.
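    The rigor index reported in this and the later well-boat record is commonly measured with the drooping-tail method of Bito et al.; assuming that convention (the abstract does not spell out the protocol), with L_0 the vertical drop of the tail measured immediately after death and L_t the drop at time t, the index is

        \mathrm{RI}(t) = \frac{L_{0} - L_{t}}{L_{0}} \times 100\,\%,

    so RI is near 0% while the fish is still flexible and approaches 100% at full rigor.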

  17. Strategies for Integrating Content from the USGCRP Climate and Health Assessment into the K-12 Classroom

    Science.gov (United States)

    Haine, D. B.

    2016-12-01

    That the physical environment shapes the lives and behaviors of people is certainly not news, but communicating the impact of a changing climate on human health and predicting the trajectory of these changes is an active area of study in public health. From air quality concerns to extreme heat to shifts in the range of disease vectors, there are many opportunities to make connections between Earth's changing climate and human health. While many science teachers understand that addressing human health impacts as a result of a changing climate can provide needed relevance, it can be challenging for teachers to do so given an already packed curriculum. This session will share instructional strategies for integrating content from the USGCRP Climate and Health Assessment (CHA) by enhancing, rather than displacing content related to climate science. This presentation will feature a data interpretation activity developed in collaboration with geoscientists at the University of North Carolina's Gillings School of Public Health to convey the connection between air quality, climate change and human health. This classroom activity invites students to read excerpts from the CHA and interpret data presented in the scientific literature, thus promoting scientific literacy. In summarizing this activity, I will highlight strategies for effectively engaging geoscientists in developing scientifically rigorous, STEM-focused educational activities that are aligned to state and national science standards and also address the realities of the science classroom. Collaborating with geoscientists and translating their research into classroom activities is an approach that becomes more pertinent with the advent of the Next Generation Science Standards (NGSS). Thus, the USGCRP Climate and Health Assessment represents an opportunity to cultivate science literacy among K-12 students while providing relevant learning experiences that promote integration of science and engineering practices as

  18. 7 CFR 51.2561 - Average moisture content.

    Science.gov (United States)

    2010-01-01

    7 CFR Section 51.2561 (United States Standards for Grades of Shelled Pistachio Nuts): Average moisture content. (a) Determining average moisture content of the lot is not a requirement of the grades, except when...

  19. Training Standardization

    International Nuclear Information System (INIS)

    Agnihotri, Newal

    2003-01-01

    The article describes the benefits of and required process and recommendations for implementing the standardization of training in the nuclear power industry in the United States and abroad. Current Information and Communication Technologies (ICT) enable training standardization in the nuclear power industry. The delivery of training through the Internet, Intranet and video over IP will facilitate this standardization and bring multiple benefits to the nuclear power industry worldwide. As the amount of available qualified and experienced professionals decreases because of retirements and fewer nuclear engineering institutions, standardized training will help increase the number of available professionals in the industry. Technology will make it possible to use the experience of retired professionals who may be interested in working part-time from a remote location. Well-planned standardized training will prevent a fragmented approach among utilities, and it will save the industry considerable resources in the long run. It will also ensure cost-effective and safe nuclear power plant operation

  20. Seizing the Future: How Ohio's Career-Technical Education Programs Fuse Academic Rigor and Real-World Experiences to Prepare Students for College and Careers

    Science.gov (United States)

    Guarino, Heidi; Yoder, Shaun

    2015-01-01

    "Seizing the Future: How Ohio's Career and Technical Education Programs Fuse Academic Rigor and Real-World Experiences to Prepare Students for College and Work," demonstrates Ohio's progress in developing strong policies for career and technical education (CTE) programs to promote rigor, including college- and career-ready graduation…

  1. State Standards and State Assessment Systems: A Guide to Alignment. Series on Standards and Assessments.

    Science.gov (United States)

    La Marca, Paul M.; Redfield, Doris; Winter, Phoebe C.

    Alignment of content standards, performance standards, and assessments is crucial. This guide contains information to assist states and districts in aligning their assessment systems to their content and performance standards. It includes a review of current literature, both published and fugitive. The research is woven together with a few basic…

  2. Effluent standards

    Energy Technology Data Exchange (ETDEWEB)

    Geisler, G C [Pennsylvania State University (United States)

    1974-07-01

    At the conference there was a considerable interest in research reactor standards and effluent standards in particular. On the program, this is demonstrated by the panel discussion on effluents, the paper on argon 41 measured by Sims, and the summary paper by Ringle, et al. on the activities of ANS research reactor standards committee (ANS-15). As a result, a meeting was organized to discuss the proposed ANS standard on research reactor effluents (15.9). This was held on Tuesday evening, was attended by members of the ANS-15 committee who were present at the conference, participants in the panel discussion on the subject, and others interested. Out of this meeting came a number of excellent suggestions for changes which will increase the utility of the standard, and a strong recommendation that the effluent standard (15.9) be combined with the effluent monitoring standard. It is expected that these suggestions and recommendations will be incorporated and a revised draft issued for comment early this summer. (author)

  3. Nuclear standards

    International Nuclear Information System (INIS)

    Fichtner, N.; Becker, K.; Bashir, M.

    1981-01-01

    This compilation of all nuclear standards available to the authors by mid 1980 represents the third, carefully revised edition of a catalogue which was first published in 1975 as EUR 5362. In this third edition several changes have been made. The title has been condensed. The information has again been carefully updated, covering all changes regarding status, withdrawal of old standards, new projects, amendments, revisions, splitting of standards into several parts, combination of several standards into one, etc., as available to the authors by mid 1980. The speed with which information travels varies and requires in many cases rather tedious and cumbersome inquiries. Also, the classification scheme has been revised with the goal of better adjustment to changing situations and priorities. Whenever it turned out to be difficult to attribute a standard to a single subject category, multiple listings in all relevant categories have been made. As in previous editions, within the subcategories the standards are arranged by organization (in Category 2.1 by country) alphabetically and in ascending numerical order. It covers all relevant areas of power reactors, the fuel cycle, radiation protection, etc., from the basic laws and governmental regulations, regulatory guides, etc., all the way to voluntary industrial standards and codes of practice. (orig./HP)

  4. Towards an international address standard

    CSIR Research Space (South Africa)

    Coetzee, S

    2008-02-01

    Full Text Available Standards compliance results in a better user experience. It allows for the separation of concerns: HTML for content, Cascading Style Sheets (CSS) for presentation and JavaScript for dynamic behaviour. Standards-compliant documents are also ... and cascading style sheets through CSS (CSS n.d.), whilst the JavaScript specification has been standardised by Ecma International (another standards organisation for information and communication systems), in the form of EcmaScript (Ecma...

  5. Informing Estimates of Program Effects for Studies of Mathematics Professional Development Using Teacher Content Knowledge Outcomes.

    Science.gov (United States)

    Phelps, Geoffrey; Kelcey, Benjamin; Jones, Nathan; Liu, Shuangshuang

    2016-10-03

    Mathematics professional development is widely offered, typically with the goal of improving teachers' content knowledge, the quality of teaching, and ultimately students' achievement. Recently, new assessments focused on mathematical knowledge for teaching (MKT) have been developed to assist in the evaluation and improvement of mathematics professional development. This study presents empirical estimates of average program change in MKT and its variation with the goal of supporting the design of experimental trials that are adequately powered to detect a specified program effect. The study drew on a large database representing five different assessments of MKT and collectively 326 professional development programs and 9,365 teachers. Results from cross-classified hierarchical growth models found that standardized average change estimates across the five assessments ranged from a low of 0.16 standard deviations (SDs) to a high of 0.26 SDs. Power analyses using the estimated pre- and posttest change estimates indicated that hundreds of teachers are needed to detect changes in knowledge at the lower end of the distribution. Even studies powered to detect effects at the higher end of the distribution will require substantial resources to conduct rigorous experimental trials. Empirical benchmarks that describe average program change and its variation provide a useful preliminary resource for interpreting the relative magnitude of effect sizes associated with professional development programs and for designing adequately powered trials. © The Author(s) 2016.
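    As a rough illustration of why "hundreds of teachers" are needed at the low end of the reported change distribution, the sketch below applies a plain two-sided z-approximation for a single-group pre/post design with standardized change d. It ignores the clustering and cross-classification the study actually models, so real requirements would be larger; the function name and defaults are illustrative assumptions, not part of the study.

        from scipy.stats import norm

        def n_required(d, alpha=0.05, power=0.80):
            """Approximate number of teachers needed to detect a standardized
            pre/post change d with a two-sided one-sample z-test."""
            z_alpha = norm.ppf(1 - alpha / 2)   # critical value for the test
            z_beta = norm.ppf(power)            # quantile for the desired power
            return ((z_alpha + z_beta) / d) ** 2

        for d in (0.16, 0.26):   # low and high ends of the reported estimates
            print(f"d = {d:.2f}: about {n_required(d):.0f} teachers")
        # d = 0.16 requires roughly 300 teachers, d = 0.26 roughly 120,
        # before inflating the sample for clustering within programs.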

  6. A Review of LIDAR Radiometric Processing: From Ad Hoc Intensity Correction to Rigorous Radiometric Calibration

    Directory of Open Access Journals (Sweden)

    Alireza G. Kashani

    2015-11-01

    Full Text Available In addition to precise 3D coordinates, most light detection and ranging (LIDAR) systems also record “intensity”, loosely defined as the strength of the backscattered echo for each measured point. To date, LIDAR intensity data have proven beneficial in a wide range of applications because they are related to surface parameters, such as reflectance. While numerous procedures have been introduced in the scientific literature, and even commercial software, to enhance the utility of intensity data through a variety of “normalization”, “correction”, or “calibration” techniques, the current situation is complicated by a lack of standardization, as well as confusing, inconsistent use of terminology. In this paper, we first provide an overview of basic principles of LIDAR intensity measurements and applications utilizing intensity information from terrestrial, airborne topographic, and airborne bathymetric LIDAR. Next, we review effective parameters on intensity measurements, basic theory, and current intensity processing methods. We define terminology adopted from the most commonly-used conventions based on a review of current literature. Finally, we identify topics in need of further research. Ultimately, the presented information helps lay the foundation for future standards and specifications for LIDAR radiometric calibration.

  7. A Review of LIDAR Radiometric Processing: From Ad Hoc Intensity Correction to Rigorous Radiometric Calibration.

    Science.gov (United States)

    Kashani, Alireza G; Olsen, Michael J; Parrish, Christopher E; Wilson, Nicholas

    2015-11-06

    In addition to precise 3D coordinates, most light detection and ranging (LIDAR) systems also record "intensity", loosely defined as the strength of the backscattered echo for each measured point. To date, LIDAR intensity data have proven beneficial in a wide range of applications because they are related to surface parameters, such as reflectance. While numerous procedures have been introduced in the scientific literature, and even commercial software, to enhance the utility of intensity data through a variety of "normalization", "correction", or "calibration" techniques, the current situation is complicated by a lack of standardization, as well as confusing, inconsistent use of terminology. In this paper, we first provide an overview of basic principles of LIDAR intensity measurements and applications utilizing intensity information from terrestrial, airborne topographic, and airborne bathymetric LIDAR. Next, we review effective parameters on intensity measurements, basic theory, and current intensity processing methods. We define terminology adopted from the most commonly-used conventions based on a review of current literature. Finally, we identify topics in need of further research. Ultimately, the presented information helps lay the foundation for future standards and specifications for LIDAR radiometric calibration.
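    For orientation, one of the ad hoc corrections this review contrasts with rigorous radiometric calibration scales raw intensity by a range power law and a Lambertian cosine of the incidence angle. The sketch below is that generic correction only; the reference range r_ref and exponent k are illustrative assumptions, not values from the paper.

        import numpy as np

        def normalize_intensity(i_raw, rng, incidence_deg, r_ref=20.0, k=2.0):
            """Ad hoc LIDAR intensity normalization: compensate range falloff
            (~1/R^k) and incidence angle (Lambertian cosine factor).
            rng is the measured range in metres, incidence_deg the angle
            between the beam and the surface normal in degrees."""
            theta = np.radians(incidence_deg)
            return i_raw * (rng / r_ref) ** k / np.cos(theta)

        # Example: a return recorded at 35 m range and 30 degrees incidence
        print(normalize_intensity(1200.0, 35.0, 30.0))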

  8. Type B Package Radioactive Material Contents Compliance

    International Nuclear Information System (INIS)

    HENSEL, STEVE

    2006-01-01

    Implementation of packaging and transportation requirements can be subdivided into three categories; contents compliance, packaging closure, and transportation or logistical compliance. This paper addresses the area of contents compliance within the context of regulations, DOE Orders, and appropriate standards. Common practices and current pitfalls are also discussed

  9. Quantification of Fluorine Content in AFFF Concentrates

    Science.gov (United States)

    2017-09-29

    for MilSpec compliance. Fluorocarbon surfactants are the most active components in these concentrates, and analysis of the fluorine content in the ... physical requirements for AFFF concentrates includes a total fluorine content determination and a requirement for subsequent evaluations of this AFFF ... the standard for fluorine content as well as the reference for chemical shift. For preparation of an NMR solution, it is important that the TFE

  10. MATE standardization

    Science.gov (United States)

    Farmer, R. E.

    1982-11-01

    The MATE (Modular Automatic Test Equipment) program was developed to combat the proliferation of unique, expensive ATE within the Air Force. MATE incorporates a standard management approach and a standard architecture designed to implement a cradle-to-grave approach to the acquisition of ATE and to significantly reduce the life cycle cost of weapons systems support. These standards are detailed in the MATE Guides. The MATE Guides assist both the Air Force and Industry in implementing the MATE concept, and provide the necessary tools and guidance required for successful acquisition of ATE. The guides also provide the necessary specifications for industry to build MATE-qualifiable equipment. The MATE architecture provides standards for all key interfaces of an ATE system. The MATE approach to the acquisition and management of ATE has been jointly endorsed by the commanders of Air Force Systems Command and Air Force Logistics Command as the way of doing business in the future.

  11. Rigorous Photogrammetric Processing of CHANG'E-1 and CHANG'E-2 Stereo Imagery for Lunar Topographic Mapping

    Science.gov (United States)

    Di, K.; Liu, Y.; Liu, B.; Peng, M.

    2012-07-01

    Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of CE-1 and CE-2 CCD cameras based on push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinate of a ground point in lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points are different from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining EOPs by correcting the attitude angle bias, 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high precision DEM (Digital Elevation Model) and DOM (Digital Ortho Map) are automatically generated.

  12. Muscle pH, rigor mortis and blood variables in Atlantic salmon transported in two types of well-boat.

    Science.gov (United States)

    Gatica, M C; Monti, G E; Knowles, T G; Gallo, C B

    2010-01-09

    Two systems for transporting live salmon (Salmo salar) were compared in terms of their effects on blood variables, muscle pH and rigor index: an 'open system' well-boat with recirculated sea water at 13.5 degrees C and a stocking density of 107 kg/m3 during an eight-hour journey, and a 'closed system' well-boat with water chilled from 16.7 to 2.1 degrees C and a stocking density of 243.7 kg/m3 during a seven-hour journey. Groups of 10 fish were sampled at each of four stages: in cages at the farm, in the well-boat after loading, in the well-boat after the journey and before unloading, and in the processing plant after they were pumped from the resting cages. At each sampling, the fish were stunned and bled by gill cutting. Blood samples were taken to measure lactate, osmolality, chloride, sodium, cortisol and glucose, and their muscle pH and rigor index were measured at death and three hours later. In the open system well-boat, the initial muscle pH of the fish decreased at each successive stage, and at the final stage they had a significantly lower initial muscle pH and more rapid onset of rigor than the fish transported on the closed system well-boat. At the final stage all the blood variables except glucose were significantly affected in the fish transported on both types of well-boat.

  13. RIGOROUS PHOTOGRAMMETRIC PROCESSING OF CHANG'E-1 AND CHANG'E-2 STEREO IMAGERY FOR LUNAR TOPOGRAPHIC MAPPING

    Directory of Open Access Journals (Sweden)

    K. Di

    2012-07-01

    Full Text Available Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of CE-1 and CE-2 CCD cameras based on push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinate of a ground point in lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points are different from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining EOPs by correcting the attitude angle bias, 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high precision DEM (Digital Elevation Model) and DOM (Digital Ortho Map) are automatically generated.
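    The back-projection residuals quoted in both records are simply the differences between measured image coordinates and the coordinates predicted by the sensor model. The sketch below uses a generic frame (pinhole) collinearity model rather than the CE-1/CE-2 push-broom model, and all numbers are toy values.

        import numpy as np

        def back_project(ground_pt, rotation, centre, focal):
            """Project a ground point into image coordinates with the
            collinearity equations of a simple frame camera."""
            p = rotation @ (ground_pt - centre)      # point in the camera frame
            return np.array([-focal * p[0] / p[2], -focal * p[1] / p[2]])

        def residual(measured_xy, ground_pt, rotation, centre, focal):
            """Back-projection residual: measured minus predicted image coordinates."""
            return measured_xy - back_project(ground_pt, rotation, centre, focal)

        # Toy example: nadir-looking camera 100 km above the ground point
        R = np.eye(3)
        C = np.array([0.0, 0.0, 100e3])
        X = np.array([50.0, -30.0, 0.0])
        print(residual(np.array([2.01, -1.19]), X, R, C, focal=4000.0))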

  14. Study designs for identifying risk compensation behavior among users of biomedical HIV prevention technologies: balancing methodological rigor and research ethics.

    Science.gov (United States)

    Underhill, Kristen

    2013-10-01

    The growing evidence base for biomedical HIV prevention interventions - such as oral pre-exposure prophylaxis, microbicides, male circumcision, treatment as prevention, and eventually prevention vaccines - has given rise to concerns about the ways in which users of these biomedical products may adjust their HIV risk behaviors based on the perception that they are prevented from infection. Known as risk compensation, this behavioral adjustment draws on the theory of "risk homeostasis," which has previously been applied to phenomena as diverse as Lyme disease vaccination, insurance mandates, and automobile safety. Little rigorous evidence exists to answer risk compensation concerns in the biomedical HIV prevention literature, in part because the field has not systematically evaluated the study designs available for testing these behaviors. The goals of this Commentary are to explain the origins of risk compensation behavior in risk homeostasis theory, to reframe risk compensation as a testable response to the perception of reduced risk, and to assess the methodological rigor and ethical justification of study designs aiming to isolate risk compensation responses. Although the most rigorous methodological designs for assessing risk compensation behavior may be unavailable due to ethical flaws, several strategies can help investigators identify potential risk compensation behavior during Phase II, Phase III, and Phase IV testing of new technologies. Where concerns arise regarding risk compensation behavior, empirical evidence about the incidence, types, and extent of these behavioral changes can illuminate opportunities to better support the users of new HIV prevention strategies. This Commentary concludes by suggesting a new way to conceptualize risk compensation behavior in the HIV prevention context. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Home media server content management

    Science.gov (United States)

    Tokmakoff, Andrew A.; van Vliet, Harry

    2001-07-01

    With the advent of set-top boxes, the convergence of TV (broadcasting) and PC (Internet) is set to enter the home environment. Currently, a great deal of activity is occurring in developing standards (TV-Anytime Forum) and devices (TiVo) for local storage on Home Media Servers (HMS). These devices lie at the heart of convergence of the triad: communications/networks - content/media - computing/software. Besides massive storage capacity and being a communications 'gateway', the home media server is characterised by the ability to handle metadata and software that provides an easy to use on-screen interface and intelligent search/content handling facilities. In this paper, we describe a research prototype HMS that is being developed within the GigaCE project at the Telematica Instituut . Our prototype demonstrates advanced search and retrieval (video browsing), adaptive user profiling and an innovative 3D component of the Electronic Program Guide (EPG) which represents online presence. We discuss the use of MPEG-7 for representing metadata, the use of MPEG-21 working draft standards for content identification, description and rights expression, and the use of HMS peer-to-peer content distribution approaches. Finally, we outline explorative user behaviour experiments that aim to investigate the effectiveness of the prototype HMS during development.

  16. Rigorous lower bounds on the imaginary parts of the scattering amplitudes and the positions of their zeros

    CERN Document Server

    Uchiyama, T

    1974-01-01

    Rigorous lower bounds are derived from axiomatic field theory, by invoking analyticity and unitarity of the S-matrix. The bounds are expressed in terms of the total cross section and the slope parameter, and are found to be compatible with CERN experimental pp scattering data. It is also shown that the calculated lower-bound values imply non-existence of zeros for -t

  17. Digital Content Strategies

    OpenAIRE

    Halbheer, Daniel; Stahl, Florian; Koenigsberg, Oded; Lehmann, Donald R

    2013-01-01

    This paper studies content strategies for online publishers of digital information goods. It examines sampling strategies and compares their performance to paid content and free content strategies. A sampling strategy, where some of the content is offered for free and consumers are charged for access to the rest, is known as a "metered model" in the newspaper industry. We analyze optimal decisions concerning the size of the sample and the price of the paid content when sampling serves the dua...

  18. Fight the power: the limits of empiricism and the costs of positivistic rigor.

    Science.gov (United States)

    Indick, William

    2002-01-01

    A summary of the influence of positivistic philosophy and empiricism on the field of psychology is followed by a critique of the empirical method. The dialectic process is advocated as an alternative method of inquiry. The main advantage of the dialectic method is that it is open to any logical argument, including empirical hypotheses, but unlike empiricism, it does not automatically reject arguments that are not based on observable data. Evolutionary and moral psychology are discussed as examples of important fields of study that could benefit from types of arguments that frequently do not conform to the empirical standards of systematic observation and falsifiability of hypotheses. A dialectic method is shown to be a suitable perspective for those fields of research, because it allows for logical arguments that are not empirical and because it fosters a functionalist perspective, which is indispensable for both evolutionary and moral theories. It is suggested that all psychologists may gain from adopting a dialectic approach, rather than restricting themselves to empirical arguments alone.

  19. Frequency standards

    CERN Document Server

    Riehle, Fritz

    2006-01-01

    Of all measurement units, frequency is the one that may be determined with the highest degree of accuracy. It equally allows precise measurements of other physical and technical quantities, whenever they can be measured in terms of frequency. This volume covers the central methods and techniques relevant for frequency standards developed in physics, electronics, quantum electronics, and statistics. After a review of the basic principles, the book looks at the realisation of commonly used components. It then continues with the description and characterisation of important frequency standards

  20. Relevant Standards

    Indian Academy of Sciences (India)

    X.86: Ethernet over LAPS. Standard in China and India. G.7041: Generic Framing Procedure (GFP). Supports Ethernet as well as other data formats (e.g., Fibre Channel); Protocol of ... IEEE 802.3x for flow control of incoming Ethernet data ...

  1. Achieving Standardization

    DEFF Research Database (Denmark)

    Henningsson, Stefan

    2014-01-01

    International e-Customs is going through a standardization process. Driven by the need to increase control in the trade process to address security challenges stemming from threats of terrorists, diseases, and counterfeit products, and to lower the administrative burdens on traders to stay...

  2. Achieving Standardization

    DEFF Research Database (Denmark)

    Henningsson, Stefan

    2016-01-01

    International e-Customs is going through a standardization process. Driven by the need to increase control in the trade process to address security challenges stemming from threats of terrorists, diseases, and counterfeit products, and to lower the administrative burdens on traders to stay...

  3. Standard Fortran

    International Nuclear Information System (INIS)

    Marshall, N.H.

    1981-01-01

    Because of its vast software investment in Fortran programs, the nuclear community has an inherent interest in the evolution of Fortran. This paper reviews the impact of the new Fortran 77 standard and discusses the projected changes which can be expected in the future

  4. Why so many "rigorous" evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute.

    Science.gov (United States)

    Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene

    2016-04-01

    Many widely-used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences (UCs) of development programs. This seems surprising as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives and the logic of many evaluation designs, even those that are considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies 9 ways in which UCs can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Improved rigorous upper bounds for transport due to passive advection described by simple models of bounded systems

    International Nuclear Information System (INIS)

    Kim, Chang-Bae; Krommes, J.A.

    1988-08-01

    The work of Krommes and Smith on rigorous upper bounds for the turbulent transport of a passively advected scalar [Ann. Phys. 177:246 (1987)] is extended in two directions: (1) For their "reference model," improved upper bounds are obtained by utilizing more sophisticated two-time constraints which include the effects of cross-correlations up to fourth order. Numerical solutions of the model stochastic differential equation are also obtained; they show that the new bounds compare quite favorably with the exact results, even at large Reynolds and Kubo numbers. (2) The theory is extended to take account of a finite spatial autocorrelation length L_c. As a reasonably generic example, the problem of particle transport due to statistically specified stochastic magnetic fields in a collisionless turbulent plasma is revisited. A bound is obtained which reduces for small L_c to the quasilinear limit and for large L_c to the strong turbulence limit, and which provides a reasonable and rigorous interpolation for intermediate values of L_c. 18 refs., 6 figs

  6. Smoothing of Transport Plans with Fixed Marginals and Rigorous Semiclassical Limit of the Hohenberg-Kohn Functional

    Science.gov (United States)

    Cotar, Codina; Friesecke, Gero; Klüppelberg, Claudia

    2018-06-01

    We prove rigorously that the exact N-electron Hohenberg-Kohn density functional converges in the strongly interacting limit to the strictly correlated electrons (SCE) functional, and that the absolute value squared of the associated constrained search wavefunction tends weakly in the sense of probability measures to a minimizer of the multi-marginal optimal transport problem with Coulomb cost associated to the SCE functional. This extends our previous work for N = 2 (Cotar et al. in Commun Pure Appl Math 66:548-599, 2013). The correct limit problem has been derived in the physics literature by Seidl (Phys Rev A 60:4387-4395, 1999) and Seidl, Gori-Giorgi and Savin (Phys Rev A 75:042511 1-12, 2007); in these papers the lack of a rigorous proof was pointed out. We also give a mathematical counterexample to this type of result, by replacing the constraint of given one-body density—an infinite dimensional quadratic expression in the wavefunction—by an infinite-dimensional quadratic expression in the wavefunction and its gradient. Connections with the Lawrentiev phenomenon in the calculus of variations are indicated.
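    For readers outside the field, the strictly correlated electrons functional referred to above is the multi-marginal optimal transport problem with Coulomb cost; in the notation commonly used in this literature,

        V_{ee}^{\mathrm{SCE}}[\rho] \;=\; \min_{\gamma \mapsto \rho} \int_{\mathbb{R}^{3N}} \sum_{1 \le i < j \le N} \frac{1}{|r_i - r_j|} \, d\gamma(r_1, \dots, r_N),

    where the minimum runs over N-point probability measures whose one-body marginals all equal \rho/N. The theorem states that the suitably rescaled Hohenberg-Kohn functional converges to this quantity in the strongly interacting limit.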

  7. The Researchers' View of Scientific Rigor-Survey on the Conduct and Reporting of In Vivo Research.

    Science.gov (United States)

    Reichlin, Thomas S; Vogt, Lucile; Würbel, Hanno

    2016-01-01

    Reproducibility in animal research is alarmingly low, and a lack of scientific rigor has been proposed as a major cause. Systematic reviews found low reporting rates of measures against risks of bias (e.g., randomization, blinding), and a correlation between low reporting rates and overstated treatment effects. Reporting rates of measures against bias are thus used as a proxy measure for scientific rigor, and reporting guidelines (e.g., ARRIVE) have become a major weapon in the fight against risks of bias in animal research. Surprisingly, animal scientists have never been asked about their use of measures against risks of bias and how they report these in publications. Whether poor reporting reflects poor use of such measures, and whether reporting guidelines may effectively reduce risks of bias has therefore remained elusive. To address these questions, we asked in vivo researchers about their use and reporting of measures against risks of bias and examined how self-reports relate to reporting rates obtained through systematic reviews. An online survey was sent out to all registered in vivo researchers in Switzerland (N = 1891) and was complemented by personal interviews with five representative in vivo researchers to facilitate interpretation of the survey results. Return rate was 28% (N = 530), of which 302 participants (16%) returned fully completed questionnaires that were used for further analysis. According to the researchers' self-report, they use measures against risks of bias to a much greater extent than suggested by reporting rates obtained through systematic reviews. However, the researchers' self-reports are likely biased to some extent. Thus, although they claimed to be reporting measures against risks of bias at much lower rates than they claimed to be using these measures, the self-reported reporting rates were considerably higher than reporting rates found by systematic reviews. Furthermore, participants performed rather poorly when asked to

  8. Rigor, Reliability, and Scientific Relevance: Citizen Science Lessons from COASST (Invited)

    Science.gov (United States)

    Parrish, J. K.

    2013-12-01

    Citizen science promises fine grain, broad extent data collected over decadal time scales, with co-benefits including increased scientific literacy and civic engagement. But does it only deliver non-standardized, unverifiable data collected episodically by individuals with little-to-no training? How do you know which projects to trust? What are the attributes of a scientifically sound citizen science project? The Coastal Observation and Seabird Survey Team (COASST) is a 15 year old citizen science project currently involving ~800 participants from northern California north to Kotzebue, Alaska and west to the Commander Islands, Russia. After a single 5-hour training delivered in-community by an expert, volunteers have the knowledge and skill sets to accurately survey a coastal site for beached bird carcasses, which they will be able to identify to species correctly ~85% of the time. Data are collected monthly, and some volunteers remain with the program for years, contributing hundreds, even thousands, of survey hours. COASST trainings, data collection materials, and data entry web portal all reinforce 'evidence first, deduction second,' a maxim that allows volunteers to learn, and gives on-staff experts the ability to independently verify all birds found. COASST data go directly into science, as part of studies as diverse as fishery entanglement, historic native uses of seabirds as food sources, and the impacts of sudden shifts in upwelling; as well as into resource management, as part of decisions on fishing regulations, waterfowl hunting limits, and ESA-listed species management. Like professional science, COASST features a specific sampling design linked to questions of interest, verifiable data, statistical analysis, and peer-reviewed publication. In addition, COASST features before-and-after testing of volunteer knowledge, independent verification of all deductive data, and recruitment and retention strategies linked to geographic community norms. As a result

  9. Injection-salting and cold-smoking of farmed atlantic cod (Gadus morhua L.) and Atlantic salmon (Salmo salar L.) at different stages of Rigor Mortis: effect on physical properties.

    Science.gov (United States)

    Akse, L; Birkeland, S; Tobiassen, T; Joensen, S; Larsen, R

    2008-10-01

    Processing of fish is generally conducted postrigor, but prerigor processing is associated with some potential advantages. The aim of this study was to examine how five processing regimes of cold-smoked cod and salmon conducted at different stages of rigor influenced yield, fillet shrinkage, and gaping. Farmed cod and salmon were filleted, salted by brine injection of 25% NaCl, and smoked for 2 h at different stages of rigor. Filleting and salting prerigor resulted in increased fillet shrinkage and less increase in weight during brine injection, which in turn was correlated to the salt content of the fillet. These effects were more pronounced in cod fillets when compared to salmon. Early processing reduced fillet gaping and fillets were evaluated as having a firmer texture. In a follow-up trial with cod, shrinkage and weight gain during injection were studied as an effect of processing time postmortem. No changes in weight gain were observed for fillets salted during the first 24 h postmortem; however, by delaying the processing 12 h postmortem, the high and rapid shrinking of cod fillets during brine injection was halved.

  10. Publishing and Revising Content

    Science.gov (United States)

    Editors and Webmasters can publish content without going through a workflow. Publishing times and dates can be set, and multiple pages can be published in bulk. Making an edit to published content creates a revision.

  11. The influence of low temperature, type of muscle and electrical stimulation on the course of rigor mortis, ageing and tenderness of beef muscles.

    Science.gov (United States)

    Olsson, U; Hertzman, C; Tornberg, E

    1994-01-01

    The course of rigor mortis, ageing and tenderness have been evaluated for two beef muscles, M. semimembranosus (SM) and M. longissimus dorsi (LD), when entering rigor at constant temperatures in the cold-shortening region (1, 4, 7 and 10°C). The influence of electrical stimulation (ES) was also examined. Post-mortem changes were registered by shortening and isometric tension and by following the decline of pH, ATP and creatine phosphate. The effect of ageing on tenderness was recorded by measuring shear-force (2, 8 and 15 days post mortem) and the sensory properties were assessed 15 days post mortem. It was found that shortening increased with decreasing temperature, resulting in decreased tenderness. Tenderness for LD, but not for SM, was improved by ES at 1 and 4°C, whereas ES did not give rise to any decrease in the degree of shortening during rigor mortis development. This suggests that ES influences tenderization more than it prevents cold-shortening. The samples with a pre-rigor mortis temperature of 1°C could not be tenderized, when stored up to 15 days, whereas this was the case for the muscles entering rigor mortis at the other higher temperatures. The results show that under the conditions used in this study, the course of rigor mortis is more important for the ultimate tenderness than the course of ageing. Copyright © 1994. Published by Elsevier Ltd.

  12. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    Energy Technology Data Exchange (ETDEWEB)

    Ngampitipan, Tritos, E-mail: tritos.ngampitipan@gmail.com [Faculty of Science, Chandrakasem Rajabhat University, Ratchadaphisek Road, Chatuchak, Bangkok 10900 (Thailand); Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Boonserm, Petarpa, E-mail: petarpa.boonserm@gmail.com [Department of Mathematics and Computer Science, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Chatrabhuti, Auttakit, E-mail: dma3ac2@gmail.com [Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Visser, Matt, E-mail: matt.visser@msor.vuw.ac.nz [School of Mathematics, Statistics, and Operations Research, Victoria University of Wellington, PO Box 600, Wellington 6140 (New Zealand)

    2016-06-02

    Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can successfully be generated. The generated black holes are, most commonly, Myers-Perry black holes. In this paper, we will derive the rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with the increase in the energy of emitted particles. However, for high energy, the rigorous bounds decrease with the increase in the energy of emitted particles. When the black holes spin faster, the rigorous bounds decrease. For dimension dependence, the rigorous bounds also decrease with the increase in the number of extra dimensions. Furthermore, in comparison to the approximate transmission probability, the rigorous bound is proven to be useful.

  13. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    International Nuclear Information System (INIS)

    Ngampitipan, Tritos; Boonserm, Petarpa; Chatrabhuti, Auttakit; Visser, Matt

    2016-01-01

    Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can successfully be generated. The generated black holes are, most commonly, Myers-Perry black holes. In this paper, we will derive the rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with the increase in the energy of emitted particles. However, for high energy, the rigorous bounds decrease with the increase in the energy of emitted particles. When the black holes spin faster, the rigorous bounds decrease. For dimension dependence, the rigorous bounds also decrease with the increase in the number of extra dimensions. Furthermore, in comparison to the approximate transmission probability, the rigorous bound is proven to be useful.
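    The rigorous bound invoked in both records is, in the one-dimensional form given by Boonserm and Visser and applied here to the radial equation of the Myers-Perry black hole (the notation below assumes that convention),

        T \;\ge\; \operatorname{sech}^{2}\!\left( \int_{-\infty}^{\infty} \vartheta \, dx \right), \qquad \vartheta = \frac{\sqrt{(h')^{2} + \left(\omega^{2} - V - h^{2}\right)^{2}}}{2h},

    for any freely chosen positive function h(x) with h(\pm\infty) = \omega; the simplest choice h = \omega reduces this to T \ge \operatorname{sech}^{2}\!\big(\tfrac{1}{2\omega}\int |V|\,dx\big).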

  14. Plasmaspheric electron content

    International Nuclear Information System (INIS)

    Hartmann, G.K.

    1978-01-01

    Measurements of the plasmaspheric electron content are reviewed with particular reference to the ATS-6 radio beacon experiment. From the review, it appears likely that measurement of the plasmaspheric electron content is the only one capable of monitoring electron fluxes continuously between L 1 and L 2. Some recent important results deduced from plasmaspheric electron content measurements are discussed

  15. International Financial Reporting Standards for SMEs

    Directory of Open Access Journals (Sweden)

    Cicilia IONESCU

    2011-06-01

    Full Text Available IFRS for SMEs respond to an international requirement of developed and emerging economies, in the process of globalization, for a rigorous and common set of international accounting provisions (standards, rules, regulations) specifically for SMEs that is much simpler than the full IFRSs. The scope of the IFRS for SMEs covers all profit-oriented entities that prepare general-purpose financial statements and do not have public accountability; excluded are entities whose securities are publicly traded and financial institutions such as banks and insurance companies.

  16. Engineering Education: A Clear Content Base for Standards

    Science.gov (United States)

    Grubbs, Michael E.; Strimel, Greg J.; Huffman, Tanner

    2018-01-01

    Interest in engineering at the P-12 level has increased in recent years, largely in response to STEM educational reform. Despite greater attention to the value, importance, and use of engineering for teaching and learning, the educational community has engaged minimally in its deliberate and coherent study. Specifically, few efforts have been…

  17. Nuclear radiation moisture gauge calibration standard

    International Nuclear Information System (INIS)

    1977-01-01

    A hydrophobic standard for calibrating nuclear radiation moisture gauges is described. Each standard has physical characteristics and dimensions effective for representing to a nuclear gauge undergoing calibration, an infinite mass of homogeneous hydrogen content. Calibration standards are discussed which are suitable for use with surface gauges and with depth gauges. (C.F.)

  18. A rigorous approach to investigating common assumptions about disease transmission: Process algebra as an emerging modelling methodology for epidemiology.

    Science.gov (United States)

    McCaig, Chris; Begon, Mike; Norman, Rachel; Shankland, Carron

    2011-03-01

    Changing scale, for example, the ability to move seamlessly from an individual-based model to a population-based model, is an important problem in many fields. In this paper, we introduce process algebra as a novel solution to this problem in the context of models of infectious disease spread. Process algebra allows us to describe a system in terms of the stochastic behaviour of individuals, and is a technique from computer science. We review the use of process algebra in biological systems, and the variety of quantitative and qualitative analysis techniques available. The analysis illustrated here solves the changing scale problem: from the individual behaviour we can rigorously derive equations to describe the mean behaviour of the system at the level of the population. The biological problem investigated is the transmission of infection, and how this relates to individual interactions.
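    As a generic illustration of the scale change the paper formalizes (this is not the authors' process-algebra machinery, and the parameter values are made up), an individual-level stochastic SIR simulation can be set beside the mean-field ODE it approaches for large populations:

        import random

        def gillespie_sir(s, i, r, beta, gamma, t_max):
            """Individual-based stochastic SIR via the Gillespie algorithm."""
            t, n = 0.0, s + i + r
            while t < t_max and i > 0:
                rate_inf = beta * s * i / n        # infection events
                rate_rec = gamma * i               # recovery events
                total = rate_inf + rate_rec
                t += random.expovariate(total)
                if random.random() < rate_inf / total:
                    s, i = s - 1, i + 1
                else:
                    i, r = i - 1, r + 1
            return s, i, r

        def mean_field_sir(s, i, r, beta, gamma, t_max, dt=0.01):
            """Population-level mean-field ODE (dS/dt = -beta*S*I/N, ...), Euler stepped."""
            n, t = s + i + r, 0.0
            while t < t_max:
                new_inf = beta * s * i / n * dt
                new_rec = gamma * i * dt
                s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
                t += dt
            return s, i, r

        print(gillespie_sir(990, 10, 0, beta=0.3, gamma=0.1, t_max=50))
        print(mean_field_sir(990.0, 10.0, 0.0, beta=0.3, gamma=0.1, t_max=50))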

  19. pd Scattering Using a Rigorous Coulomb Treatment: Reliability of the Renormalization Method for Screened-Coulomb Potentials

    International Nuclear Information System (INIS)

    Hiratsuka, Y.; Oryu, S.; Gojuki, S.

    2011-01-01

    Reliability of the screened Coulomb renormalization method, which was proposed in an elegant way by Alt-Sandhas-Zankel-Ziegelmann (ASZZ), is discussed on the basis of 'two-potential theory' for the three-body AGS equations with the Coulomb potential. In order to obtain ASZZ's formula, we define the on-shell Moller function, and calculate it by using the Haeringen criterion, i.e., 'the half-shell Coulomb amplitude is zero'. By these two steps, we can finally obtain the ASZZ formula for a small Coulomb phase shift. Furthermore, the reliability of the Haeringen criterion is thoroughly checked by a numerically rigorous calculation for the Coulomb LS-type equation. We find that the Haeringen criterion can be satisfied only in the higher energy region. We conclude that the ASZZ method can be verified in the case that the on-shell approximation to the Moller function is reasonable, and the Haeringen criterion is reliable. (author)

  20. Forced oral opening for cadavers with rigor mortis: two approaches for the myotomy on the temporal muscles.

    Science.gov (United States)

    Nakayama, Y; Aoki, Y; Niitsu, H; Saigusa, K

    2001-04-15

    Forensic dentistry plays an essential role in personal identification procedures. An adequate interincisal space of cadavers with rigor mortis is required to obtain detailed dental findings. We have developed intraoral and two directional approaches, for myotomy of the temporal muscles. The intraoral approach, in which the temporalis was dissected with scissors inserted via an intraoral incision, was adopted for elderly cadavers, females and emaciated or exhausted bodies, and had a merit of no incision on the face. The two directional approach, in which myotomy was performed with thread-wire saw from behind and with scissors via the intraoral incision, was designed for male muscular youths. Both approaches were effective to obtain a desired degree of an interincisal opening without facial damage.

  1. Optical Properties of Complex Plasmonic Materials Studied with Extended Effective Medium Theories Combined with Rigorous Coupled Wave Analysis

    Directory of Open Access Journals (Sweden)

    Elie Nadal

    2018-02-01

    Full Text Available In this study we fabricate gold nanocomposites and model their optical properties. The nanocomposites are either homogeneous films or gratings containing gold nanoparticles embedded in a polymer matrix. The samples are fabricated using a recently developed technique making use of laser interferometry. The gratings present original plasmon-enhanced diffraction properties. In this work, we develop a new approach to model the optical properties of our composites. We combine the extended Maxwell–Garnett model of effective media with the Rigorous Coupled Wave Analysis (RCWA) method and compute both the absorption spectra and the diffraction efficiency spectra of the gratings. We show that such a semi-analytical approach allows us to reproduce the original plasmonic features of the composites and can provide us with details about their inner structure. Such an approach, considering reasonably high particle concentrations, could be a simple and efficient tool to study complex micro-structured systems based on plasmonic components, such as metamaterials.
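    For reference, the classical (non-extended) Maxwell-Garnett mixing rule underlying such effective-medium treatments gives the composite permittivity from the inclusion volume fraction. The sketch below is that textbook rule only, with illustrative permittivities; the paper's extended model and its coupling to RCWA are not reproduced here.

        def maxwell_garnett(eps_inclusion, eps_host, f):
            """Classical Maxwell-Garnett effective permittivity for spherical
            inclusions of volume fraction f in a host medium."""
            num = eps_inclusion + 2 * eps_host + 2 * f * (eps_inclusion - eps_host)
            den = eps_inclusion + 2 * eps_host - f * (eps_inclusion - eps_host)
            return eps_host * num / den

        # Illustrative values: gold-like inclusions in an n ~ 1.5 polymer host
        eps_gold = -10 + 1.5j
        eps_polymer = 2.25
        print(maxwell_garnett(eps_gold, eps_polymer, f=0.05))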

  2. Rigorous project for existing houses. Energy conservation requires evolution; Rigoureus project voor bestaande woningen. Evolutie voor energiebesparing nodig

    Energy Technology Data Exchange (ETDEWEB)

    Clocquet, R. [DHV, Amersfoort (Netherlands); Koene, F. [ECN Efficiency and Infrastructure, Petten (Netherlands)

    2010-05-15

    How can existing terraced houses be renovated in such a way that their total energy use decreases by 75 percent? The Rigorous project of the Energy research Centre of the Netherlands (ECN), TNO, Delft University of Technology and DHV developed innovative renovation concepts that make such savings feasible by combining constructional measures with installation concepts. On top of that, it is also essential that consumer behaviour is addressed. [Dutch] How can existing terraced houses be renovated so that total energy use decreases by 75 percent? In the Rigoureus project, ECN, TNO, TU Delft and DHV have developed innovative renovation concepts that, through a combination of constructional measures and suitable installation concepts, make this possible. In doing so, it proves essential to also address user behaviour.

  3. Rigorous derivation of the mean-field green functions of the two-band Hubbard model of superconductivity

    International Nuclear Information System (INIS)

    Adam, G.; Adam, S.

    2007-01-01

    The Green function (GF) equation of motion technique for solving the effective two-band Hubbard model of high-Tc superconductivity in cuprates rests on the Hubbard operator (HO) algebra. We show that, if we take into account the invariance to translations and spin reversal, the HO algebra results in invariance properties of several specific correlation functions. The use of these properties allows rigorous derivation and simplification of the expressions of the frequency matrix (FM) and of the generalized mean-field approximation (GMFA) Green functions (GFs) of the model. For the normal singlet hopping and anomalous exchange pairing correlation functions which enter the FM and GMFA-GFs, the use of spectral representations allows the identification and elimination of exponentially small quantities. This procedure secures the reduction of the correlation order to the GMFA-GF expressions

  4. Characterizing the toughness of an epoxy resin after wet aging using compact tension specimens with non-uniform moisture content

    KAUST Repository

    Quino, Gustavo; El Yagoubi, Jalal; Lubineau, Gilles

    2014-01-01

    Characterizing the change in toughness of polymers subjected to wet aging is challenging because of the heterogeneity of the testing samples. Indeed, as wet aging is guided by a diffusion/reaction process, compact tension samples (defined by the ASTM D5045 standard), which are relevant for toughness characterization but are somewhat thick, display a non-uniform moisture content over the bulk material. We define here a rigorous procedure to extract meaningful data from such tests. Our results showed that the relation between the moisture uptake of the whole sample and the measured toughness was not a meaningful material property. In fact, we found that the measured toughness depended on the locally varying moisture uptake over the cracking path. Here, we propose a post-processing technique that relies on a validated reaction/diffusion model to predict the three-dimensional moisture state of the epoxy. This makes identification of the variation in toughness with respect to the local moisture content possible. In addition, we analyze the fracture surface using micrography and roughness measurements. The observed variations in toughness are correlated with the roughness in the vicinity of the crack tip. © 2014 Elsevier Ltd. All rights reserved.
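
    The validated reaction/diffusion model itself is not given in the abstract; purely as a minimal sketch of the diffusion ingredient, assuming one-dimensional Fickian transport and illustrative parameter values (not values from the study), a local moisture profile across the specimen thickness can be computed as follows:

      import numpy as np

      def moisture_profile_1d(thickness, D, hours, m_surface, m_initial, n=101):
          # Explicit finite-difference solution of 1D Fickian diffusion,
          # dm/dt = D * d2m/dx2, across the specimen thickness (both faces exposed).
          dx = thickness / (n - 1)
          dt = 0.4 * dx ** 2 / D          # below the stability limit dt <= dx**2 / (2*D)
          m = np.full(n, m_initial, dtype=float)
          m[0] = m[-1] = m_surface
          for _ in range(int(hours / dt)):
              m[1:-1] += D * dt / dx ** 2 * (m[2:] - 2.0 * m[1:-1] + m[:-2])
              m[0] = m[-1] = m_surface
          return m                        # local moisture uptake across the thickness

      # Illustrative numbers only: 12 mm thick specimen, D = 5e-3 mm^2/h,
      # 500 h exposure, 3 % uptake at the surface, dry core initially.
      profile = moisture_profile_1d(12.0, 5e-3, 500.0, 0.03, 0.0)

    A local field of this kind (rather than the bulk uptake) is what can then be correlated with the measured toughness along the cracking path.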

  5. Characterizing the toughness of an epoxy resin after wet aging using compact tension specimens with non-uniform moisture content

    KAUST Repository

    Quino, Gustavo

    2014-11-01

    Characterizing the change in toughness of polymers subjected to wet aging is challenging because of the heterogeneity of the testing samples. Indeed, as wet aging is guided by a diffusion/reaction process, compact tension samples (defined by the ASTM D5045 standard), which are relevant for toughness characterization but are somewhat thick, display a non-uniform moisture content over the bulk material. We define here a rigorous procedure to extract meaningful data from such tests. Our results showed that the relation between the moisture uptake of the whole sample and the measured toughness was not a meaningful material property. In fact, we found that the measured toughness depended on the locally varying moisture uptake over the cracking path. Here, we propose a post-processing technique that relies on a validated reaction/diffusion model to predict the three-dimensional moisture state of the epoxy. This makes identification of the variation in toughness with respect to the local moisture content possible. In addition, we analyze the fracture surface using micrography and roughness measurements. The observed variations in toughness are correlated with the roughness in the vicinity of the crack tip. © 2014 Elsevier Ltd. All rights reserved.

  6. Importance of All-in-one (MCNPX2.7.0+CINDER2008) Code for Rigorous Transmutation Study

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Oyeon [Institute for Modeling and Simulation Convergence, Daegu (Korea, Republic of); Kim, Kwanghyun [RadTek Co. Ltd., Daejeon (Korea, Republic of)

    2015-10-15

    Transmutation can be utilized as a mechanism for reducing the volume and hazard of radioactive waste by transforming hazardous radioactive elements with long half-lives into less hazardous elements with short half-lives. An understanding of the transmutation mechanism and of suitable machinery design technologies is therefore important and useful. Although the term transmutation goes back to alchemy, which in the Middle Ages sought to transform base metals into gold, Rutherford and Soddy were the first to observe it, identifying natural transmutation as part of alpha-type radioactive decay in the early 20th century. Along with the development of computing technology, analysis software such as CINDER was developed for rigorous transmutation studies. The code has a long history of development from the original work of T. England at Bettis Atomic Power Laboratory (BAPL) in the early 1960s, and it has been used to calculate the inventory of nuclides in an irradiated material. The more recently released CINDER'90 upgraded the code to allow spontaneous tracking of chains based upon the significant density or pass-by of a nuclide, where pass-by represents the density of a nuclide transforming into other nuclides. The nuclear transmutation process is governed by highly non-linear differential equations. The chaotic nature of these non-linear equations underlines the importance of accurate input data (i.e., the number of significant digits). Thus, reducing human intervention is very important for rigorous transmutation studies, and an 'all-in-one' code structure is desirable. Note that the non-linearity of the transmutation equations caused by flux changes due to number-density changes during a given time interval (an intrinsic physical phenomenon) is not considered in this study; we emphasize only the effects of human intervention in the computing process for solving the non-linear differential equations, as shown in
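
    For the linear, constant-flux case that inventory codes solve within each time step, the depletion system dN/dt = A·N can be integrated directly with a matrix exponential. The sketch below uses a hypothetical three-nuclide chain with made-up decay and removal rates, purely to illustrate the structure of the calculation (it is not the CINDER algorithm and deliberately ignores the flux-feedback nonlinearity mentioned above):

      import numpy as np
      from scipy.linalg import expm

      # Hypothetical chain A -> B -> C under constant flux; every removal from a
      # parent is assumed to feed its daughter (a simplification for illustration).
      lam = np.array([1e-6, 5e-7, 0.0])       # decay constants, 1/s (illustrative)
      sigphi = np.array([2e-8, 1e-8, 0.0])    # sigma*phi removal rates, 1/s (illustrative)

      A = np.zeros((3, 3))
      for i in range(3):
          A[i, i] = -(lam[i] + sigphi[i])                  # losses from nuclide i
          if i > 0:
              A[i, i - 1] = lam[i - 1] + sigphi[i - 1]     # gains from the parent

      N0 = np.array([1e24, 0.0, 0.0])         # initial number densities, 1/cm^3
      t = 3600.0 * 24 * 30                    # 30 days of irradiation, in seconds
      N_t = expm(A * t) @ N0                  # N(t) = exp(A t) N0 for constant flux
      print(N_t)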

  7. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems.

    Science.gov (United States)

    Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E

    2011-12-22

    Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge and varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we join two important aspects: 1) process scalability, achieved through a relational database implementation, and 2) correctness of processes, ensured using process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using simple end-user interfaces.
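
    The abstract does not give the concrete ACP encoding; purely as a toy illustration of how a testing workflow can be written as process-algebra-like terms (sequential and alternative composition), one might sketch the following, where the action names and the left-biased trace function are hypothetical:

      from dataclasses import dataclass

      @dataclass
      class Act:        # atomic action: one laboratory step
          name: str

      @dataclass
      class Seq:        # sequential composition, a . b
          left: object
          right: object

      @dataclass
      class Alt:        # alternative composition, a + b
          left: object
          right: object

      workflow = Seq(Act("extract_dna"),
                     Seq(Alt(Act("pcr"), Act("sequencing")),
                         Act("report_result")))

      def trace(p):
          # Enumerate one possible execution trace (left-biased choice).
          if isinstance(p, Act):
              return [p.name]
          if isinstance(p, Seq):
              return trace(p.left) + trace(p.right)
          if isinstance(p, Alt):
              return trace(p.left)
          raise TypeError(p)

      print(trace(workflow))   # ['extract_dna', 'pcr', 'report_result']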

  8. Meat quality and rigor mortis development in broiler chickens with gas-induced anoxia and postmortem electrical stimulation.

    Science.gov (United States)

    Sams, A R; Dzuik, C S

    1999-10-01

    This study was conducted to evaluate the combined rigor-accelerating effects of postmortem electrical stimulation (ES) and argon-induced anoxia (Ar) in broiler chickens. One hundred broilers were processed in the following treatments: untreated controls, ES, Ar, or Ar with ES (Ar + ES). Breast fillets were harvested at 1 h postmortem for all treatments or at 1 and 6 h postmortem for the control carcasses. Fillets were sampled for pH and ratio of inosine to adenosine (R-value) and were then individually quick frozen (IQF) or aged on ice (AOI) until 24 h postmortem. Color was measured in the AOI fillets at 24 h postmortem. All fillets were then cooked and evaluated for Allo-Kramer shear value. The Ar treatment accelerated the normal pH decline, whereas the ES and Ar + ES treatments yielded even lower pH values at 1 h postmortem. The Ar + ES treatment had a greater R-value than the ES treatment, which was greater than either the Ar or 1-h controls, which, in turn, were not different from each other. The ES treatment had the lowest L* value, and ES, Ar, and Ar + ES produced significantly higher a* values than the 1-h controls. For the IQF fillets, the ES and Ar + ES treatments were not different in shear value but were lower than Ar, which was lower than the 1-h controls. The same was true for the AOI fillets except that the ES and the Ar treatments were not different. These results indicated that although ES and Ar had rigor-accelerating and tenderizing effects, ES seemed to be more effective than Ar; there was little enhancement when Ar was added to the ES treatment and fillets were deboned at 1 h postmortem.

  9. Biclustering via optimal re-ordering of data matrices in systems biology: rigorous methods and comparative studies

    Directory of Open Access Journals (Sweden)

    Feng Xiao-Jiang

    2008-10-01

    Full Text Available Background: The analysis of large-scale data sets via clustering techniques is utilized in a number of applications. Biclustering in particular has emerged as an important problem in the analysis of gene expression data since genes may only jointly respond over a subset of conditions. Biclustering algorithms also have important applications in sample classification where, for instance, tissue samples can be classified as cancerous or normal. Many of the methods for biclustering, and clustering algorithms in general, utilize simplified models or heuristic strategies for identifying the "best" grouping of elements according to some metric and cluster definition and thus result in suboptimal clusters. Results: In this article, we present a rigorous approach to biclustering, OREO, which is based on the Optimal RE-Ordering of the rows and columns of a data matrix so as to globally minimize the dissimilarity metric. The physical permutations of the rows and columns of the data matrix can be modeled as either a network flow problem or a traveling salesman problem. Cluster boundaries in one dimension are used to partition and re-order the other dimensions of the corresponding submatrices to generate biclusters. The performance of OREO is tested on (a) metabolite concentration data, (b) an image reconstruction matrix, (c) synthetic data with implanted biclusters, and gene expression data for (d) colon cancer, (e) breast cancer, as well as (f) yeast segregants, to validate the ability of the proposed method and compare it to existing biclustering and clustering methods. Conclusion: We demonstrate that this rigorous global optimization method for biclustering produces clusters with more insightful groupings of similar entities, such as genes or metabolites sharing common functions, than other clustering and biclustering algorithms and can reconstruct underlying fundamental patterns in the data for several distinct sets of data matrices arising
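
    OREO solves the re-ordering globally via the network-flow/TSP formulations mentioned above. Purely as an illustration of the underlying idea (a greedy heuristic, not the paper's exact method, and generally suboptimal), rows and columns can be re-ordered so that neighbours are similar:

      import numpy as np

      def greedy_reorder(rows):
          # Greedy nearest-neighbour ordering so that adjacent rows are similar --
          # a toy stand-in for the exact network-flow/TSP formulation.
          d = np.linalg.norm(rows[:, None, :] - rows[None, :, :], axis=2)
          order, remaining = [0], set(range(1, len(rows)))
          while remaining:
              nxt = min(remaining, key=lambda j: d[order[-1], j])
              order.append(nxt)
              remaining.remove(nxt)
          return order

      data = np.random.rand(20, 8)       # rows: genes/metabolites, columns: conditions
      reordered = data[np.ix_(greedy_reorder(data), greedy_reorder(data.T))]
      # Candidate biclusters now appear as contiguous blocks in `reordered`.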

  10. Heterogeneous nucleation on convex spherical substrate surfaces: A rigorous thermodynamic formulation of Fletcher's classical model and the new perspectives derived.

    Science.gov (United States)

    Qian, Ma; Ma, Jie

    2009-06-07

    Fletcher's spherical substrate model [J. Chem. Phys. 29, 572 (1958)] is a basic model for understanding heterogeneous nucleation phenomena in nature. However, a rigorous thermodynamic formulation of the model has been missing due to the significant complexities involved. This has not only left the classical model deficient but also likely obscured its other important features, which would otherwise have helped to better understand and control heterogeneous nucleation on spherical substrates. This work presents a rigorous thermodynamic formulation of Fletcher's model using a novel analytical approach and discusses the new perspectives derived. In particular, it is shown that the use of an intermediate variable, a selected geometrical angle or pseudocontact angle between the embryo and the spherical substrate, revealed extraordinary similarities between the first derivatives of the free energy change with respect to embryo radius for nucleation on spherical and flat substrates. Enlightened by this discovery, it was found that there exists a local maximum in the difference between the equivalent contact angles for nucleation on spherical and flat substrates, due to the existence of a local maximum in the difference between the shape factors for nucleation on spherical and flat substrate surfaces. This helps to understand the complexity of heterogeneous nucleation phenomena in a practical system. Also, it was found that the unfavorable size effect occurs primarily when R < 5r* (R: radius of the substrate; r*: critical embryo radius) and diminishes rapidly with increasing R/r* beyond R/r* = 5. This finding provides a baseline for controlling size effects in heterogeneous nucleation.
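
    For reference, the shape factor at the heart of Fletcher's model is commonly quoted in the literature in the following form (reproduced here from standard sources rather than re-derived, so treat it as a reminder rather than the paper's own expression):

      f(m, x) = \frac{1}{2}\left\{ 1 + \left(\frac{1 - m x}{g}\right)^{3}
                + x^{3}\left[ 2 - 3\left(\frac{x - m}{g}\right) + \left(\frac{x - m}{g}\right)^{3} \right]
                + 3 m x^{2}\left( \frac{x - m}{g} - 1 \right) \right\},
      \qquad g = \sqrt{1 + x^{2} - 2 m x},

    with m = \cos\theta, x = R/r^{*}, and \Delta G^{*} = f(m, x)\,\Delta G^{*}_{\mathrm{hom}}; the size effect discussed above enters through the dependence of f on x = R/r^{*}.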

  11. Interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations.

    Science.gov (United States)

    Simic, Vladimir

    2016-06-01

    As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (for comparison, 40 million units were generated in 2010), there is strong motivation to manage this fast-growing waste flow effectively. Intensive work on the management of ELVs is necessary in order to tackle this important environmental challenge more successfully. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicle management under rigorous environmental regulations. The proposed model can incorporate various kinds of uncertainty information in the modeling process, and the complex relationships between different ELV management sub-systems are successfully addressed. In particular, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potential and applicability of the proposed model. Various constraint-violation probability levels are examined in detail, and the influence of parameter uncertainty on model solutions is thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating system constraints. The formulated model is able to tackle a hard ELV management problem involving uncertainty, and it has the advantage of providing a basis for determining long-term ELV management plans with the desired compromise between the economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting the generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.
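
    As a generic sketch of the chance-constraint machinery involved (not the paper's specific model), an individual constraint with a random right-hand side b_i is enforced only up to a chosen violation probability p_i and reduced to a deterministic equivalent:

      \Pr\{\, a_i x \le b_i \,\} \ \ge\ 1 - p_i
      \quad\Longrightarrow\quad
      a_i x \ \le\ b_i^{(p_i)},

    where b_i^{(p_i)} is the p_i-quantile of the (assumed known) distribution of b_i. In interval-parameter formulations, the interval coefficients are then typically propagated through paired optimistic/pessimistic deterministic sub-models to obtain interval-valued solutions.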

  12. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings

    Science.gov (United States)

    Kline, Joshua C.

    2014-01-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152
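
    SigMax itself is not reproduced here. Purely as a generic illustration of significance-tested synchronization detection (closer in spirit to the simpler latency-bin tests the paper critiques than to SigMax), one can test each bin of a cross-correlogram against a flat-background null; the bin width, window, and Poisson null are illustrative assumptions:

      import numpy as np
      from scipy.stats import poisson

      def sync_latencies(times_a, times_b, bin_ms=1.0, max_lag_ms=50.0, alpha=0.01):
          # Cross-correlogram of two firing trains with a per-bin Poisson test
          # against a flat-background null (a generic sketch, not SigMax itself).
          lags = np.concatenate([times_b - t for t in times_a])
          lags = lags[np.abs(lags) <= max_lag_ms]
          edges = np.arange(-max_lag_ms, max_lag_ms + bin_ms, bin_ms)
          counts, _ = np.histogram(lags, edges)
          expected = counts.mean()
          threshold = poisson.ppf(1.0 - alpha / len(counts), expected)  # Bonferroni-style
          return edges[:-1][counts > threshold]        # latencies flagged as synchronized

      rng = np.random.default_rng(0)
      a = np.sort(rng.uniform(0, 60_000, 600))         # firing times in ms over 60 s
      b = np.sort(rng.uniform(0, 60_000, 600))
      print(sync_latencies(a, b))                      # usually empty for independent trains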

  13. Naturalising Representational Content

    Science.gov (United States)

    Shea, Nicholas

    2014-01-01

    This paper sets out a view about the explanatory role of representational content and advocates one approach to naturalising content – to giving a naturalistic account of what makes an entity a representation and in virtue of what it has the content it does. It argues for pluralism about the metaphysics of content and suggests that a good strategy is to ask the content question with respect to a variety of predictively successful information processing models in experimental psychology and cognitive neuroscience; and hence that data from psychology and cognitive neuroscience should play a greater role in theorising about the nature of content. Finally, the contours of the view are illustrated by drawing out and defending a surprising consequence: that individuation of vehicles of content is partly externalist. PMID:24563661

  14. Plasma catecholamine content using radioenzymatic assay

    International Nuclear Information System (INIS)

    Minami, Masaru; Togashi, Hiroko; Koike, Yuichi; Shimamura, Keiichi; Yamazaki, Noriko

    1980-01-01

    Catecholamine (CA) contents in the blood plasma of spontaneously hypertensive rats (SHR) and in human blood plasma were measured by radioenzymatic assay (REA) and by the trihydroxyindole (THI) fluorescence method using high-performance liquid chromatography (HPLC), and the two methods were compared. The standard curve of REA showed a good linear relationship between total CA contents and separated CA contents. Although REA carries a risk of exposure to β-rays, the method was useful for measuring CA contents in the blood of small animals and in small blood volumes, because the CA content of only 50 μg of blood plasma could be measured. Norepinephrine (NE) and epinephrine (E) contents in men with normal blood pressure measured by REA were 250 ± 61 pg/ml and 37 ± 22 pg/ml, respectively. NE and E contents in patients with mild hypertension were 460 ± 128 pg/ml and 50 ± 20 pg/ml, respectively. There was no significant difference in NE and E contents between men with normal blood pressure and patients with mild hypertension. Total CA content in the blood plasma of SHR killed by decapitation was 5,000 ± 1,131 pg/ml, about five times the NE and E contents in blood plasma obtained from the femoral vein of anesthetized SHR (816 ± 215 pg/ml and 209 ± 44 pg/ml). Total CA content in the same samples was measured by both REA and HPLC; the values measured by REA were higher than those measured by HPLC, but the two correlated well. NE content in men with normal blood pressure, measured by HPLC, increased significantly with age, but this tendency was not observed in patients with hypertension. (Tsunoda, M.)

  15. 教學原理教科書共通內容之研究:國小職前教師專業標準觀點 (The Study of Common Content on Principles of Teaching Textbooks: Perspective of Professional Standards for Preservice Elementary School Teachers)

    Directory of Open Access Journals (Sweden)

    劉唯玉 (Wei-Yu Liu)

    2010-03-01

    Full Text Available [Chinese original, translated] Teacher education program courses have long suffered from insufficient linkage between professional education subjects, so that each subject is learned in isolation; moreover, because teachers exercise professional autonomy, the same subject taught by different instructors may cover very different content, making it impossible to predict whether students who complete the various professional education courses will attain the professional standards expected of preservice teachers. Both phenomena work against maintaining or improving the quality of preservice teacher education. This study compares the "Professional Standards for Elementary School Teachers" developed by the Teacher Education Association of the Republic of China (中華民國師範教育學會) with the "Professional Principles for Preservice Elementary School Teachers" developed by the Hua-Shih College of Education at National Dong Hwa University (東華大學花師教育學院) to clarify what professional standards for preservice elementary school teachers entail. It then analyzes Chinese- and English-language textbooks on principles of teaching to identify nine common themes of course content and the preservice teacher professional standards they should address. The results should aid the planning and implementation of common content for "Principles of Teaching" in elementary teacher education programs and improve the quality of teacher education curricula. [English abstract] Education program courses have long been criticized for weak linkage between subjects and for isolated branches of learning; and because of teachers' professional autonomy, different teachers teaching the same subject may cover very different content. It is therefore hard to predict whether students' educational expertise will reach preservice teachers' professional standards. These phenomena are unfavorable to maintaining or improving the quality of preservice teacher education. This study identifies the "Professional Standards for Preservice Elementary Teachers" by analyzing the differences between the "Professional Standards for Elementary Teachers" and the "Professional Principles for Preservice Elementary Teachers". By comparing the common subjects among Chinese- and English-language books on principles of teaching and their relationship to the Professional Standards for Preservice Elementary Teachers, the researchers suggest a common teaching goal and course content for Principles of Teaching, which would be helpful in improving the quality of teacher education curricula.

  16. Validity evidence based on test content.

    Science.gov (United States)

    Sireci, Stephen; Faulkner-Bond, Molly

    2014-01-01

    Validity evidence based on test content is one of the five forms of validity evidence stipulated in the Standards for Educational and Psychological Testing developed by the American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. In this paper, we describe the logic and theory underlying such evidence and describe traditional and modern methods for gathering and analyzing content validity data. A comprehensive review of the literature and of the aforementioned Standards is presented. For educational tests and other assessments targeting knowledge and skill possessed by examinees, validity evidence based on test content is necessary for building a validity argument to support the use of a test for a particular purpose. By following the methods described in this article, practitioners have a wide arsenal of tools available for determining how well the content of an assessment is congruent with and appropriate for the specific testing purposes.

  17. About the standard solar model

    International Nuclear Information System (INIS)

    Cahen, S.

    1986-07-01

    A discussion of the still controversial solar helium content is presented, based on a comparison of recent standard solar models. Our most recent model yields a helium mass fraction of ∼0.276, and capture rates of 6.4 SNU for ³⁷Cl and 126 SNU for ⁷¹Ga.

  18. Sizing up State Standards, 2008

    Science.gov (United States)

    American Federation of Teachers (NJ), 2008

    2008-01-01

    Members of the American Federation of Teachers examined each state's and Washington, D.C.'s content standards documents to determine whether or not they contain enough information about what students should learn to provide the basis for coherent curricula and assessments. There is no perfect formula for this; they made a series of judgment…

  19. System of HPC content archiving

    Science.gov (United States)

    Bogdanov, A.; Ivashchenko, A.

    2017-12-01

    This work aims to develop a system that effectively solves the problem of storing and analyzing files containing text data, using modern software development tools, techniques and approaches. The main challenge, defined at the problem formulation stage, is to store a large number of text documents while providing functionality such as full-text search and clustering of documents according to their contents. The main system features can be described as a distributed multilevel architecture with flexible and interchangeable components, achieved by encapsulating standard functionality in independent executable modules.
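
    The abstract does not name the underlying tools. Assuming a Python stack with scikit-learn purely for illustration, the two core functions mentioned (clustering documents by content and ranking them for a full-text query) can be sketched as follows; the sample documents and query are hypothetical:

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity
      from sklearn.cluster import KMeans

      docs = ["first archived report on cluster performance",
              "second archived report on cluster performance",
              "unrelated meeting note about budgets"]

      vectorizer = TfidfVectorizer(stop_words="english")
      X = vectorizer.fit_transform(docs)                # content-based document vectors

      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      print(labels)                                     # cluster id per document

      # Rudimentary full-text query: rank documents by cosine similarity to the query.
      query = vectorizer.transform(["archived report"])
      print(cosine_similarity(query, X).ravel().argsort()[::-1])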

  20. Non-destructive determination of moisture content and micro-fibril angle of wood using a poly-chromatic X-ray beam: theoretical and experimental approach

    International Nuclear Information System (INIS)

    Baettig, R.

    2005-07-01

    Non-destructive determination of moisture content and micro-fibril angle is an important challenge for wood science, because these two parameters strongly influence the macroscopic behavior of wood. For example, shrinkage, mechanical properties, and thermal and acoustic conductivity depend on the moisture content, and their anisotropic character is largely governed by the micro-fibril angle. We first exploited, in transmission, the slight difference between the X-ray mass attenuation coefficients of water and wood. Unfortunately, the results show that this difference is too small to allow precise measurement of the moisture content. Coherent scattering, in contrast, proved more sensitive: using a poly-energetic beam and a spectrometric system, we were able to discriminate the crystalline constituent (cellulose) from the amorphous constituent (water) in a sample of wet wood, because at a given scattering angle these phases scatter at different energies. The device also allowed us to study the crystalline phase of the wood: experimental diffraction profiles were compared with theoretical profiles obtained by rigorous simulation in order to estimate the average micro-fibril angle and its standard deviation. (author)
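
    To make the transmission argument concrete, the attenuation of a beam crossing a wet sample of thickness t follows the standard two-component Beer–Lambert law (a textbook relation, not a formula quoted from the thesis):

      I = I_0 \exp\!\Bigl[ -\bigl( (\mu/\rho)_{\mathrm{wood}}\,\rho_{\mathrm{wood}}
          + (\mu/\rho)_{\mathrm{water}}\,\rho_{\mathrm{water}} \bigr)\, t \Bigr]

    Because the mass attenuation coefficients of wood substance and water are close over the useful energy range, the transmitted intensity is only weakly sensitive to how the traversed mass is split between the two constituents, which is why transmission alone could not resolve the moisture content and scattering measurements were pursued instead.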