WorldWideScience

Sample records for rigorous methodological quality

  1. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios; formal architecture design, including behavioral semantics; and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite.

  2. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical exercise aimed at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn comprises code verification, aimed at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
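The solution-verification step mentioned in the abstract, quantifying the discretization error with Richardson extrapolation, can be sketched numerically. This is a minimal illustration assuming three solutions on systematically refined grids with a constant refinement ratio; the second-order test quantity below is manufactured for the example and is not GBS output:

```python
# Solution verification via Richardson extrapolation: estimate the observed
# order of accuracy and the discretization error from solutions computed on
# three systematically refined grids (illustrative sketch only).
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from solutions on grids with
    constant refinement ratio r (spacings h, h/r, h/r^2)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson_error(f_medium, f_fine, r, p):
    """Estimated discretization error of the fine-grid solution."""
    return (f_medium - f_fine) / (r**p - 1)

# Manufactured example: a quantity converging at second order, f(h) = f_exact + C*h^2
f_exact, C, h, r = 1.0, 0.5, 0.1, 2.0
f1 = f_exact + C * h**2              # coarse grid
f2 = f_exact + C * (h / r)**2        # medium grid
f3 = f_exact + C * (h / r**2)**2     # fine grid

p = observed_order(f1, f2, f3, r)    # recovers the formal order, ~2.0
err = richardson_error(f2, f3, r, p) # estimate of f3 - f_exact
```

In a real verification study the observed order `p` is compared against the formal order of the scheme, and the error estimate feeds the reported numerical uncertainty.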

  3. Evaluating Rigor in Qualitative Methodology and Research Dissemination

    Science.gov (United States)

    Trainor, Audrey A.; Graue, Elizabeth

    2014-01-01

    Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…

  4. Trends in Methodological Rigor in Intervention Research Published in School Psychology Journals

    Science.gov (United States)

    Burns, Matthew K.; Klingbeil, David A.; Ysseldyke, James E.; Petersen-Brown, Shawna

    2012-01-01

    Methodological rigor in intervention research is important for documenting evidence-based practices and has been a recent focus in legislation, including the No Child Left Behind Act. The current study examined the methodological rigor of intervention research in four school psychology journals since the 1960s. Intervention research has increased…

  5. Response to Ridgeway, Dunston, and Qian: On Methodological Rigor: Has Rigor Mortis Set In?

    Science.gov (United States)

    Baldwin, R. Scott; Vaughn, Sharon

    1993-01-01

    Responds to an article in the same issue of the journal presenting a meta-analysis of reading research. Expresses concern that the authors' conclusions will promote a slavish adherence to a methodology and a rigidity of thought that reading researchers can ill afford. (RS)

  6. The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation.

    Science.gov (United States)

    Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A

    2015-01-08

Despite numerous constructs having more than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold-standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.

  7. Methodological Challenges in Sustainability Science: A Call for Method Plurality, Procedural Rigor and Longitudinal Research

    Directory of Open Access Journals (Sweden)

    Henrik von Wehrden

    2017-02-01

Sustainability science encompasses a unique field that is defined through its purpose, the problem it addresses, and its solution-oriented agenda. However, this orientation creates significant methodological challenges. In this discussion paper, we conceptualize sustainability problems as wicked problems to tease out the key challenges that sustainability science is facing if scientists intend to deliver on its solution-oriented agenda. Building on the available literature, we discuss three aspects that demand increased attention for advancing sustainability science: (1) methods with higher diversity and complementarity are needed to increase the chance of deriving solutions to the unique aspects of wicked problems; for instance, mixed-methods approaches are potentially better suited to allow for an approximation of solutions, since they cover wider arrays of knowledge; (2) methodologies capable of dealing with wicked problems demand strict procedural and ethical guidelines, in order to ensure their integration potential; for example, learning from solution implementation in different contexts requires increased comparability between research approaches while carefully addressing issues of legitimacy and credibility; and (3) approaches are needed that allow for longitudinal research, since wicked problems are continuous and solutions can only be diagnosed in retrospect; for example, complex dynamics of wicked problems play out across temporal patterns that are not necessarily aligned with the common timeframe of participatory sustainability research. Taken together, we call for plurality in methodologies, emphasizing procedural rigor and the necessity of continuous research to effectively address wicked problems as well as methodological challenges in sustainability science.

  8. Standards and Methodological Rigor in Pulmonary Arterial Hypertension Preclinical and Translational Research.

    Science.gov (United States)

    Provencher, Steeve; Archer, Stephen L; Ramirez, F Daniel; Hibbert, Benjamin; Paulin, Roxane; Boucherat, Olivier; Lacasse, Yves; Bonnet, Sébastien

    2018-03-30

Despite advances in our understanding of the pathophysiology and the management of pulmonary arterial hypertension (PAH), significant therapeutic gaps remain for this devastating disease. Yet, few innovative therapies beyond the traditional pathways of endothelial dysfunction have reached clinical trial phases in PAH. Although there are inherent limitations of the currently available models of PAH, the leaky pipeline of innovative therapies relates, in part, to flawed preclinical research methodology, including lack of rigor in trial design, incomplete invasive hemodynamic assessment, and lack of careful translational studies that replicate randomized controlled trials in humans with attention to adverse effects and benefits. Rigorous methodology should include the use of prespecified eligibility criteria, sample sizes that permit valid statistical analysis, randomization, blinded assessment of standardized outcomes, and transparent reporting of results. Better design and implementation of preclinical studies can minimize inherent flaws in the models of PAH, reduce the risk of bias, and enhance external validity and our ability to distinguish truly promising therapies from many false-positive or overstated leads. Ideally, preclinical studies should use advanced imaging, study several preclinical pulmonary hypertension models, or correlate rodent and human findings and consider the fate of the right ventricle, which is the major determinant of prognosis in human PAH. Although these principles are widely endorsed, empirical evidence suggests that such rigor is often lacking in pulmonary hypertension preclinical research. The present article discusses the pitfalls in the design of preclinical pulmonary hypertension trials and discusses opportunities to create preclinical trials with improved predictive value in guiding early-phase drug development in patients with PAH, which will need support not only from researchers, peer reviewers, and editors but also from

  9. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.

    1992-01-01

This paper reports that the intent of this study was to rigorously examine all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2500) to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in time and manpower required to complete the study were also realized.
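The probabilistic roll-back at the core of the decision-tree analysis described above can be sketched briefly: chance nodes take the probability-weighted average of their outcomes, and decision nodes take the best branch. All scenario names, probabilities, and NPV figures below are hypothetical illustrations, not Rangely Weber Sand Unit data:

```python
# Decision-tree roll-back: expected value at chance nodes, best
# alternative at decision nodes. Figures are invented for illustration.

def roll_back(node):
    """node is a number (terminal NPV), a list of (probability, outcome)
    pairs (chance node), or a dict of alternatives (decision node)."""
    if isinstance(node, (int, float)):
        return node
    if isinstance(node, list):                       # chance node
        return sum(p * roll_back(out) for p, out in node)
    return max(roll_back(branch) for branch in node.values())  # decision node

# Hypothetical tree: decide whether to expand CO2 injection, with an
# uncertain CO2 price afterwards (NPVs in $MM).
tree = {
    "expand":     [(0.6, 120.0), (0.4, -30.0)],  # high / low CO2 price
    "do_nothing": [(1.0, 25.0)],
}
best = max(tree, key=lambda d: roll_back(tree[d]))
# "expand" has expected NPV 0.6*120 + 0.4*(-30) = 60.0, versus 25.0
```

Scaling the same roll-back over thousands of enumerated scenarios, with sensitivity analysis on the key uncertainties, is what produces the range of outcomes the abstract describes.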

  10. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    Science.gov (United States)

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Quality of nuchal translucency measurements correlates with broader aspects of program rigor and culture of excellence.

    Science.gov (United States)

    Evans, Mark I; Krantz, David A; Hallahan, Terrence; Sherwin, John; Britt, David W

    2013-01-01

To determine whether nuchal translucency (NT) quality correlates with the extent to which clinics vary in rigor and quality control, we correlated NT performance quality (bias and precision) for 246,000 patients with two alternative measures of clinic culture: the percentage of cases for whom nasal bone (NB) measurements were performed and the percentage of requisitions correctly filled for race-ethnicity and weight. Where requisition errors were frequent, the NT median fell to 0.93 MoM; where NB measurement rates exceeded 90%, the median MoM was 0.99. Variation in program-wide quality exists independent of individual variation in NT quality, and two divergent indices of program rigor are associated with NT quality. Quality control must be program wide, and to effect continued improvement in the quality of NT results across time, the cultures of clinics must become a target for intervention. Copyright © 2013 S. Karger AG, Basel.

  12. Disciplining Bioethics: Towards a Standard of Methodological Rigor in Bioethics Research

    Science.gov (United States)

    Adler, Daniel; Shaul, Randi Zlotnik

    2012-01-01

    Contemporary bioethics research is often described as multi- or interdisciplinary. Disciplines are characterized, in part, by their methods. Thus, when bioethics research draws on a variety of methods, it crosses disciplinary boundaries. Yet each discipline has its own standard of rigor—so when multiple disciplinary perspectives are considered, what constitutes rigor? This question has received inadequate attention, as there is considerable disagreement regarding the disciplinary status of bioethics. This disagreement has presented five challenges to bioethics research. Addressing them requires consideration of the main types of cross-disciplinary research, and consideration of proposals aiming to ensure rigor in bioethics research. PMID:22686634

  13. Quality properties of pre- and post-rigor beef muscle after interventions with high frequency ultrasound.

    Science.gov (United States)

    Sikes, Anita L; Mawson, Raymond; Stark, Janet; Warner, Robyn

    2014-11-01

The delivery of a consistent quality product to the consumer is vitally important for the food industry. The aim of this study was to investigate the potential for using high frequency ultrasound applied to pre- and post-rigor beef muscle on the metabolism and subsequent quality. High frequency ultrasound (600 kHz at 48 kPa and 65 kPa acoustic pressure) applied to post-rigor beef striploin steaks resulted in no significant effect on the texture (peak force value) of cooked steaks as measured by a Tenderometer. There was no added benefit of ultrasound treatment above that of the normal ageing process after ageing of the steaks for 7 days at 4°C. Ultrasound treatment of post-rigor beef steaks resulted in a darkening of fresh steaks but after ageing for 7 days at 4°C, the ultrasound-treated steaks were similar in colour to that of the aged, untreated steaks. High frequency ultrasound (2 MHz at 48 kPa acoustic pressure) applied to pre-rigor beef neck muscle had no effect on the pH, but the calculated exhaustion factor suggested that there was some effect on metabolism and actin-myosin interaction. However, the resultant texture of cooked, ultrasound-treated muscle was lower in tenderness compared to the control sample. After ageing for 3 weeks at 0°C, the ultrasound-treated samples had the same peak force value as the control. High frequency ultrasound had no significant effect on the colour parameters of pre-rigor beef neck muscle. This proof-of-concept study showed no effect of ultrasound on quality but did indicate that the application of high frequency ultrasound to pre-rigor beef muscle shows potential for modifying ATP turnover and further investigation is warranted. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  14. Total Data Quality Management: A Study of Bridging Rigor and Relevance

    NARCIS (Netherlands)

    Wijnhoven, Alphonsus B.J.M.; Boelens, R.; Middel, H.G.A.; Louissen, K.

    2007-01-01

    Ensuring data quality is of crucial importance to organizations. The Total Data Quality Management (TDQM) theory provides a methodology to ensure data quality. Although well researched, the TDQM methodology is not easy to apply. In the case of Honeywell Emmen, we found that the application of the

  15. Assessment of the Methodological Rigor of Case Studies in the Field of Management Accounting Published in Journals in Brazil

    Directory of Open Access Journals (Sweden)

    Kelly Cristina Mucio Marques

    2015-04-01

This study aims to assess the methodological rigor of case studies in management accounting published in Brazilian journals. The study is descriptive. The data were collected using documentary research and content analysis; 180 papers classified as case studies, published from 2008 to 2012 in accounting journals rated as A2, B1, and B2, were selected. Based on the literature, we established a set of 15 criteria that we expected to be identified (either explicitly or implicitly) in the case studies in order to classify those case studies as appropriate from the standpoint of methodological rigor. These criteria were partially met by the papers analyzed. The aspects least aligned with those proposed in the literature were the following: little emphasis on justifying the need to understand phenomena in context; lack of explanation of the reason for choosing the case study strategy; the predominant use of questions that do not enable deeper analysis; many studies based on only one source of evidence; little use of data and information triangulation; little emphasis on the data collection method; a high number of cases in which confusion between the case study as a research strategy and as a data collection method was detected; a low number of papers reporting the method of data analysis; few reports on a study's contributions; and a minority highlighting the issues requiring further research. In conclusion, the method used to apply case studies to management accounting must be improved, because few studies showed rigorous application of the procedures that this strategy requires.

  16. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    Science.gov (United States)

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated, and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone meat was significantly more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  17. Study designs for identifying risk compensation behavior among users of biomedical HIV prevention technologies: balancing methodological rigor and research ethics.

    Science.gov (United States)

    Underhill, Kristen

    2013-10-01

    The growing evidence base for biomedical HIV prevention interventions - such as oral pre-exposure prophylaxis, microbicides, male circumcision, treatment as prevention, and eventually prevention vaccines - has given rise to concerns about the ways in which users of these biomedical products may adjust their HIV risk behaviors based on the perception that they are prevented from infection. Known as risk compensation, this behavioral adjustment draws on the theory of "risk homeostasis," which has previously been applied to phenomena as diverse as Lyme disease vaccination, insurance mandates, and automobile safety. Little rigorous evidence exists to answer risk compensation concerns in the biomedical HIV prevention literature, in part because the field has not systematically evaluated the study designs available for testing these behaviors. The goals of this Commentary are to explain the origins of risk compensation behavior in risk homeostasis theory, to reframe risk compensation as a testable response to the perception of reduced risk, and to assess the methodological rigor and ethical justification of study designs aiming to isolate risk compensation responses. Although the most rigorous methodological designs for assessing risk compensation behavior may be unavailable due to ethical flaws, several strategies can help investigators identify potential risk compensation behavior during Phase II, Phase III, and Phase IV testing of new technologies. Where concerns arise regarding risk compensation behavior, empirical evidence about the incidence, types, and extent of these behavioral changes can illuminate opportunities to better support the users of new HIV prevention strategies. This Commentary concludes by suggesting a new way to conceptualize risk compensation behavior in the HIV prevention context. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

... weighted-least-squares regression. 3) Initialization of the estimation using linear algebra to provide a first guess. 4) Sequential and simultaneous GC parameter estimation using four different minimization algorithms. 5) Thorough uncertainty analysis: a) based on asymptotic approximation of the parameter covariance matrix; b) based on the bootstrap method; both providing 95%-confidence intervals of parameters and predicted properties. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict the lower flammability limit (LFL) for refrigerants ... their credibility and robustness in wider industrial and scientific applications.

  19. Systematic review of communication partner training in aphasia: methodological quality.

    Science.gov (United States)

    Cherney, Leora R; Simmons-Mackie, Nina; Raymer, Anastasia; Armstrong, Elizabeth; Holland, Audrey

    2013-10-01

    Twenty-three studies identified from a previous systematic review examining the effects of communication partner training on persons with aphasia and their communication partners were evaluated for methodological quality. Two reviewers rated the studies on defined methodological quality criteria relevant to each study design. There were 11 group studies, seven single-subject participant design studies, and five qualitative studies. Quality scores were derived for each study. The mean inter-rater reliability of scores for each study design ranged from 85-93%, with Cohen's Kappa indicating substantial agreement between raters. Methodological quality of research on communication partner training in aphasia was highly varied. Overall, group studies employed the least rigorous methodology as compared to single subject and qualitative research. Only two of 11 group studies complied with more than half of the quality criteria. No group studies reported therapist blinding and only one group study reported participant blinding. Across all types of studies, the criterion of treatment fidelity was most commonly omitted. Failure to explicitly report certain methodological quality criteria may account for low ratings. Using methodological rating scales specific to the type of study design may help improve the methodological quality of aphasia treatment studies, including those on communication partner training.
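The inter-rater agreement statistic this review reports, Cohen's kappa, corrects raw percentage agreement for the agreement expected by chance. A minimal sketch of the computation follows; the ratings are hypothetical, not the review's data:

```python
# Cohen's kappa: chance-corrected agreement between two raters over the
# same set of items. Ratings below are invented for illustration.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of each rater's marginal category rates
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

a = ["met", "met", "unmet", "met", "unmet", "met"]
b = ["met", "unmet", "unmet", "met", "unmet", "met"]
kappa = cohens_kappa(a, b)  # one disagreement out of six ratings
```

Values above roughly 0.6 are conventionally read as "substantial" agreement, which is the band the review cites alongside its 85-93% raw agreement.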

  20. Methodological quality and scientific impact of quantitative nursing education research over 18 months.

    Science.gov (United States)

    Yucha, Carolyn B; Schneider, Barbara St Pierre; Smyer, Tish; Kowalski, Susan; Stowers, Eva

    2011-01-01

The methodological quality of nursing education research has not been rigorously studied. The purpose of this study was to evaluate the methodological quality and scientific impact of nursing education research reports. The methodological quality of 133 quantitative nursing education research articles published between July 2006 and December 2007 was evaluated using the Medical Education Research Study Quality Instrument (MERSQI). The mean (± SD) MERSQI score was 9.8 ± 2.2, and scores correlated with citation counts. The similarity between scores for the nursing literature and those reported for the medical literature, coupled with the association with citation counts, suggests that the MERSQI is an appropriate instrument to evaluate the quality of nursing education research.

  1. Study of the quality characteristics in cold-smoked salmon (Salmo salar) originating from pre- or post-rigor raw material.

    Science.gov (United States)

    Birkeland, S; Akse, L

    2010-01-01

Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at the time of processing on quality traits (color, texture, sensory, and microbiological) in injection-salted, cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused significant differences relative to post-rigor processed fillets; post-rigor fillets (1477 ± 38 g) had higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets differed significantly from post-rigor fillets (37.8 ± 0.8) and had significantly lower values than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection-salting protocols and smoking techniques. © 2010 Institute of Food Technologists®

  2. Effects of Pre and Post-Rigor Marinade Injection on Some Quality Parameters of Longissimus Dorsi Muscles

    Science.gov (United States)

    Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem

    2018-01-01

This study was conducted to evaluate the effects of pre- and post-rigor marinade injection on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl + 0.5 M lactic acid, and 2% NaCl + 0.5 M sodium lactate. Marinade uptake, pH, free water, cooking loss, drip loss, and color properties were analyzed. Injection time had a significant effect on the marinade uptake of the samples: regardless of marinade formulation, marinade uptake of pre-rigor injected samples was higher than that of post-rigor samples. Injection of sodium lactate increased the pH of the samples, whereas lactic acid injection decreased pH. Marinade treatment and storage period had significant effects on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time had a different effect on free water content. Storage period and marinade application had significant effects on drip loss, which increased in all samples during storage. Throughout storage, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fade in both pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, with no differences among the other samples; at day 6, no significant differences were found in the CIE b* values of any samples. PMID:29805282

  3. Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?

    Science.gov (United States)

    Brondani, Mario; He, Sarah

    2013-01-01

    Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…

  4. A rigorous approach to investigating common assumptions about disease transmission: Process algebra as an emerging modelling methodology for epidemiology.

    Science.gov (United States)

    McCaig, Chris; Begon, Mike; Norman, Rachel; Shankland, Carron

    2011-03-01

    Changing scale, for example, the ability to move seamlessly from an individual-based model to a population-based model, is an important problem in many fields. In this paper, we introduce process algebra as a novel solution to this problem in the context of models of infectious disease spread. Process algebra allows us to describe a system in terms of the stochastic behaviour of individuals, and is a technique from computer science. We review the use of process algebra in biological systems, and the variety of quantitative and qualitative analysis techniques available. The analysis illustrated here solves the changing scale problem: from the individual behaviour we can rigorously derive equations to describe the mean behaviour of the system at the level of the population. The biological problem investigated is the transmission of infection, and how this relates to individual interactions.
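The individual-to-population scaling that this paper formalizes with process algebra can be illustrated numerically: simulate infection events at the level of individuals (here with a Gillespie-style stochastic simulation of a simple SIS model) and compare against the mean-field equation describing the population average. The model, rates, and population size are illustrative assumptions, not taken from the paper:

```python
# Individual-level stochastic SIS dynamics (Gillespie algorithm) versus
# the derived population-level mean-field ODE dI/dt = beta*S*I/N - gamma*I.
# Parameters are illustrative only.
import random

def gillespie_sis(N=200, I0=10, beta=0.3, gamma=0.1, t_end=50.0, seed=1):
    """One stochastic trajectory; returns infected count at t_end."""
    random.seed(seed)
    t, I = 0.0, I0
    while t < t_end and I > 0:
        rate_inf = beta * (N - I) * I / N   # individual S -> I events
        rate_rec = gamma * I                # individual I -> S events
        total = rate_inf + rate_rec
        t += random.expovariate(total)      # exponential waiting time
        if random.random() < rate_inf / total:
            I += 1
        else:
            I -= 1
    return I

def mean_field_sis(N=200, I0=10, beta=0.3, gamma=0.1, t_end=50.0, dt=0.01):
    """Forward-Euler integration of the population-level mean equation."""
    I = float(I0)
    for _ in range(int(t_end / dt)):
        I += dt * (beta * (N - I) * I / N - gamma * I)
    return I

ode_I = mean_field_sis()  # approaches the endemic level N*(1 - gamma/beta)
```

Averaging many stochastic trajectories recovers behaviour close to the ODE for large populations; the contribution of the process-algebra approach is to make that passage from individual rules to mean equations a rigorous derivation rather than a numerical observation.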

  5. Methodological quality of guidelines in gastroenterology.

    Science.gov (United States)

    Malheiro, Rui; de Monteiro-Soares, Matilde; Hassan, Cesare; Dinis-Ribeiro, Mário

    2014-06-01

Clinical guidelines are a common feature in modern endoscopy practice and they are being produced faster than ever. However, their methodological quality is rarely assessed. This study evaluated the methodological quality of current clinical guidelines in the field of gastroenterology, with an emphasis on endoscopy. Practice guidelines published by the American College of Gastroenterology (ACG), American Gastroenterological Association (AGA), American Society for Gastrointestinal Endoscopy (ASGE), European Society of Gastrointestinal Endoscopy (ESGE), British Society of Gastroenterology (BSG), National Institute for Health and Care Excellence (NICE), and the Scottish Intercollegiate Guidelines Network (SIGN) were searched between September and October 2012 and evaluated using the AGREE II (Appraisal of Guidelines for Research and Evaluation) instrument (23 items, scores 1 to 7 for each item; higher scores mean better quality). A total of 100 guidelines were assessed. The mean number of items scoring 6 or 7 per guideline was 9.2 (out of 23 items). Overall, 99% of guidelines failed to include the target population in the development process, and 96% did not report facilitators and barriers to guideline application. In addition, 86% did not include advice or tools, and 94% did not present monitoring or auditing criteria. The global methodological quality of clinical guidelines in the field of gastroenterology is poor, particularly regarding involvement of the target population in the development of guidelines and in the provision of clear suggestions to practitioners. © Georg Thieme Verlag KG Stuttgart · New York.

  6. Q methodology, risk training and quality management.

    Science.gov (United States)

    McKeown, M; Hinks, M; Stowell-Smith, M; Mercer, D; Forster, J

    1999-01-01

    The results of a Q methodological study of professional understandings of the notion of risk in mental health services within the UK are discussed in relation to the relevance for staff training and quality assurance. The study attempted to access the diversity of understandings of risk issues amongst a multi-professional group of staff (n = 60) attending inter-agency risk training workshops in 1998. Q methodology is presented as both an appropriate means for such inquiry and as a novel experiential technique for training purposes. A tentative argument is advanced that the qualitative accounts generated by Q research could assist in systematic reviews of quality, complementing the singularly quantitative approaches typically represented in the audit process.

  7. Wedding Rigorous Scientific Methodology and Ancient Herbal Wisdom to Benefit Cancer Patients: The Development of PHY906.

    Science.gov (United States)

    Chu, Edward

    2018-02-15

    Our research group has extensively characterized the preclinical and clinical activities of PHY906, a traditional Chinese herbal medicine, as a modulator of irinotecan-based chemotherapy for the treatment of colorectal cancer. This article reviews the critical issues of quality control and standardization of PHY906 and highlights the importance of high-quality material for the conduct of preclinical and clinical studies. Studies to investigate the potential biological mechanisms of action using a systems biology approach play a pivotal role in providing the preclinical rationale to move forward with clinical studies. For early-phase clinical studies, translational biomarkers should be incorporated to characterize the biological effects of the herbal medicine. These biomarkers include tumor mutational load, cytokine/chemokine expression, metabolomic profiling, and the presence of key herbal metabolites. Sophisticated bioinformatic approaches are critical for mining the data and identifying those biomarkers that can define the subset of patients who will benefit from PHY906 or any other herbal medicine, in terms of reduced treatment toxicity, improved quality of life, and/or enhanced clinical activity of treatment.

  8. Urbanism & urban qualities - new data and methodologies

    DEFF Research Database (Denmark)

    2009-01-01

    The interest in urban spaces and their qualities has become stronger in recent years. A substantial volume of projects aims to create attractive urban spaces for reasons of Sustainability, Quality of Life and urban vitality. But who actually uses the urban spaces, and which urban spaces are used? How do they use them? What characterizes the good urban space? And how, and by whom, is it evaluated? How is a better co-operation between urban space researchers, decision makers and users established? Is it the right urban spaces which receive investments? How can research optimize the basis for decisions? Proceedings from the conference "Urbanism & urban qualities - new data & methodologies" held 24th of June 2009 at The Royal Danish Academy of Fine Arts in Copenhagen.

  9. The development of a checklist to enhance methodological quality in intervention programs

    Directory of Open Access Journals (Sweden)

    Salvador Chacón-Moscoso

    2016-11-01

    The methodological quality of primary studies is an important issue when performing meta-analyses or systematic reviews. Nevertheless, there are no clear criteria for how methodological quality should be analyzed. Controversies emerge when considering the various theoretical and empirical definitions, especially in relation to three interrelated problems: the lack of representativeness, utility, and feasibility. In this article, we (a) systematize and summarize the available literature about methodological quality in primary studies; (b) propose a specific, parsimonious, 12-item checklist to empirically define the methodological quality of primary studies based on a content validity study; and (c) present an inter-coder reliability study for the resulting 12 items. This paper provides a precise and rigorous description of the development of this checklist, highlighting the clearly specified criteria for the inclusion of items and a substantial inter-coder agreement on the different items. Rather than simply proposing another checklist, however, it argues that the list constitutes an assessment tool with respect to the representativeness, utility, and feasibility of the most frequent methodological quality items in the literature, one that provides practitioners and researchers with clear criteria for choosing items that may be adequate to their needs. We propose individual methodological features as indicators of quality, arguing that these need to be taken into account when designing, implementing, or evaluating an intervention program. This enhances the methodological quality of intervention programs and fosters cumulative knowledge based on meta-analyses of these interventions. Future development of the checklist is discussed.

  10. Effect of pre-rigor stretch and various constant temperatures on the rate of post-mortem pH fall, rigor mortis and some quality traits of excised porcine biceps femoris muscle strips.

    Science.gov (United States)

    Vada-Kovács, M

    1996-01-01

    Porcine biceps femoris strips of 10 cm original length were stretched by 50% and fixed within 1 hr post mortem, then subjected to temperatures of 4, 15 or 36 °C until they attained their ultimate pH. Unrestrained control muscle strips, which were left to shorten freely, were similarly treated. Post-mortem metabolism (pH, R-value) and shortening were recorded; thereafter, ultimate meat quality traits (pH, lightness, extraction and swelling of myofibrils) were determined. The rate of pH fall at 36 °C, as well as ATP breakdown at 36 and 4 °C, were significantly reduced by pre-rigor stretch. The relationship between R-value and pH indicated cold shortening at 4 °C. Myofibrils isolated from pre-rigor stretched muscle strips kept at 36 °C showed the most severe reduction of hydration capacity, while paleness remained below extreme values. However, pre-rigor stretched myofibrils - when stored at 4 °C - proved to be superior to shortened ones in their extractability and swelling.

  11. A Methodology for Quality Problems Diagnosis in SMEs

    OpenAIRE

    Humberto N. Teixeira; Isabel S. Lopes; Sérgio D. Sousa

    2012-01-01

    This article proposes a new methodology to be used by SMEs (small and medium enterprises) to characterize their performance in quality, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and help to prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six different steps, which include...

  12. Skin Bleaching and Dermatologic Health of African and Afro-Caribbean Populations in the US: New Directions for Methodologically Rigorous, Multidisciplinary, and Culturally Sensitive Research.

    Science.gov (United States)

    Benn, Emma K T; Alexis, Andrew; Mohamed, Nihal; Wang, Yan-Hong; Khan, Ikhlas A; Liu, Bian

    2016-12-01

    Skin-bleaching practices, such as using skin creams and soaps to achieve a lighter skin tone, are common throughout the world and are triggered by cosmetic reasons that oftentimes have deep historical, economic, sociocultural, and psychosocial roots. Exposure to chemicals in the bleaching products, notably, mercury (Hg), hydroquinone, and steroids, has been associated with a variety of adverse health effects, such as Hg poisoning and exogenous ochronosis. In New York City (NYC), skin care product use has been identified as an important route of Hg exposure, especially among Caribbean-born blacks and Dominicans. However, surprisingly sparse information is available on the epidemiology of the health impacts of skin-bleaching practices among these populations. We highlight the dearth of large-scale, comprehensive, community-based, clinical, and translational research in this area, especially the limited skin-bleaching-related research among non-White populations in the US. We offer five new research directions, including investigating the known and under-studied health consequences among populations for which the skin bleach practice is newly emerging at an alarming rate using innovative laboratory and statistical methods. We call for conducting methodologically rigorous, multidisciplinary, and culturally sensitive research in order to provide insights into the root and the epidemiological status of the practice and provide evidence of exposure-outcome associations, with an ultimate goal of developing potential intervention strategies to reduce the health burdens of skin-bleaching practice.

  13. Air quality estimation by computational intelligence methodologies

    Directory of Open Access Journals (Sweden)

    Ćirić Ivan T.

    2012-01-01

    The subject of this study is to compare different computational intelligence methodologies based on artificial neural networks used for forecasting an air quality parameter - the emission of CO2 - in the city of Niš. Firstly, inputs of the CO2 emission estimator are analyzed and their measurement is explained. It is known that traffic is the single largest emitter of CO2 in Europe. Therefore, a proper treatment of this component of pollution is very important for precise estimation of emission levels. With this in mind, measurements of traffic frequency and CO2 concentration were carried out at critical intersections in the city, as well as monitoring of vehicle direction at the crossroad. Finally, based on experimental data, different soft computing estimators were developed, such as a feed-forward neural network, a recurrent neural network, and a hybrid neuro-fuzzy estimator of CO2 emission levels. Test data for some characteristic cases, presented at the end of the paper, show good agreement of the developed estimator outputs with experimental data. The presented results are a true indicator of the implemented method's usability. [Projects of the Ministry of Science of the Republic of Serbia: no. III42008-2/2011, Evaluation of Energy Performances and Indoor Environment Quality of Educational Buildings in Serbia with Impact to Health; and no. TR35016/2011, Research of MHD Flows around the Bodies, in the Tip Clearances and Channels and Application in the MHD Pumps Development]
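
As an illustration of the kind of soft-computing estimator the abstract describes, the sketch below trains a one-hidden-layer feed-forward network by plain gradient descent on synthetic traffic-frequency/CO2 data. The data, architecture, and hyperparameters are all made-up stand-ins, not the authors' model or measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the measured data: traffic frequency (scaled to
# [0, 1]) versus CO2 level, with a mildly nonlinear relation plus noise.
x = rng.uniform(0.0, 1.0, size=(200, 1))
y = 0.5 * x + 0.3 * x**2 + 0.01 * rng.normal(size=(200, 1))

# One hidden layer of 8 tanh units, linear output.
w1 = rng.normal(scale=0.5, size=(1, 8)); b1 = np.zeros(8)
w2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ w1 + b1)
    return h, h @ w2 + b2

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

lr = 0.1
_, pred = forward(x)
loss_before = mse(pred, y)
for _ in range(2000):
    h, pred = forward(x)
    g = 2 * (pred - y) / len(x)      # dL/dpred for the mean squared error
    gw2 = h.T @ g; gb2 = g.sum(axis=0)
    gh = g @ w2.T * (1 - h**2)       # backpropagate through tanh
    gw1 = x.T @ gh; gb1 = gh.sum(axis=0)
    w1 -= lr * gw1; b1 -= lr * gb1
    w2 -= lr * gw2; b2 -= lr * gb2
_, pred = forward(x)
loss_after = mse(pred, y)
```

After training, `loss_after` should drop close to the injected noise level, which is the "good agreement with experimental data" check the abstract mentions, in miniature.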

  14. Quality Assurance in Trichiasis Surgery: a methodology

    Science.gov (United States)

    Buchan, John C; Limburg, Hans; Burton, Matthew J

    2013-01-01

    Trachoma remains a significant cause of blindness in many parts of the world. The major route to blindness involves upper lid entropion leading to trachomatous trichiasis (TT) which promotes progressive corneal opacification. The provision of surgery to correct TT in the populations most severely affected is a major challenge for the global effort to eliminate Trachoma blindness by the year 2020. Most attention has been paid to increasing the quantity of TT surgery performed, and large numbers of non-doctor operators have been trained to this end. Surgical audit by those performing TT surgery is not a routine part of any national trachoma control programme, and no effective mechanism exists for identifying surgeons experiencing poor outcomes. We propose a methodology for surgical audit at the level of the individual surgeon based on Lot Quality Assurance. A set number of patients operated on previously for upper eyelid TT are examined to detect the recurrence of TT. The number of recurrent cases found will lead to categorisation of the TT surgeon to either “high recurrence” or “low recurrence” with reasonable confidence. The threshold of unacceptability can be set by individual programmes according to previous local studies of recurrence rates or those from similar settings. Identification of surgeons delivering unacceptably high levels of recurrent TT will guide managers on the need for remedial intervention such as re-training. PMID:20881027

  15. Quality assurance in trichiasis surgery: a methodology.

    Science.gov (United States)

    Buchan, John C; Limburg, Hans; Burton, Matthew J

    2011-03-01

    Trachoma remains a significant cause of blindness in many parts of the world. The major route to blindness involves upper lid entropion leading to trachomatous trichiasis (TT), which promotes progressive corneal opacification. The provision of surgery to correct TT in the populations most severely affected is a major challenge for the global effort to eliminate trachoma blindness by the year 2020. Most attention has focused on increasing the quantity of TT surgery performed, and large numbers of non-doctor operators have been trained to this end. Surgical audit by those performing TT surgery is not a routine part of any national trachoma control programme, and no effective mechanism exists for identifying surgeons experiencing poor outcomes. The authors propose a methodology for surgical audit at the level of the individual surgeon based on Lot Quality Assurance. A set number of patients operated on previously for upper eyelid TT are examined to detect the recurrence of TT. The number of recurrent cases found will lead to categorisation of the TT surgeon to either 'high recurrence' or 'low recurrence' with reasonable confidence. The threshold of unacceptability can be set by individual programmes according to previous local studies of recurrence rates or those from similar settings. Identification of surgeons delivering unacceptably high levels of recurrent TT will guide managers on the need for remedial intervention such as retraining.
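
The Lot Quality Assurance idea in this record reduces to binomial arithmetic: examine n previously operated patients and classify the surgeon as "low recurrence" only if at most d recurrences are found. A minimal sketch, with illustrative recurrence rates and error levels that are assumptions for the example, not values from the paper:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, j) * p**j * (1 - p) ** (n - j) for j in range(k + 1))

def lqas_pass_probability(n, d, p):
    """Probability that a surgeon with true recurrence rate p is classified
    'low recurrence', i.e. at most d recurrences among n re-examined
    patients."""
    return binom_cdf(d, n, p)

def choose_sample_plan(p_good, p_bad, alpha, beta_err, n_max=200):
    """Smallest (n, d) such that a 'good' surgeon (rate p_good) is wrongly
    failed with probability <= alpha, and a 'bad' surgeon (rate p_bad) is
    wrongly passed with probability <= beta_err."""
    for n in range(1, n_max + 1):
        for d in range(n + 1):
            fail_good = 1 - binom_cdf(d, n, p_good)
            pass_bad = binom_cdf(d, n, p_bad)
            if fail_good <= alpha and pass_bad <= beta_err:
                return n, d
    return None
```

For example, `choose_sample_plan(0.10, 0.30, 0.05, 0.05)` finds the smallest lot that separates a 10% recurrence surgeon from a 30% one with both error probabilities held at 5%; programmes would substitute their own locally derived thresholds.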

  16. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    Science.gov (United States)

    Lexchin, J; Holbrook, A

    1994-07-01

    To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). Analytic study. All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically significantly lower than the acceptable score of 80%, with quality lowest for references to nonjournal material (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion.

  17. The main aspects of methodology of quality management system

    Directory of Open Access Journals (Sweden)

    Smirnova E.K.

    2017-03-01

    This article describes the formation and development of quality management as an integrated system. The author considers the theory and methodology of quality management from the early 20th century to the present day and describes the main problems encountered in building a quality management system, as well as ways to overcome them.

  18. Methodology and Supporting Toolset Advancing Embedded Systems Quality

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Brewka, Lukasz Jerzy

    2013-01-01

    Software quality is of primary importance in the development of embedded systems that are often used in safety-critical applications. Moreover, as the life cycle of embedded products becomes increasingly tighter, productivity and quality are simultaneously required and closely interrelated towards...... delivering competitive products. In this context, the MODUS (Methodology and supporting toolset advancing embedded systems quality) project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. This paper...... will describe the MODUS project with focus on the technical methodologies that will be developed advancing embedded system quality....

  19. Dynamic benchmarking methodology for Quality Function Deployment

    NARCIS (Netherlands)

    Raharjo, H.; Brombacher, A.C.; Chai, K.H.; Bergman, B.

    2008-01-01

    A competitive advantage, generally, can be gained if a firm produces a product that not only addresses what the customer values most, but also performs better than its competitors in terms of quality, cost, and timeliness. However, these two factors, namely, the customer needs and competitors'

  20. Methodologies for defining quality of life

    Energy Technology Data Exchange (ETDEWEB)

    Glicken, J. [Ecological Planning and Toxicology, Inc., Albuquerque, NM (United States); Engi, D. [Sandia National Labs., Albuquerque, NM (United States)

    1996-10-10

    Quality of life as a concept has been used in many ways in the public policy arena. It can be used in summative evaluations to assess the impacts of policies or programs. Alternatively, it can be applied to formative evaluations to provide input to the formation of new policies. In short, it provides the context for the understanding needed to evaluate the results of choices that have been made in the public policy arena, or the potential of choices yet to be made. In either case, the public policy question revolves around the positive or negative impact the choice will have on quality of life, and the magnitude of that impact. This discussion will develop a conceptual framework that proposes that an assessment of quality of life is based on a comparison of expectations with experience. The framework defines four basic components from which these expectations arise: natural conditions, social conditions, the body, and the mind. Each one of these components is generally described, and associated with a general policy or rhetorical category which gives it its policy vocabulary--environmental quality, economic well-being, human health, and self-fulfillment.

  1. Meat quality and rigor mortis development in broiler chickens with gas-induced anoxia and postmortem electrical stimulation.

    Science.gov (United States)

    Sams, A R; Dzuik, C S

    1999-10-01

    This study was conducted to evaluate the combined rigor-accelerating effects of postmortem electrical stimulation (ES) and argon-induced anoxia (Ar) of broiler chickens. One hundred broilers were processed in the following treatments: untreated controls, ES, Ar, or Ar with ES (Ar + ES). Breast fillets were harvested at 1 h postmortem for all treatments or at 1 and 6 h postmortem for the control carcasses. Fillets were sampled for pH and ratio of inosine to adenosine (R-value) and were then individually quick frozen (IQF) or aged on ice (AOI) until 24 h postmortem. Color was measured in the AOI fillets at 24 h postmortem. All fillets were then cooked and evaluated for Allo-Kramer shear value. The Ar treatment accelerated the normal pH decline, whereas the ES and Ar + ES treatments yielded even lower pH values at 1 h postmortem. The Ar + ES treatment had a greater R-value than the ES treatment, which was greater than either the Ar or 1-h controls, which, in turn, were not different from each other. The ES treatment had the lowest L* value, and ES, Ar, and Ar + ES produced significantly higher a* values than the 1-h controls. For the IQF fillets, the ES and Ar + ES treatments were not different in shear value but were lower than Ar, which was lower than the 1-h controls. The same was true for the AOI fillets except that the ES and the Ar treatments were not different. These results indicated that although ES and Ar had rigor-accelerating and tenderizing effects, ES seemed to be more effective than Ar; there was little enhancement when Ar was added to the ES treatment and fillets were deboned at 1 h postmortem.

  2. Methodological quality of systematic reviews on influenza vaccination.

    Science.gov (United States)

    Remschmidt, Cornelius; Wichmann, Ole; Harder, Thomas

    2014-03-26

    There is a growing body of evidence on the risks and benefits of influenza vaccination in various target groups. Systematic reviews are of particular importance for policy decisions. However, their methodological quality can vary considerably. To investigate the methodological quality of systematic reviews on influenza vaccination (efficacy, effectiveness, safety) and to identify influencing factors. A systematic literature search on systematic reviews on influenza vaccination was performed using MEDLINE, EMBASE and three additional databases (1990-2013). Review characteristics were extracted and the methodological quality of the reviews was evaluated using the assessment of multiple systematic reviews (AMSTAR) tool. U-test, Kruskal-Wallis test, chi-square test, and multivariable linear regression analysis were used to assess the influence of review characteristics on AMSTAR score. Forty-six systematic reviews fulfilled the inclusion criteria. Average methodological quality was high (median AMSTAR score: 8), but variability was large (AMSTAR range: 0-11). Quality did not differ significantly according to vaccination target group. Cochrane reviews had higher methodological quality than non-Cochrane reviews (p=0.001). Detailed analysis showed that this was due to better study selection and data extraction, inclusion of unpublished studies, and better reporting of study characteristics (all p<0.05). In the adjusted analysis, no other factor, including industry sponsorship or journal impact factor, had an influence on AMSTAR score. Systematic reviews on influenza vaccination showed large differences regarding their methodological quality. Reviews conducted by the Cochrane collaboration were of higher quality than others. When using systematic reviews to guide the development of vaccination recommendations, the methodological quality of a review in addition to its content should be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
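
A minimal sketch of the kind of nonparametric group comparison reported here: the Mann-Whitney U statistic comparing AMSTAR scores between two groups of reviews. The scores below are made-up illustrations on the 0-11 AMSTAR scale, not the study's data.

```python
def mann_whitney_u(xs, ys):
    """Mann-Whitney U statistic for sample xs versus ys: counts the pairs
    in which an x value exceeds a y value, ties counting one half."""
    u = 0.0
    for x in xs:
        for y in ys:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Illustrative (made-up) AMSTAR scores for two groups of reviews.
cochrane = [9, 10, 8, 11, 9, 10]
other = [6, 8, 5, 7, 9, 4]
u = mann_whitney_u(cochrane, other)
n1, n2 = len(cochrane), len(other)
# Rank-biserial effect size: 0 means no separation, 1 complete separation.
effect = 2 * u / (n1 * n2) - 1
```

In practice one would also compute a p-value (for example via the normal approximation with tie correction, or an exact permutation test); the statistic itself is just this pair count.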

  3. A Methodology for Anatomic Ultrasound Image Diagnostic Quality Assessment

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Lange, Theis; Brandt, Andreas Hjelm

    2017-01-01

    This paper discusses methods for assessment of ultrasound image quality based on our experiences with evaluating new methods for anatomic imaging. It presents a methodology to ensure a fair assessment between competing imaging methods, using clinically relevant evaluations designed to properly reveal the clinical value. Earlier uses of the methodology have shown that it ensures validity of the assessment, as it separates the influences between developer, investigator, and assessor once a research protocol has been established. This separation reduces confounding influences on the result from the developer. The paper exemplifies the methodology using recent studies of Synthetic Aperture Sequential Beamforming tissue harmonic imaging.

  4. Methodological quality of systematic reviews addressing femoroacetabular impingement.

    Science.gov (United States)

    Kowalczuk, Marcin; Adamich, John; Simunovic, Nicole; Farrokhyar, Forough; Ayeni, Olufemi R

    2015-09-01

    As the body of literature on femoroacetabular impingement (FAI) continues to grow, clinicians turn to systematic reviews to remain current with the best available evidence. The quality of systematic reviews in the FAI literature is currently unknown. The goal of this study was to assess the quality of the reporting of systematic reviews addressing FAI over the last 11 years (2003-2014) and to identify the specific methodological shortcomings and strengths. A search of the electronic databases, MEDLINE, EMBASE and PubMed, was performed to identify relevant systematic reviews. Methodological quality was assessed by two reviewers using the revised assessment of multiple systematic reviews (R-AMSTAR) scoring tool. An intraclass correlation coefficient (ICC) with 95 % confidence intervals (CI) was used to determine agreement between reviewers on R-AMSTAR quality scores. A total of 22 systematic reviews were assessed for methodological quality. The mean consensus R-AMSTAR score across all studies was 26.7 out of 40.0, indicating fair methodological quality. An ICC of 0.931, 95 % CI 0.843-0.971 indicated excellent agreement between reviewers during the scoring process. The systematic reviews addressing FAI are generally of fair methodological quality. Use of tools such as the R-AMSTAR score or PRISMA guidelines while designing future systematic reviews can assist in eliminating methodological shortcomings identified in this review. These shortcomings need to be kept in mind by clinicians when applying the current literature to their patient populations and making treatment decisions. Systematic reviews of highest methodological quality should be used by clinicians when possible to answer clinical questions.

  5. Measuring the Quality of Publications : New Methodology and Case Study

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; van Groenendaal, W.J.H.

    2000-01-01

    In practice, it is important to evaluate the quality of research, in order to make decisions on tenure, funding, and so on. This article develops a methodology using citations to measure the quality of journals, proceedings, and book publishers. (Citations are also used by the Science and Social

  6. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    Science.gov (United States)

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  7. Methodology of quality improvement projects for the Texas Medicare population.

    Science.gov (United States)

    Pendergrass, P W; Abel, R L; Bing, M; Vaughn, R; McCauley, C

    1998-07-01

    The Texas Medical Foundation, the quality improvement organization for the state of Texas, develops local quality improvement projects for the Medicare population. These projects are developed as part of the Health Care Quality Improvement Program undertaken by the Health Care Financing Administration. The goal of a local quality improvement project is to collaborate with providers to identify and reduce the incidence of unintentional variations in the delivery of care that negatively impact outcomes. Two factors are critical to the success of a quality improvement project. First, as opposed to peer review that is based on implicit criteria, quality improvement must be based on explicit criteria. These criteria represent key steps in the delivery of care that have been shown to improve outcomes for a specific disease. Second, quality improvement must be performed in partnership with the health care community. As such, the health care community must play an integral role in the design and evaluation of a quality improvement project and in the design and implementation of the resulting quality improvement plan. Specifically, this article provides a historical perspective for the transition from peer review to quality improvement. It discusses key steps used in developing and implementing local quality improvement projects including topic selection, quality indicator development, collaborator recruitment, and measurement of performance/improvement. Two Texas Medical Foundation projects are described to highlight the current methodology and to illustrate the impact of quality improvement projects.

  8. Environmental risk assessment of water quality in harbor areas: a new methodology applied to European ports.

    Science.gov (United States)

    Gómez, Aina G; Ondiviela, Bárbara; Puente, Araceli; Juanes, José A

    2015-05-15

    This work presents a standard and unified procedure for assessment of environmental risks at the contaminant source level in port aquatic systems. Using this method, port managers and local authorities will be able to hierarchically classify environmental hazards and proceed with the most suitable management actions. This procedure combines rigorously selected parameters and indicators to estimate the environmental risk of each contaminant source based on its probability, consequences and vulnerability. The spatio-temporal variability of multiple stressors (agents) and receptors (endpoints) is taken into account to provide accurate estimations for application of precisely defined measures. The developed methodology is tested on a wide range of different scenarios via application in six European ports. The validation process confirms its usefulness, versatility and adaptability as a management tool for port water quality in Europe and worldwide. Copyright © 2015 Elsevier Ltd. All rights reserved.
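
As a hedged sketch of the risk-estimation idea in this record (each contaminant source scored by probability, consequences, and vulnerability, then classified hierarchically): the multiplicative aggregation, the source names, and all numeric values below are illustrative assumptions, not the procedure's actual parameters and indicators.

```python
from dataclasses import dataclass

@dataclass
class ContaminantSource:
    name: str
    probability: float    # likelihood of a release event, scaled to 0-1
    consequence: float    # severity of effects on water quality, 0-1
    vulnerability: float  # sensitivity of the receiving area, 0-1

    @property
    def risk(self) -> float:
        # Simple multiplicative aggregation for illustration only; the
        # published method combines its own selected indicators.
        return self.probability * self.consequence * self.vulnerability

def rank_sources(sources):
    """Hierarchically classify contaminant sources, highest risk first."""
    return sorted(sources, key=lambda s: s.risk, reverse=True)

# Hypothetical port sources with made-up scores.
ports = [
    ContaminantSource("fuel bunkering", 0.4, 0.9, 0.7),
    ContaminantSource("stormwater outfall", 0.8, 0.3, 0.5),
    ContaminantSource("hull cleaning", 0.2, 0.5, 0.6),
]
ranking = rank_sources(ports)
```

The resulting ordering is the kind of hierarchy a port manager would use to direct the most suitable management actions at the highest-risk sources first.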

  9. Methodological Quality of Consensus Guidelines in Implant Dentistry.

    Science.gov (United States)

    Faggion, Clovis Mariano; Apaza, Karol; Ariza-Fritas, Tania; Málaga, Lilian; Giannakopoulos, Nikolaos Nikitas; Alarcón, Marco Antonio

    2017-01-01

    Consensus guidelines are useful to improve clinical decision making; therefore, the methodological evaluation of these guidelines is of paramount importance, as low-quality information may lead to inadequate or harmful clinical decisions. The aim was to evaluate the methodological quality of consensus guidelines published in implant dentistry using a validated methodological instrument. The six implant dentistry journals with impact factors were scrutinised for consensus guidelines related to implant dentistry. Two assessors independently selected consensus guidelines, and four assessors independently evaluated their methodological quality using the Appraisal of Guidelines for Research & Evaluation (AGREE) II instrument. Disagreements in the selection and evaluation of guidelines were resolved by consensus. First, the consensus guidelines were analysed alone; then, systematic reviews conducted to support the guidelines were included in the analysis. Non-parametric statistics for dependent variables (Wilcoxon signed-rank test) was used to compare both groups. Of 258 initially retrieved articles, 27 consensus guidelines were selected. Median scores in four domains (applicability, rigour of development, stakeholder involvement, and editorial independence), expressed as percentages of maximum possible domain scores, were below 50% (median, 26%, 30.70%, 41.70%, and 41.70%, respectively). The consensus guidelines and consensus guidelines + systematic reviews data sets could be compared for 19 guidelines, and the results showed significant improvements in all domain scores. Improvement in the methodological quality of consensus guidelines published in implant dentistry journals is needed. The findings of the present study may help researchers to better develop consensus guidelines in implant dentistry, which will improve the quality and trustworthiness of the information needed to make proper clinical decisions.

  10. Development of Management Methodology for Engineering Production Quality

    Science.gov (United States)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

The authors of the paper propose four directions for developing a quality management methodology for engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organisational context taking stakeholders into account, the use of risk management, management of in-house knowledge, and assessment of enterprise activity according to effectiveness criteria.

  11. Using Quality Tools and Methodologies to Improve a Hospital's Quality Position.

    Science.gov (United States)

    Branco, Daniel; Wicks, Angela M; Visich, John K

    2017-01-01

    The authors identify the quality tools and methodologies most frequently used by quality-positioned hospitals versus nonquality hospitals. Northeastern U.S. hospitals in both groups received a brief, 12-question survey. The authors found that 93.75% of the quality hospitals and 81.25% of the nonquality hospitals used some form of process improvement methodologies. However, there were significant differences between the groups regarding the impact of quality improvement initiatives on patients. The findings indicate that in quality hospitals the use of quality improvement initiatives had a significantly greater positive impact on patient satisfaction and patient outcomes when compared to nonquality hospitals.

  12. Organ Donation European Quality System: ODEQUS project methodology.

    Science.gov (United States)

    Manyalich, M; Guasch, X; Gomez, M P; Páez, G; Teixeira, L

    2013-01-01

Differences in the number of organ donors among hospitals cannot be explained only by the number of intensive care unit beds used or neurologic patients treated. The figures obtained are influenced by the organizational structure of the donation process and how efficient it is. The Organ Donation European Quality System (ODEQUS) is a 3-year project (from October 2010 to September 2013) co-financed by the European Agency for Health and Consumers (EAHC20091108) which aims to define a methodology to evaluate organ procurement performance at the hospital level. ODEQUS's specific objectives are to identify quality criteria and to develop quality indicators in three types of organ donation (after brain death, after cardiac death, and living donation). Those tools will be useful for hospitals' self-assessment as well as for developing an international auditing model. A consortium has been established involving 14 associated partners from Austria, Croatia, France, Germany, Italy, Poland, Portugal, Romania, Spain, Sweden, and the United Kingdom, as well as five collaborating partners from Greece, Hungary, Malta, Slovenia, and Turkey. The project has been established in three steps: 1) Design of a survey about the use of quality tools in a wide sample of European hospitals; 2) Development of quality criteria and quality indicators by the project experts, the main fields considered being organizational structures, clinical procedures, and outcomes; and 3) Elaboration of an evaluation system to test the quality indicators in 11 European hospitals. Two types of training have been designed and performed: one concerns the development of quality criteria and quality indicators, whereas the other is focused on how to use evaluation tools. Following this methodology, the project has so far identified 131 quality criteria and developed 31 quality indicators. Currently, the quality indicators are being tested in 11 selected hospitals. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. ASSESSMENT OF QUALITY OF LIFE: PRESENT AND FUTURE METHODOLOGICAL CHALLENGES

    Directory of Open Access Journals (Sweden)

    Isabel Benítez

    2016-01-01

    Full Text Available The growing importance of quality of life in diverse domains, such as health, school performance and social participation, has led to the development of new conceptualisations and assessments of the construct. This diversity of perspectives brings about many benefits, but it also creates an obstacle for the formulation of a single unifying definition of the construct and, therefore, an agreed instrument or assessment framework. The aim of this study is to discuss the current methodological challenges in the measurement of quality of life. Firstly, we provide a brief description of the construct as defined in various areas, then we examine the new methodological developments and different applications. We also present an overview of the different possibilities for future developments in defining and measuring quality of life in national and international studies.

  14. A Quality-Driven Methodology for Information Systems Integration

    Directory of Open Access Journals (Sweden)

    Iyad Zikra

    2017-10-01

    Full Text Available Information systems integration is an essential instrument for organizations to attain advantage in today’s growing and fast changing business and technology landscapes. Integration solutions generate added value by combining the functionality and services of heterogeneous and diverse systems. Existing integration environments tend to rely heavily on technical, platform-dependent skills. Consequently, the solutions that they enable are not optimally aligned with the envisioned business goals of the organization. Furthermore, the gap between the goals and the solutions complicates the task of evaluating the quality of integration solutions. To address these challenges, we propose a quality-driven, model-driven methodology for designing and developing integration solutions. The methodology spans organizational and systems design details, providing a holistic view of the integration solution and its underlying business goals. A multi-view meta-model provides the basis for the integration design. Quality factors that affect various aspects of the integration solution guide and inform the progress of the methodology. An example business case is presented to demonstrate the application of the methodology.

  15. A methodology model for quality management in a general hospital.

    Science.gov (United States)

    Stern, Z; Naveh, E

    1997-01-01

    A reappraisal is made of the relevance of industrial modes of quality management to the issues of medical care. Analysis of the nature of medical care, which differentiates it from the supplier-client relationships of industry, presents the main intrinsic characteristics, which create problems in application of the industrial quality management approaches to medical care. Several examples are the complexity of the relationship between the medical action and the result obtained, the client's nonacceptance of economic profitability as a value in his medical care, and customer satisfaction biased by variable standards of knowledge. The real problems unique to hospitals are addressed, and a methodology model for their quality management is offered. Included is a sample of indicator vectors, measurements of quality care, cost of medical care, quality of service, and human resources. These are based on the trilogy of planning quality, quality control, and improving quality. The conclusions confirm the inadequacy of industrial quality management approaches for medical institutions and recommend investment in formulation of appropriate concepts.

  16. QUALITY IMPROVEMENT IN MULTIRESPONSE EXPERIMENTS THROUGH ROBUST DESIGN METHODOLOGY

    Directory of Open Access Journals (Sweden)

    M. Shilpa

    2012-06-01

    Full Text Available Robust design methodology aims at reducing the variability in the product performance in the presence of noise factors. Experiments involving simultaneous optimization of more than one quality characteristic are known as multiresponse experiments which are used in the development and improvement of industrial processes and products. In this paper, robust design methodology is applied to optimize the process parameters during a particular operation of rotary driving shaft manufacturing process. The three important quality characteristics of the shaft considered here are of type Nominal-the-best, Smaller-the-better and Fraction defective. Simultaneous optimization of these responses is carried out by identifying the control parameters and conducting the experimentation using L9 orthogonal array.

  17. Methodology for Multileaf Collimator Quality Assurance in clinical conditions

    International Nuclear Information System (INIS)

    Diaz M, R. M.; Rodriguez Z, M.; Juarez D, A.; Romero R, R.

    2013-01-01

Multileaf Collimators (MLCs) have become an important technological advance as part of clinical linear accelerators (linacs) for radiotherapy. Treatment planning and delivery were substantially modified by these devices. However, it was necessary to develop Quality Assurance (QA) methodologies related to their performance. The most common methods for QA of MLCs are applied in basic conditions that hardly cover all possible difficulties in clinical practice. Diaz et al. developed a methodology based upon two-dimensional arrays of volumetric detectors that can be extended to more demanding situations. In this work, the Auril methodology of Diaz et al. was implemented for irradiation with the linac gantry in the horizontal position. A mathematical procedure was developed to ease the dosimetric centering of the device with the Auril centering tool. System calibration was made as in the typical Auril methodology. Patterns with leaf misplacements in known positions were irradiated. The method allowed the detection of leaf misplacements with a minimum number of false positives. We conclude that the Auril methodology can be applied in clinical conditions. (Author)

  18. The ABT methodology employment for VET of quality auditors

    Directory of Open Access Journals (Sweden)

    Liviu Moldovan

    2011-12-01

Full Text Available This paper presents some achievements of the project entitled "Disseminating Open and Innovative Tools and Services for Vocational Education and Training in Quality Assurance" (acronym Do-IT) financed by the European Commission. The recent developments and results obtained during pilot testing of new pedagogical models and services in the Do-IT project, targeting engineering education in Romania, are presented. These include the Activity Based Training (ABT) methodology for a quality management system audit course according to ISO 19011 and ISO 9001, and the evaluation of theoretical achievements with the Student Response System (SRS).

  19. Classroom Talk for Rigorous Reading Comprehension Instruction

    Science.gov (United States)

    Wolf, Mikyung Kim; Crosson, Amy C.; Resnick, Lauren B.

    2004-01-01

    This study examined the quality of classroom talk and its relation to academic rigor in reading-comprehension lessons. Additionally, the study aimed to characterize effective questions to support rigorous reading comprehension lessons. The data for this study included 21 reading-comprehension lessons in several elementary and middle schools from…

  20. Methodology for stereoscopic motion-picture quality assessment

    Science.gov (United States)

    Voronov, Alexander; Vatolin, Dmitriy; Sumin, Denis; Napadovsky, Vyacheslav; Borisov, Alexey

    2013-03-01

    Creating and processing stereoscopic video imposes additional quality requirements related to view synchronization. In this work we propose a set of algorithms for detecting typical stereoscopic-video problems, which appear owing to imprecise setup of capture equipment or incorrect postprocessing. We developed a methodology for analyzing the quality of S3D motion pictures and for revealing their most problematic scenes. We then processed 10 modern stereo films, including Avatar, Resident Evil: Afterlife and Hugo, and analyzed changes in S3D-film quality over the years. This work presents real examples of common artifacts (color and sharpness mismatch, vertical disparity and excessive horizontal disparity) in the motion pictures we processed, as well as possible solutions for each problem. Our results enable improved quality assessment during the filming and postproduction stages.

  1. Measurement of Quality of Life I. A Methodological Framework

    Directory of Open Access Journals (Sweden)

    Soren Ventegodt

    2003-01-01

Full Text Available Despite the widespread acceptance of quality of life (QOL) as the ideal guideline in healthcare and clinical research, serious conceptual and methodological problems continue to plague this area. In an attempt to remedy this situation, we propose seven criteria that a quality-of-life concept must meet to provide a sound basis for investigation by questionnaire. The seven criteria or desiderata are: (1) an explicit definition of quality of life; (2) a coherent philosophy of human life from which the definition is derived; (3) a theory that operationalizes the philosophy by specifying unambiguous, nonoverlapping, and jointly exhaustive questionnaire items; (4) response alternatives that permit a fraction-scale interpretation; (5) technical checks of reproducibility; (6) meaningfulness to investigators, respondents, and users; and (7) an overall aesthetic appeal of the questionnaire. These criteria have guided the design of a validated 5-item generic, global quality-of-life questionnaire (QOL5), and a validated 317-item generic, global quality-of-life questionnaire (SEQOL), administered to a well-documented birth cohort of 7,400 Danes born in 1959-1961, as well as to a reference sample of 2,500 Danes. Presented in outline, the underlying integrative quality-of-life (IQOL) theory is a meta-theory. To illustrate the seven criteria at work, we show the extent to which they are satisfied by one of the eight component theories. Next, two sample results of our investigation are presented: satisfaction with one's sex life has the expected covariation with one's quality of life, and so does mother's smoking during pregnancy, albeit to a much smaller extent. It is concluded that the methodological framework presented has proved helpful in designing a questionnaire that is capable of yielding acceptably valid and reliable measurements of global and generic quality of life.

  2. Methodological Quality Assessment of Meta-analyses in Endodontics.

    Science.gov (United States)

    Kattan, Sereen; Lee, Su-Min; Kohli, Meetu R; Setzer, Frank C; Karabucak, Bekir

    2018-01-01

    The objectives of this review were to assess the methodological quality of published meta-analyses related to endodontics using the assessment of multiple systematic reviews (AMSTAR) tool and to provide a follow-up to previously published reviews. Three electronic databases were searched for eligible studies according to the inclusion and exclusion criteria: Embase via Ovid, The Cochrane Library, and Scopus. The electronic search was amended by a hand search of 6 dental journals (International Endodontic Journal; Journal of Endodontics; Australian Endodontic Journal; Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology; Endodontics and Dental Traumatology; and Journal of Dental Research). The searches were conducted to include articles published after July 2009, and the deadline for inclusion of the meta-analyses was November 30, 2016. The AMSTAR assessment tool was used to evaluate the methodological quality of all included studies. A total of 36 reports of meta-analyses were included. The overall quality of the meta-analyses reports was found to be medium, with an estimated mean overall AMSTAR score of 7.25 (95% confidence interval, 6.59-7.90). The most poorly assessed areas were providing an a priori design, the assessment of the status of publication, and publication bias. In recent publications in the field of endodontics, the overall quality of the reported meta-analyses is medium according to AMSTAR. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
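As a sketch of the summary statistic reported above (a mean AMSTAR score with a 95% confidence interval), the following stdlib-only helper computes a mean and a normal-approximation interval. The function name, the example scores, and the use of a z-based rather than t-based interval are assumptions for illustration:

```python
import statistics
from statistics import NormalDist

def mean_ci(scores, confidence=0.95):
    """Sample mean with a normal-approximation confidence interval.
    (A t-based interval would be slightly wider for small samples.)"""
    n = len(scores)
    m = statistics.mean(scores)
    se = statistics.stdev(scores) / n ** 0.5  # standard error of the mean
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    return m, m - z * se, m + z * se

# Hypothetical AMSTAR scores (0-11 scale), not the review's data.
m, lo, hi = mean_ci([6, 7, 8, 7, 9, 6, 8, 7])
```

The returned triple gives the point estimate and the lower and upper bounds, which is the form in which the review reports its overall score.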

  3. Methodology of quality control for brachytherapy {sup 125}I seeds

    Energy Technology Data Exchange (ETDEWEB)

    Moura, Eduardo S.; Zeituni, Carlos A.; Manzoli, Jose E.; Rostelato, Maria Elisa C.M. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]. E-mail: esmoura@ipen.br

    2007-07-01

This paper presents the methodology of quality control of {sup 125}I seeds used for brachytherapy. The {sup 125}I seeds are millimetric titanium capsules widely used in permanent implants for prostate cancer, allowing a high dose within the tumour and a low dose in the surrounding tissues, with very little harm to the other tissues. Besides, with this procedure, the patients have a low impotence rate and a small incidence of urinary incontinence. To meet medical standards, efficient quality control is necessary, yielding values with the minimum possible uncertainty for the seeds' dimensions and their respective activities. Medical needles are used to insert the seeds into the prostate. The needles used in brachytherapy have an internal diameter of 1.0 mm, so the {sup 125}I seeds must have a maximum external diameter of 0.85 mm. For positioning the seeds and the spacer on the planning sheet, the seeds must have a length between 4.5 and 5.0 mm. The activities must not vary by more than 5% in each batch of {sup 125}I seeds. For this methodology, we used two ionization chamber detectors and one caliper. In this paper, the methodology using one control batch with 75 seeds manufactured by GE Healthcare Ltd is presented. (author)
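The acceptance criteria quoted in the abstract (external diameter at most 0.85 mm, length between 4.5 and 5.0 mm, activity varying by no more than 5% within the batch) can be expressed as a simple batch check. The field names are illustrative, and interpreting the 5% activity tolerance as a deviation from the batch mean is an assumption:

```python
def check_seed_batch(seeds, max_diameter=0.85, length_range=(4.5, 5.0),
                     activity_tol=0.05):
    """Flag seeds violating the acceptance criteria from the abstract:
    external diameter <= 0.85 mm, length between 4.5 and 5.0 mm, and
    activity within +/-5% of the batch mean (the reference point for the
    5% tolerance is an assumption). Returns (index, criterion) pairs."""
    mean_activity = sum(s["activity"] for s in seeds) / len(seeds)
    failures = []
    for i, s in enumerate(seeds):
        if s["diameter"] > max_diameter:
            failures.append((i, "diameter"))
        if not length_range[0] <= s["length"] <= length_range[1]:
            failures.append((i, "length"))
        if abs(s["activity"] - mean_activity) / mean_activity > activity_tol:
            failures.append((i, "activity"))
    return failures
```

An empty result means every seed in the batch satisfies all three criteria; otherwise each pair names the offending seed and criterion.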

  4. ASTM Committee C28: International Standards for Properties and Performance of Advanced Ceramics-Three Decades of High-Quality, Technically-Rigorous Normalization

    Science.gov (United States)

    Jenkins, Michael G.; Salem, Jonathan A.

    2016-01-01

Physical and mechanical properties and performance of advanced ceramics and glasses are difficult to measure correctly without the proper techniques. For over three decades, ASTM Committee C28 on Advanced Ceramics has developed high-quality, technically-rigorous, full-consensus standards (e.g., test methods, practices, guides, terminology) to measure properties and performance of monolithic and composite ceramics that may be applied to glasses in some cases. These standards contain testing particulars for many mechanical, physical, and thermal properties and performance characteristics of these materials. As a result, these standards are used to generate accurate, reliable, repeatable, and complete data. Within Committee C28, users, producers, researchers, designers, academicians, etc. have written, continually updated, and validated through round-robin test programs 50 standards since the Committee's founding in 1986. This paper provides a detailed retrospective of the 30 years of ASTM Committee C28, including a graphical pictogram listing of C28 standards along with examples of the tangible benefits of standards for advanced ceramics to demonstrate their practical applications.

  5. ASTM Committee C28: International Standards for Properties and Performance of Advanced Ceramics, Three Decades of High-quality, Technically-rigorous Normalization

    Science.gov (United States)

    Jenkins, Michael G.; Salem, Jonathan A.

    2016-01-01

Physical and mechanical properties and performance of advanced ceramics and glasses are difficult to measure correctly without the proper techniques. For over three decades, ASTM Committee C28 on Advanced Ceramics has developed high-quality, rigorous, full-consensus standards (e.g., test methods, practices, guides, terminology) to measure properties and performance of monolithic and composite ceramics that may be applied to glasses in some cases. These standards contain testing particulars for many mechanical, physical, and thermal properties and performance characteristics of these materials. As a result, these standards provide accurate, reliable, repeatable, and complete data. Within Committee C28, users, producers, researchers, designers, academicians, etc. have written, continually updated, and validated through round-robin test programs nearly 50 standards since the Committee's founding in 1986. This paper provides a retrospective review of the 30 years of ASTM Committee C28, including a graphical pictogram listing of C28 standards along with examples of the tangible benefits of advanced ceramics standards to demonstrate their practical applications.

  6. Residency Training: Quality improvement projects in neurology residency and fellowship: applying DMAIC methodology.

    Science.gov (United States)

    Kassardjian, Charles D; Williamson, Michelle L; van Buskirk, Dorothy J; Ernste, Floranne C; Hunderfund, Andrea N Leep

    2015-07-14

    Teaching quality improvement (QI) is a priority for residency and fellowship training programs. However, many medical trainees have had little exposure to QI methods. The purpose of this study is to review a rigorous and simple QI methodology (define, measure, analyze, improve, and control [DMAIC]) and demonstrate its use in a fellow-driven QI project aimed at reducing the number of delayed and canceled muscle biopsies at our institution. DMAIC was utilized. The project aim was to reduce the number of delayed muscle biopsies to 10% or less within 24 months. Baseline data were collected for 12 months. These data were analyzed to identify root causes for muscle biopsy delays and cancellations. Interventions were developed to address the most common root causes. Performance was then remeasured for 9 months. Baseline data were collected on 97 of 120 muscle biopsies during 2013. Twenty biopsies (20.6%) were delayed. The most common causes were scheduling too many tests on the same day and lack of fasting. Interventions aimed at patient education and biopsy scheduling were implemented. The effect was to reduce the number of delayed biopsies to 6.6% (6/91) over the next 9 months. Familiarity with QI methodologies such as DMAIC is helpful to ensure valid results and conclusions. Utilizing DMAIC, we were able to implement simple changes and significantly reduce the number of delayed muscle biopsies at our institution. © 2015 American Academy of Neurology.
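The before/after rates reported above (20/97 biopsies delayed at baseline vs. 6/91 after the interventions) can be checked for significance with a pooled two-proportion z-test; the abstract does not name the test the authors used, so this is an illustrative sketch:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Delayed-biopsy counts from the abstract: 20/97 at baseline, 6/91 after.
z, p = two_proportion_z(20, 97, 6, 91)
```

With these counts the drop from 20.6% to 6.6% comes out statistically significant at conventional levels, consistent with the improvement the project reports.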

  7. Non-Communicable Disease Clinical Practice Guidelines in Brazil: A Systematic Assessment of Methodological Quality and Transparency.

    Directory of Open Access Journals (Sweden)

    Caroline de Godoi Rezende Costa Molino

Full Text Available Annually, non-communicable diseases (NCDs) kill 38 million people worldwide, with low- and middle-income countries accounting for three-quarters of these deaths. High-quality clinical practice guidelines (CPGs) are fundamental to improving NCD management. The present study evaluated the methodological rigor and transparency of Brazilian CPGs that recommend pharmacological treatment for the most prevalent NCDs. We conducted a systematic search for CPGs of the following NCDs: asthma, atrial fibrillation, benign prostatic hyperplasia, chronic obstructive pulmonary disease, congestive heart failure, coronary artery disease and/or stable angina, dementia, depression, diabetes, gastroesophageal reflux disease, hypercholesterolemia, hypertension, osteoarthritis, and osteoporosis. CPGs comprising pharmacological treatment recommendations were included. No language or year restrictions were applied. CPGs were excluded if they were merely for local use or referred to NCDs not listed above. CPG quality was independently assessed by two reviewers using the Appraisal of Guidelines Research and Evaluation instrument, version II (AGREE II). "Scope and purpose" and "clarity and presentation" domains received the highest scores. Sixteen of 26 CPGs were classified as low quality, and none were classified as high overall quality. No CPG was recommended without modification (77% were not recommended at all). After 2009, two domain scores ("rigor of development" and "clarity and presentation") increased (61% and 73%, respectively). However, "rigor of development" was still rated below 30%. Brazilian healthcare professionals should be concerned about CPG quality for the treatment of selected NCDs. Features that undermined AGREE II scores included the lack of a multidisciplinary development team, no consideration of patients' preferences, insufficient information regarding literature searches, lack of selection criteria, unclear methods for formulating recommendations, and unreported authors' conflicts of interest.

  8. Lessons learned from a rigorous peer-review process for building the Climate Literacy and Energy Awareness (CLEAN) collection of high-quality digital teaching materials

    Science.gov (United States)

    Gold, A. U.; Ledley, T. S.; McCaffrey, M. S.; Buhr, S. M.; Manduca, C. A.; Niepold, F.; Fox, S.; Howell, C. D.; Lynds, S. E.

    2010-12-01

The topic of climate change permeates all aspects of our society: the news, household debates, scientific conferences, etc. To provide students with accurate information about climate science and energy awareness, educators require scientifically and pedagogically robust teaching materials. To address this need, the NSF-funded Climate Literacy & Energy Awareness Network (CLEAN) Pathway has assembled a new peer-reviewed digital collection as part of the National Science Digital Library (NSDL) featuring teaching materials centered on climate and energy science for grades 6 through 16. The scope and framework of the collection are defined by the Essential Principles of Climate Science (CCSP 2009) and a set of energy awareness principles developed in the project. The collection provides trustworthy teaching materials on these socially relevant topics and prepares students to become responsible decision-makers. While a peer-review process is desirable for curriculum developers as well as collection builders to ensure quality, its implementation is non-trivial. We have designed a rigorous and transparent peer-review process for the CLEAN collection, and our experiences provide general guidelines that can be used to judge the quality of digital teaching materials across disciplines. Our multi-stage review process ensures that only resources with teaching goals relevant to developing climate literacy and energy awareness are considered. Each relevant resource is reviewed by two individuals to assess i) scientific accuracy, ii) pedagogic effectiveness, and iii) usability/technical quality. A science review by an expert ensures the scientific quality and accuracy. Resources that pass all review steps are forwarded to a review panel of educators and scientists who make a final decision regarding inclusion of the materials in the CLEAN collection. Results from the first panel review show that about 20% (~100) of the resources that were initially considered for inclusion…

  9. Scientific rigor through videogames.

    Science.gov (United States)

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Quality control of CT units - methodology of performance I

    International Nuclear Information System (INIS)

    Prlic, I.; Radalj, Z.

    1996-01-01

Increasing use of x-ray computed tomography systems (CT scanners) in diagnostics requires an efficient means of evaluating their performance. This paper therefore presents a Quality Control (Q/C) procedure for measuring and defining CT scanner performance with a special phantom based on the recommendations of the American Association of Physicists in Medicine (AAPM). The performance parameters measurable with the phantom represent the scanner's capability, so periodic evaluation of the parameters enables users to verify the stability of the CT scanner regardless of the manufacturer, model, or software options. Five important performance parameters are to be measured: noise, contrast scale, nominal tomographic section thickness, and high- and low-contrast resolution (MTF). The sixth parameter is, of course, the dose per scan and slice, which gives the patient dose for a given diagnostic procedure. The last but not least parameter is the final image quality, delivered through the image-processing device connected to the scanner; this is the final medical information needed for good medical practice according to Quality Assurance (Q/A) procedures in diagnostic radiology. The results of the performance evaluation must be assured without environmental influences (the measurements are to be made under certain conditions according to Q/A). This paper gives no detailed methodology recipe but shows, with one example, the system noise and linearity measurements and the need for and relevant results of these measurements. The rest of the methodology is to be published. (author)

  11. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  12. Promoting Continuous Quality Improvement in the Alabama Child Health Improvement Alliance Through Q-Sort Methodology and Learning Collaboratives.

    Science.gov (United States)

    Fifolt, Matthew; Preskitt, Julie; Rucks, Andrew; Corvey, Kathryn; Benton, Elizabeth Cason

    Q-sort methodology is an underutilized tool for differentiating among multiple priority measures. The authors describe steps to identify, delimit, and sort potential health measures and use selected priority measures to establish an overall agenda for continuous quality improvement (CQI) activities within learning collaboratives. Through an iterative process, the authors vetted a list of potential child and adolescent health measures. Multiple stakeholders, including payers, direct care providers, and organizational representatives sorted and prioritized measures, using Q-methodology. Q-methodology provided the Alabama Child Health Improvement Alliance (ACHIA) an objective and rigorous approach to system improvement. Selected priority measures were used to design learning collaboratives. An open dialogue among stakeholders about state health priorities spurred greater organizational buy-in for ACHIA and increased its credibility as a statewide provider of learning collaboratives. The integrated processes of Q-sort methodology, learning collaboratives, and CQI offer a practical yet innovative way to identify and prioritize state measures for child and adolescent health and establish a learning agenda for targeted quality improvement activities.

  13. Complicating Methodological Transparency

    Science.gov (United States)

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  14. Can Quality Improvement improve the Quality of Care? A systematic review of effects and methodological rigor of the Plan-Do-Study-Act (PDSA) method

    DEFF Research Database (Denmark)

    Knudsen, Søren Valgreen; Laursen, Henrik Vitus Bering; Bartels, Paul Daniel

    2018-01-01

    healthcare systems, where PDSA cycles are becoming central in national QI strategies. Before the health systems start to roll out these broad strategies, it is important to document whether the PDSA method provides an effect in terms of better clinical practices and outcomes. The scientific literature indicates...... that the PDSA method has not been used properly. Improper use of the method is a challenge for the internal and external validity of the method and makes it difficult to establish a relation between the use of PDSA and the effects on QI projects. However, in recent years there has been an increased focus...... against the key features: use of iterative cycles, prediction-based tests of change, testing from small to large scale and use of data over time. The assessment was performed by two independent reviewers. Results: 106 of 176 individual studies identified met the inclusion criteria. 3/5 of these documented...

  15. Research on quality assurance classification methodology for domestic AP1000 nuclear power projects

    International Nuclear Information System (INIS)

    Bai Jinhua; Jiang Huijie; Li Jingyan

    2012-01-01

    To meet the quality assurance classification requirements of domestic nuclear safety codes and standards, this paper analyzes the quality assurance classification methodology of domestic AP1000 nuclear power projects at present, and proposes the quality assurance classification methodology for subsequent AP1000 nuclear power projects. (authors)

  16. Applying Statistical Process Quality Control Methodology to Educational Settings.

    Science.gov (United States)

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (Range), X (individual observations), MR (moving…
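The individuals (X) and moving-range (MR) charts mentioned above can be sketched minimally with hypothetical measurements; the constants d2 = 1.128 and D4 = 3.267 are the standard SPC factors for moving ranges of two consecutive points:

```python
# Minimal sketch of individuals (X) and moving-range (MR) control-chart limits.
# Data are hypothetical; constants follow standard SPC tables for n = 2:
# d2 = 1.128 (sigma estimate), D4 = 3.267 (MR upper limit factor).
def xmr_limits(data):
    n = len(data)
    mr = [abs(data[i] - data[i - 1]) for i in range(1, n)]  # moving ranges
    x_bar = sum(data) / n                                   # centre line
    mr_bar = sum(mr) / len(mr)                              # average moving range
    sigma = mr_bar / 1.128                                  # short-term sigma estimate
    return {
        "X": (x_bar - 3 * sigma, x_bar, x_bar + 3 * sigma),  # (LCL, CL, UCL)
        "MR": (0.0, mr_bar, 3.267 * mr_bar),                 # (LCL, CL, UCL)
    }

limits = xmr_limits([10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7])
print(limits["X"])
```

Points falling outside the (LCL, UCL) band would signal that the process under study has shifted, which is the core inference control charting supports.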

  17. Economic evaluations of occupational health interventions from a corporate perspective - A systematic review of methodological quality

    NARCIS (Netherlands)

    Uegaki, K.; Bruijne, M.C. de; Lambeek, L.; Anema, J.R.; Beek, A.J. van der; Mechelen, W. van; Tulder, M.W. van

    2010-01-01

    Objective: Using a standardized quality criteria list, we appraised the methodological quality of economic evaluations of occupational safety and health (OSH) interventions conducted from a corporate perspective. Methods: The primary literature search was conducted in Medline and Embase.

  18. Clinical practice guidelines and consensus statements in oncology--an assessment of their methodological quality.

    Directory of Open Access Journals (Sweden)

    Carmel Jacobs

    consistently lower than the others over both domains. No journals adhered to all the items related to the transparency of document development. One journal's consensus statements endorsed a product made by the sponsoring pharmaceutical company in 64% of cases. Guidance documents are an essential part of oncology care and should be subjected to a rigorous and validated development process. Consensus statements had lower methodological quality than clinical practice guidelines using AGREE II. At a minimum, journals should ensure that all consensus statements and clinical practice guidelines adhere to AGREE II criteria. Journals should consider explicitly requiring guidelines to declare pharmaceutical company sponsorship and to identify the sponsor's product to enhance transparency.

  19. An objective methodology for the evaluation of the air quality stations positioning

    International Nuclear Information System (INIS)

    Benassi, A.; Marson, G.; Baraldo, E.; Dalan, F.; Lorenzet, K.; Bellasio, R.; Bianconi, R.

    2006-01-01

    This work describes a methodology for the evaluation of the correct positioning of the monitoring stations of an air quality network. The methodology is based on the Italian legislation, the European Directives and on some technical documents used as guidelines at European level. The paper describes all the assumptions on which the methodology is based and the results of its application to the air quality network of the Region Veneto (Italy) [it]

  20. [Assessment of the methodological quality of theses submitted to the Faculty of Medicine Fez].

    Science.gov (United States)

    Boly, A; Tachfouti, N; Zohoungbogbo, I S S; Achhab, Y El; Nejjari, C

    2014-06-09

    A thesis in medicine is a scientific work which allows a medical student to acquire a Doctor of Medicine degree. It is therefore recommended that theses presented by students fulfill essential methodological criteria in order to obtain scientifically credible results and recommendations. The aim of this study was to assess the methodology of theses presented to the Faculty of Medicine in Fez in 2008. We developed an evaluation table containing questions on the different sections of the IMRAD structure on which these theses were based, and we estimated the proportion of theses that conformed to each criterion. There were 160 theses on various specialties presented in 2008. The majority of the theses (79.3%) were case series. Research questions were clearly expressed in 62.0% but the primary objectives were pertinent in only 52.0%. Our study shows that there were important deficiencies in the methodological rigor of the theses and very little representation of the theses in publications.

  1. OpenKnowledge Deliverable 3.3.: A methodology for ontology matching quality evaluation

    OpenAIRE

    Yatskevich, Mikalai; Giunchiglia, Fausto; McNeill, Fiona; Shvaiko, Pavel

    2007-01-01

    This document presents an evaluation methodology for the assessment of quality results produced by ontology matchers. In particular, it discusses: (i) several standard quality measures used in the ontology matching evaluation, (ii) a methodology of how to build semiautomatically an incomplete reference alignment allowing for the assessment of quality results produced by ontology matchers and (iii) a preliminary empirical evaluation of the OpenKnowledge ontology matching component.

  2. Methodology for Evaluating Quality and Reusability of Learning Objects

    Science.gov (United States)

    Kurilovas, Eugenijus; Bireniene, Virginija; Serikoviene, Silvija

    2011-01-01

    The aim of the paper is to present the scientific model and several methods for the expert evaluation of quality of learning objects (LOs) paying especial attention to LOs reusability level. The activities of eQNet Quality Network for a European Learning Resource Exchange (LRE) aimed to improve reusability of LOs of European Schoolnet's LRE…

  3. Dictionary quality and dictionary design: a methodology for ...

    African Journals Online (AJOL)

    Although recent dictionaries for the ESL market have been praised for their innovative design features, the prime concern of users, lexicographers and metalexicographers is the functional quality of the dictionary products provided for the market. The functional quality of dictionaries and the scientific assessment thereof ...

  4. A food quality management research methodology integrating technological and managerial theories

    NARCIS (Netherlands)

    Luning, P.A.; Marcelis, W.J.

    2009-01-01

    In this article it is argued how the complexity of food quality management combined with the high requirements on food quality requires a specific research methodology. It is concluded that food quality management research has to deal with two quite different paradigms, the one from technological

  5. [Rigor mortis -- a definite sign of death?].

    Science.gov (United States)

    Heller, A R; Müller, M P; Frank, M D; Dressler, J

    2005-04-01

    In recent years there has been an ongoing controversial debate in Germany regarding the quality of the coroner's inquest and the declaration of death by physicians. We report the case of a 90-year-old female, who was found after an unknown time following a suicide attempt with benzodiazepine. The examination of the patient showed livores (mortis?) on the left forearm and left lower leg. Moreover, rigor (mortis?) of the left arm was apparent, which prevented arm flexion and extension. The hypothermic patient with insufficient respiration was intubated and mechanically ventilated. Chest compressions were not performed, because central pulses were (hardly) palpable and sinus bradycardia of 45/min (AV-block 2 degrees and isolated premature ventricular complexes) was present. After placement of an intravenous line (17 G, external jugular vein) the hemodynamic situation was stabilized with intermittent boli of epinephrine and with sodium bicarbonate. With improved circulation, livores and rigor disappeared. In the present case a minimal central circulation was noted, which could be stabilized, despite the presence of certain signs of death (livores and rigor mortis). Considering the finding of an abrogated peripheral perfusion (livores), we postulate a centripetal collapse of glycogen and ATP supply in the patient's left arm (rigor), which was restored after resuscitation and reperfusion. Thus, it appears that livores and rigor are not sensitive enough to exclude a vita minima, in particular in hypothermic patients with intoxications. Consequently a careful ABC-check should be performed even in the presence of apparently certain signs of death, to avoid underdiagnosing a vita minima. Additional ECG monitoring is required to reduce the rate of false positive declarations of death. To what extent paramedics should commence basic life support when rigor and livores are present, pending a physician's DNR order, deserves further discussion.

  6. The Statistical point of view of Quality: the Lean Six Sigma methodology.

    Science.gov (United States)

    Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto

    2015-04-01

    Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures. Therefore, the use of this methodology in the health-care arena has focused mainly on areas of business operations, throughput, and case management, and has focused on efficiency outcomes. After a review of the methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method in the reduction of complications during and after lobectomies. Using Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a statistically grounded measurement of surgical quality.

  7. Improving the Quality of Experience Journals: Training Educational Psychology Students in Basic Qualitative Methodology

    Science.gov (United States)

    Reynolds-Keefer, Laura

    2010-01-01

    This study evaluates the impact of teaching basic qualitative methodology to preservice teachers enrolled in an educational psychology course in the quality of observation journals. Preservice teachers enrolled in an educational psychology course requiring 45 hr of field experience were given qualitative methodological training as a part of the…

  8. Methodological and reporting quality in laboratory studies of human eating behavior

    NARCIS (Netherlands)

    Robinson, E.; Bevelander, K.E.; Field, M.; Jones, A.

    2018-01-01

    The methodological quality and reporting practices of laboratory studies of human eating behavior determine the validity and replicability of nutrition science. The aim of this research was to examine basic methodology and reporting practices in recent representative laboratory studies of human

  9. Automatic ECG quality scoring methodology: mimicking human annotators

    International Nuclear Information System (INIS)

    Johannesen, Lars; Galeotti, Loriano

    2012-01-01

    An algorithm to determine the quality of electrocardiograms (ECGs) can enable inexperienced nurses and paramedics to record ECGs of sufficient diagnostic quality. Previously, we proposed an algorithm for determining if ECG recordings are of acceptable quality, which was entered in the PhysioNet Challenge 2011. In the present work, we propose an improved two-step algorithm, which first rejects ECGs with macroscopic errors (signal absent, large voltage shifts or saturation) and subsequently quantifies the noise (baseline, powerline or muscular noise) on a continuous scale. The performance of the improved algorithm was evaluated using the PhysioNet Challenge database (1500 ECGs rated by humans for signal quality). We achieved a classification accuracy of 92.3% on the training set and 90.0% on the test set. The improved algorithm is capable of detecting ECGs with macroscopic errors and giving the user a score of the overall quality. This allows the user to assess the degree of noise and decide if it is acceptable depending on the purpose of the recording. (paper)
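The two-step structure described in the abstract (reject ECGs with macroscopic errors first, then quantify noise on a continuous scale) can be sketched as follows. The thresholds and the noise proxy are hypothetical illustrations, not the published algorithm:

```python
# Illustrative two-step signal-quality check in the spirit of the abstract.
# Thresholds and the noise measure are invented for this sketch.
def ecg_quality(samples, saturation_level=32767, flat_tol=1e-6):
    # Step 1: reject macroscopic errors.
    if max(samples) - min(samples) < flat_tol:
        return ("reject", "signal absent")          # flat line
    if max(abs(s) for s in samples) >= saturation_level:
        return ("reject", "saturation")             # amplifier clipping
    # Step 2: quantify residual noise on a continuous scale, here using the
    # mean absolute first difference as a crude high-frequency noise proxy.
    diffs = [abs(samples[i] - samples[i - 1]) for i in range(1, len(samples))]
    noise_score = sum(diffs) / len(diffs)
    return ("accept", noise_score)
```

The continuous score in step 2 is what lets the user decide whether a given noise level is acceptable for the purpose of the recording, rather than forcing a binary verdict.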

  10. Economic evaluation studies in reproductive medicine: a systematic review of methodologic quality

    NARCIS (Netherlands)

    Moolenaar, Lobke M.; Vijgen, Sylvia M. C.; Hompes, Peter; van der Veen, Fulco; Mol, Ben Willem J.; Opmeer, Brent C.

    2013-01-01

    To evaluate the methodologic quality of economic analyses published in the field of reproductive medicine. Systematic review. Centers for reproductive care. Infertility patients. We performed a Medline search to identify economic evaluation studies in reproductive medicine. We included studies that

  11. The Development of a Service-Learning Program for First-Year Students Based on the Hallmarks of High Quality Service-Learning and Rigorous Program Evaluation

    Science.gov (United States)

    Smith, Bradley H.; Gahagan, James; McQuillin, Samuel; Haywood, Benjamin; Cole, Caroline Pender; Bolton, Clay; Wampler, Mary Katherine

    2011-01-01

    We describe six hallmarks of high quality service-learning and explain how these considerations guided the development of a Transitional Coaching Program (TCP) during the first three years of implementation. We have demonstrated that the TCP is acceptable, feasible, and sustainable. Improvements have been seen in the degree of impact on learning…

  12. A methodology of healthcare quality measurement: a case study

    International Nuclear Information System (INIS)

    Pecoraro, F; Luzi, D; Cesarelli, M; Clemente, F

    2015-01-01

    In this paper we present a comprehensive model for quality assessment taking into account the structure, process and outcome dimensions introduced in the Donabedian framework. To test our hypothesis, a case study based on the Italian healthcare services is reported, focusing on the analysis of hospital bed management and on the phenomenon of both active and passive patient mobility

  13. Measurement of quality of life I. A methodological framework

    DEFF Research Database (Denmark)

    Ventegodt, Søren; Hilden, Jørgen; Merrick, Joav

    2003-01-01

    meet to provide a sound basis for investigation by questionnaire. The seven criteria or desiderata are: (1) an explicit definition of quality of life; (2) a coherent philosophy of human life from which the definition is derived; (3) a theory that operationalizes the philosophy by specifying unambiguous...

  14. Terminology and methodology in modelling for water quality management

    DEFF Research Database (Denmark)

    Carstensen, J.; Vanrolleghem, P.; Rauch, W.

    1997-01-01

    There is a widespread need for a common terminology in modelling for water quality management. This paper points out sources of confusion in the communication between researchers due to misuse of existing terminology or use of unclear terminology. The paper attempts to clarify the context...... of the most widely used terms for characterising models and within the process of model building. It is essential to the ever growing society of researchers within water quality management, that communication is eased by establishing a common terminology. This should not be done by giving broader definitions...... of the terms, but by stressing the use of a stringent terminology. Therefore, the goal of the paper is to advocate the use of such a well defined and clear terminology. (C) 1997 IAWQ. Published by Elsevier Science Ltd....

  15. KAIZEN METHODOLOGY IN QUALITY MANAGEMENT TO REDUCE WASTES

    Directory of Open Access Journals (Sweden)

    Boca Gratiela Dana

    2011-01-01

    Full Text Available Kaizen cannot deal with all problems and one size does not fit all, but a flexible implementation opens up the dimension of collaboration between the workforce, the management and the technical departments. In every company it operates in a different way to suit the circumstances, but all consider it indispensable. It is more flexible and tolerant than may be expected; it is a tool for integration of technological strategy with the business strategy of the organization. Technology is forcing organizations to become more competitive as at every instance innovations take place. Recent innovations in the form of total quality management, reengineering of work processes, and flexible manufacturing systems have only one thing in common - well serving the customer by improved operational efficiency. For instance, Quality Management advocates emphasize the importance of achieving higher quality and flexibility at a lower level of cost and waste.

  16. The economic valuation of environmental quality: A methodological study

    International Nuclear Information System (INIS)

    Sung Yucsheng.

    1991-01-01

    Conducted in the context of sportfishing, this study uses Michigan data to estimate fishing demand and resulting consumer surplus accruing from environmental-policy implementation. For the modeling of fish-species and site decisions, a nested multinomial logit model is employed. On a pre-determined choice occasion during which a trip of a specific duration will be taken, an angler is assumed first to make a fish-species decision, then choose a site. The seasonal-participation decision is modeled by a competing-risks stochastic renewal process, incorporating time-varying parameters to account for changes in site quality through time. Since the number of trips would most likely change after a potential site-quality improvement, the proposed seasonal compensating variation (CV) calculation takes into account (1) the CV associated with the trips that would have been taken before the quality improvement, and (2) the CV associated with the new trips. The approach proposed in this study is applied to two real world policy scenarios: The termination of the Ludington Pumped Storage plant operation, and the removal of PCB contamination in the Kalamazoo River. The compensating variation is derived for both applications
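The seasonal compensating-variation (CV) decomposition described above, the CV on trips that would have been taken anyway plus the CV on newly induced trips, can be illustrated with a toy calculation. All trip counts and per-trip values below are invented for illustration:

```python
# Toy illustration of the two-part seasonal CV proposed in the abstract:
# total CV = CV on pre-existing trips + CV on trips induced by the
# quality improvement. All numbers are hypothetical.
def seasonal_cv(trips_before, trips_after, cv_per_existing_trip, cv_per_new_trip):
    existing = trips_before * cv_per_existing_trip        # gain on old trips
    induced = (trips_after - trips_before) * cv_per_new_trip  # value of new trips
    return existing + induced

# e.g. 10 trips before, 13 after a site-quality improvement
print(seasonal_cv(10, 13, 12.0, 7.5))
```

The point of the decomposition is that ignoring the induced trips (the second term) would understate the welfare gain from a site-quality improvement.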

  17. Putrefactive rigor: apparent rigor mortis due to gas distension.

    Science.gov (United States)

    Gill, James R; Landi, Kristen

    2011-09-01

    Artifacts due to decomposition may cause confusion for the initial death investigator, leading to an incorrect suspicion of foul play. Putrefaction is a microorganism-driven process that results in foul odor, skin discoloration, purge, and bloating. Various decompositional gases including methane, hydrogen sulfide, carbon dioxide, and hydrogen will cause the body to bloat. We describe 3 instances of putrefactive gas distension (bloating) that produced the appearance of inappropriate rigor, so-called putrefactive rigor. These gases may distend the body to an extent that the extremities extend and lose contact with their underlying support surface. The medicolegal investigator must recognize that this is not true rigor mortis and the body was not necessarily moved after death for this gravity-defying position to occur.

  18. [Methodological quality and reporting quality evaluation of randomized controlled trials published in China Journal of Chinese Materia Medica].

    Science.gov (United States)

    Yu, Dan-Dan; Xie, Yan-Ming; Liao, Xing; Zhi, Ying-Jie; Jiang, Jun-Jie; Chen, Wei

    2018-02-01

    To evaluate the methodological quality and reporting quality of randomized controlled trials (RCTs) published in China Journal of Chinese Materia Medica, we searched CNKI and the China Journal of Chinese Materia Medica webpage to collect RCTs since the establishment of the magazine. The Cochrane risk of bias assessment tool was used to evaluate the methodological quality of RCTs. The CONSORT 2010 list was adopted as the reporting quality evaluation tool. Finally, 184 RCTs were included and evaluated methodologically, of which 97 RCTs were evaluated for reporting quality. For the methodological evaluation, 62 trials (33.70%) reported the random sequence generation; 9 trials (4.89%) reported the allocation concealment; 25 trials (13.59%) adopted the method of blinding; 30 trials (16.30%) reported the number of patients withdrawing, dropping out and those lost to follow-up; 2 trials (1.09%) reported trial registration and none of the trials reported the trial protocol; only 8 trials (4.35%) reported the sample size estimation in detail. For the reporting quality appraisal, 3 of the 25 reporting items were evaluated as high quality, including: abstract, participant eligibility criteria, and statistical methods; 4 reporting items were of medium quality, including purpose, intervention, random sequence method, and data collection sites and locations; 9 items were of low quality, including title, background, random sequence type, allocation concealment, blinding, recruitment of subjects, baseline data, harms, and funding; the rest of the items were of extremely low quality (compliance rate of the reporting item < 10%). On the whole, the methodological and reporting quality of RCTs published in the magazine are generally low. Further improvement in both methodological and reporting quality for RCTs of traditional Chinese medicine is warranted. It is recommended that international standards and procedures for RCT design be strictly followed to conduct high-quality trials.

  19. Impact Evaluation of Quality Assurance in Higher Education: Methodology and Causal Designs

    Science.gov (United States)

    Leiber, Theodor; Stensaker, Bjørn; Harvey, Lee

    2015-01-01

    In this paper, the theoretical perspectives and general methodological elements of impact evaluation of quality assurance in higher education institutions are discussed, which should be a cornerstone of quality development in higher education and contribute to improving the knowledge about the effectiveness (or ineffectiveness) of quality…

  20. Quality Assurance and Its Impact from Higher Education Institutions' Perspectives: Methodological Approaches, Experiences and Expectations

    Science.gov (United States)

    Bejan, Stelian Andrei; Janatuinen, Tero; Jurvelin, Jouni; Klöpping, Susanne; Malinen, Heikki; Minke, Bernhard; Vacareanu, Radu

    2015-01-01

    This paper reports on methodological approaches, experiences and expectations referring to impact analysis of quality assurance from the perspective of three higher education institutions (students, teaching staff, quality managers) from Germany, Finland and Romania. The presentations of the three sample institutions focus on discussing the core…

  1. Methodology of clinical measures of healthcare quality delivered to patients with cardiovascular diseases

    Directory of Open Access Journals (Sweden)

    Posnenkova O.M.

    2014-03-01

    Full Text Available The results are presented of implementing the methodology proposed by the American College of Cardiology and American Heart Association (ACC/AHA) for the development of Russian clinical quality measures for patients with arterial hypertension, coronary heart disease and chronic heart failure. The created quality measures cover the key elements of medical care that directly influence the clinical outcomes of treatment.

  2. Methodological Validation of Quality of Life Questionnaire for Coal Mining Groups-Indian Scenario

    Science.gov (United States)

    Sen, Sayanti; Sen, Goutam; Tewary, B. K.

    2012-01-01

    Maslow's hierarchy-of-needs theory has been used to predict the development of Quality of Life (QOL) in countries over time. In this paper an attempt has been made to derive a methodological validation of the quality of life questionnaire which was prepared for the study area. The objective of the study is to standardize a questionnaire tool to…

  3. Dosimetric quality control in radiotherapy using TLD methodology

    International Nuclear Information System (INIS)

    Saravi, M.C.; Kessler, C.; Alvarez, P.E.; Feld, D.B.

    2002-01-01

    In the frame of the IAEA Co-ordinated Research Project 'Development of a Quality Assurance Program for Radiation Therapy Dosimetry in Developing Countries' a Dosimetric Quality Control Group was set up in Argentina in 1996, to develop a program in order to improve radiotherapy in the country. Nowadays, this Group, briefly called External Audit Group (EAG), is composed of the national Secondary Standard Dosimetry Laboratory (SSDL), which has the responsibility for dose determinations, traceability to the international dosimetry chain and TLD measurements, and two Medical Physicists from CNEA who are working at the Oncology Hospital 'Marie Curie' in Buenos Aires. The present paper reports the activities performed by the EAG with external high energy photon beams in reference conditions and the results of two pilot studies on cobalt 60 beams in non-reference conditions. The first step of the program was to update the existing database about the radiotherapy centres operating in the country. A form was sent to each of them in order to obtain basic information about their staff, number and type of treatment machines, brachytherapy sources, measuring devices, beam calibration, treatment planning system, simulator and other relevant data. 90 radiotherapy centres were registered in the EAG database. Forms were completed by 75/90 centres. There are nowadays 69 cobalt 60 units and 42 LINACs operating in the country (18/42 LINACs producing high energy X ray and electron beams). EAG deals with measurements performed with mailed TLD irradiated at radiotherapy centres. Internal quality control on our TLD system is made during each audit by means of reference capsules irradiated by IAEA; external controls consist in blind tests performed by IAEA once a year. The correction factor, K_en, determined at our SSDL for high energy X-rays was checked with the collaboration of IAEA and the Prague National Radiation Protection Institute (PNRPI) by means of a blind test. Results for 4 MV, 6 MV

  4. Mathematical Rigor in Introductory Physics

    Science.gov (United States)

    Vandyke, Michael; Bassichis, William

    2011-10-01

    Calculus-based introductory physics courses intended for future engineers and physicists are often designed and taught in the same fashion as those intended for students of other disciplines. A more mathematically rigorous curriculum should be more appropriate and, ultimately, more beneficial for the student in his or her future coursework. This work investigates the effects of mathematical rigor on student understanding of introductory mechanics. Using a series of diagnostic tools in conjunction with individual student course performance, a statistical analysis will be performed to examine student learning of introductory mechanics and its relation to student understanding of the underlying calculus.

  5. The relationship between return on investment and quality of study methodology in workplace health promotion programs.

    Science.gov (United States)

    Baxter, Siyan; Sanderson, Kristy; Venn, Alison J; Blizzard, C Leigh; Palmer, Andrew J

    2014-01-01

    To determine the relationship between return on investment (ROI) and quality of study methodology in workplace health promotion programs. Data were obtained through a systematic literature search of National Health Service Economic Evaluation Database (NHS EED), Database of Abstracts of Reviews of Effects (DARE), Health Technology Database (HTA), Cost Effectiveness Analysis (CEA) Registry, EconLit, PubMed, Embase, Wiley, and Scopus. Included were articles written in English or German reporting cost(s) and benefit(s) and single or multicomponent health promotion programs on working adults. Return-to-work and workplace injury prevention studies were excluded. Methodological quality was graded using British Medical Journal Economic Evaluation Working Party checklist. Economic outcomes were presented as ROI. ROI was calculated as ROI = (benefits - costs of program)/costs of program. Results were weighted by study size and combined using meta-analysis techniques. Sensitivity analysis was performed using two additional methodological quality checklists. The influences of quality score and important study characteristics on ROI were explored. Fifty-one studies (61 intervention arms) published between 1984 and 2012 included 261,901 participants and 122,242 controls from nine industry types across 12 countries. Methodological quality scores were highly correlated between checklists (r = .84-.93). Methodological quality improved over time. Overall weighted ROI [mean ± standard deviation (confidence interval)] was 1.38 ± 1.97 (1.38-1.39), which indicated a 138% return on investment. When accounting for methodological quality, an inverse relationship to ROI was found. High-quality studies (n = 18) had a smaller mean ROI, 0.26 ± 1.74 (.23-.30), compared to moderate (n = 16) 0.90 ± 1.25 (.90-.91) and low-quality (n = 27) 2.32 ± 2.14 (2.30-2.33) studies. Randomized control trials (RCTs) (n = 12) exhibited negative ROI, -0.22 ± 2.41(-.27 to -.16). Financial returns become
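The abstract's ROI formula, ROI = (benefits - costs of program)/costs of program, and its size-weighted combination can be sketched directly; the figures below are hypothetical:

```python
# Sketch of the abstract's ROI formula and a study-size-weighted mean ROI.
# All program figures are made up for illustration.
def roi(benefits, costs):
    # ROI = (benefits - costs of program) / costs of program
    return (benefits - costs) / costs

def weighted_mean(rois, weights):
    # Weight each study's ROI by its size (e.g., number of participants).
    total = sum(weights)
    return sum(r * w for r, w in zip(rois, weights)) / total

print(roi(238_000, 100_000))  # a program returning $2.38 per $1 spent -> ROI 1.38
```

Note that an ROI of 1.38 means a 138% return over and above recouping the program's costs, which is how the abstract's pooled figure should be read.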

  6. Quality Improvement Methodologies Increase Autologous Blood Product Administration

    Science.gov (United States)

    Hodge, Ashley B.; Preston, Thomas J.; Fitch, Jill A.; Harrison, Sheilah K.; Hersey, Diane K.; Nicol, Kathleen K.; Naguib, Aymen N.; McConnell, Patrick I.; Galantowicz, Mark

    2014-01-01

    Abstract: Whole blood from the heart–lung (bypass) machine may be processed through a cell salvaging device (i.e., cell saver [CS]) and subsequently administered to the patient during cardiac surgery. It was determined at our institution that CS volume was being discarded. A multidisciplinary team consisting of anesthesiologists, perfusionists, intensive care physicians, quality improvement (QI) professionals, and bedside nurses met to determine the challenges surrounding autologous blood delivery in its entirety. A review of cardiac surgery patients’ charts (n = 21) was conducted for analysis of CS waste. After identification of practices that were leading to CS waste, interventions were designed and implemented. Fishbone diagram, key driver diagram, Plan–Do–Study–Act (PDSA) cycles, and data collection forms were used throughout this QI process to track and guide progress regarding CS waste. Of patients under 6 kg (n = 5), 80% had wasted CS blood before interventions, whereas those patients larger than 36 kg (n = 8) had 25% wasted CS before interventions. Seventy-five percent of patients under 6 kg who had wasted CS blood received packed red blood cell transfusions in the cardiothoracic intensive care unit within 24 hours of their operation. After data collection and didactic education sessions (PDSA Cycle I), CS blood volume waste was reduced to 5% in all patients. Identification and analysis of the root cause followed by implementation of education, training, and management of change (PDSA Cycle II) resulted in successful use of 100% of all CS blood volume. PMID:24783313

  7. "Assessing the methodological quality of systematic reviews in radiation oncology: A systematic review".

    Science.gov (United States)

    Hasan, Haroon; Muhammed, Taaha; Yu, Jennifer; Taguchi, Kelsi; Samargandi, Osama A; Howard, A Fuchsia; Lo, Andrea C; Olson, Robert; Goddard, Karen

    2017-10-01

    The objective of our study was to evaluate the methodological quality of systematic reviews and meta-analyses in Radiation Oncology. A systematic literature search was conducted for all eligible systematic reviews and meta-analyses in Radiation Oncology from 1966 to 2015. Methodological characteristics were abstracted from all works that satisfied the inclusion criteria, and quality was assessed using the critical appraisal tool AMSTAR. Regression analyses were performed to determine factors associated with a higher quality score. Following exclusion based on a priori criteria, 410 studies (157 systematic reviews and 253 meta-analyses) satisfied the inclusion criteria. Meta-analyses were found to be of fair to good quality, while systematic reviews were found to be of less than fair quality. Factors associated with higher quality scores in the multivariable analysis were including primary studies consisting of randomized controlled trials, performing a meta-analysis, and applying a recommended guideline related to establishing a systematic review protocol and/or reporting. Based on AMSTAR, systematic reviews and meta-analyses may introduce a high risk of bias if applied to inform decision-making. We recommend that decision-makers in Radiation Oncology scrutinize the methodological quality of systematic reviews and meta-analyses prior to assessing their utility to inform evidence-based medicine, and that researchers adhere to methodological standards outlined in validated guidelines when embarking on a systematic review. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Does Deregulation of Quality Standards in Telecomunications Improve Social Welfare? A Methodological Note

    OpenAIRE

    Felipe Morandé

    1990-01-01

    One of the main reasons behind the big difference observed in the per capita number of telephones between developed and developing countries is the high capital cost -a scarce resource in LDC's- of expanding telecommunications infrastructure. A reasonable question to raise in this context is the extent to which that high capital cost of investment could be diminished if international quali...

  9. Survey of the prevalence and methodology of quality assurance for B-mode ultrasound image quality among veterinary sonographers.

    Science.gov (United States)

    Hoscheit, Larry P; Heng, Hock Gan; Lim, Chee Kin; Weng, Hsin-Yi

    2018-05-01

    Image quality in B-mode ultrasound is important as it reflects the diagnostic accuracy and diagnostic information provided during clinical scanning. Quality assurance programs for B-mode ultrasound systems/components are comprised of initial quality acceptance testing and subsequent regularly scheduled quality control testing. The importance of quality assurance programs for B-mode ultrasound image quality using ultrasound phantoms is well documented in the human medical and medical physics literature. The purpose of this prospective, cross-sectional, survey study was to determine the prevalence and methodology of quality acceptance testing and quality control testing of image quality for ultrasound systems/components among veterinary sonographers. An online electronic survey was sent to 1497 members of veterinary imaging organizations: the American College of Veterinary Radiology, the Veterinary Ultrasound Society, and the European Association of Veterinary Diagnostic Imaging, and a total of 167 responses were received. The results showed that the percentages of veterinary sonographers performing quality acceptance testing and quality control testing are 42% (64/151; 95% confidence interval 34-52%) and 26% (40/156; 95% confidence interval 19-33%), respectively. Of the respondents who claimed to have quality acceptance testing or quality control testing of image quality in place for their ultrasound systems/components, 0% had performed quality acceptance testing or quality control testing correctly (quality acceptance testing 95% confidence interval: 0-6%, quality control testing 95% confidence interval: 0-11%). Further education and guidelines are recommended for veterinary sonographers in the area of quality acceptance testing and quality control testing for B-mode ultrasound equipment/components. © 2018 American College of Veterinary Radiology.
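The survey percentages above are reported with 95% confidence intervals. The abstract does not state which interval method its authors used; as a hedged illustration only, a Wilson score interval for a proportion such as 64/151 can be computed like this (the function name is my own):

```python
# Illustrative Wilson score 95% confidence interval for a survey proportion
# such as "42% (64/151)". The abstract does not specify its interval method,
# so this sketch is not a reproduction of the published figures.

import math

def wilson_interval(successes, n, z=1.96):
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return center - half, center + half

low, high = wilson_interval(64, 151)
print(f"{low:.1%} to {high:.1%}")
```

Exact (Clopper-Pearson) intervals, which some journals prefer, are slightly wider than the Wilson interval for the same data.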

  10. Methodological quality and descriptive characteristics of prosthodontic-related systematic reviews.

    Science.gov (United States)

    Aziz, T; Compton, S; Nassar, U; Matthews, D; Ansari, K; Flores-Mir, C

    2013-04-01

    Ideally, healthcare systematic reviews (SRs) should be beneficial to practicing professionals in making evidence-based clinical decisions. However, the conclusions drawn from SRs are directly related to the quality of the SR and of the included studies. The aim was to investigate the methodological quality and key descriptive characteristics of SRs published in prosthodontics. Methodological quality was analysed using the Assessment of Multiple Reviews (AMSTAR) tool. Several electronic resources (MEDLINE, EMBASE, Web of Science and American Dental Association's Evidence-based Dentistry website) were searched. In total 106 SRs were located. Key descriptive characteristics and methodological quality features were gathered and assessed, and descriptive and inferential statistical testing performed. Most SRs in this sample originated from the European continent followed by North America. Two to five authors conducted most SRs; the majority were affiliated with academic institutions and had prior experience publishing SRs. The majority of SRs were published in specialty dentistry journals, with implant or implant-related topics the primary topics of interest for most. According to AMSTAR, most quality aspects were adequately fulfilled by less than half of the reviews. Publication bias and grey literature searches were the most poorly adhered-to components. Overall, the methodological quality of the prosthodontic-related systematic reviews was deemed limited. Future recommendations include that authors have prior training in conducting SRs and that journals adopt a universal checklist addressing all key characteristics of an unbiased SR process. © 2013 Blackwell Publishing Ltd.

  11. The methodological quality of systematic reviews of animal studies in dentistry.

    Science.gov (United States)

    Faggion, C M; Listl, S; Giannakopoulos, N N

    2012-05-01

    Systematic reviews and meta-analyses of animal studies are important for improving estimates of the effects of treatment and for guiding future clinical studies on humans. The purpose of this systematic review was to assess the methodological quality of systematic reviews and meta-analyses of animal studies in dentistry through using a validated checklist. A literature search was conducted independently and in duplicate in the PubMed and LILACS databases. References in selected systematic reviews were assessed to identify other studies not captured by the electronic searches. The methodological quality of studies was assessed independently and in duplicate by using the AMSTAR checklist; the quality was scored as low, moderate, or high. The reviewers were calibrated before the assessment and agreement between them was assessed using Cohen's Kappa statistic. Of 444 studies retrieved, 54 systematic reviews were selected after full-text assessment. Agreement between the reviewers was regarded as excellent. Only two studies were scored as high quality; 17 and 35 studies were scored as medium and low quality, respectively. There is room for improvement of the methodological quality of systematic reviews of animal studies in dentistry. Checklists, such as AMSTAR, can guide researchers in planning and executing systematic reviews and meta-analyses. For determining the need for additional investigations in animals and in order to provide good data for potential application in humans, such reviews should be based on animal experiments performed according to sound methodological principles. Copyright © 2011 Elsevier Ltd. All rights reserved.
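Several of the records above report inter-reviewer agreement with Cohen's kappa. As a minimal sketch of that statistic applied to low/medium/high quality ratings (the ratings below are hypothetical, not the study's data):

```python
# Sketch of Cohen's kappa for two reviewers rating methodological quality.
# kappa = (observed agreement - chance agreement) / (1 - chance agreement).
# The rating lists are hypothetical illustrations.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    # Chance agreement: product of each rater's marginal label frequencies.
    expected = sum((freq_a[label] / n) * (freq_b[label] / n) for label in labels)
    return (observed - expected) / (1 - expected)

a = ["low", "low", "medium", "high", "low", "medium"]
b = ["low", "low", "medium", "medium", "low", "medium"]
print(round(cohens_kappa(a, b), 2))  # 0.71
```

A kappa above roughly 0.8 is conventionally described as "excellent" agreement, which is the threshold language the abstract uses.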

  12. Identifying approaches for assessing methodological and reporting quality of systematic reviews

    DEFF Research Database (Denmark)

    Pussegoda, Kusala; Turner, Lucy; Garritty, Chantelle

    2017-01-01

    BACKGROUND: The methodological quality and completeness of reporting of systematic reviews (SRs) is fundamental to optimal implementation of evidence-based health care and the reduction of research waste. Methods exist to appraise SRs, yet little is known about how they are used in SRs or where … there are potential gaps in research best-practice guidance materials. The aims of this study are to identify reports assessing the methodological quality (MQ) and/or reporting quality (RQ) of a cohort of SRs and to assess their number, general characteristics, and approaches to 'quality' assessment over time. … or reporting guidelines used as proxy to assess RQ were used in 80% (61/76) of identified reports. These included two reporting guidelines (PRISMA and QUOROM) and five quality assessment tools (AMSTAR, R-AMSTAR, OQAQ, Mulrow, Sacks) and GRADE criteria. The remaining 24% (18/76) of reports developed their own …

  13. A quality evaluation methodology of health web-pages for non-professionals.

    Science.gov (United States)

    Currò, Vincenzo; Buonuomo, Paola Sabrina; Onesimo, Roberta; de Rose, Paola; Vituzzi, Andrea; di Tanna, Gian Luca; D'Atri, Alessandro

    2004-06-01

    The proposal of an evaluation methodology for determining the quality of healthcare web sites for the dissemination of medical information to non-professionals. Three (macro) factors are considered for the quality evaluation: medical contents, accountability of the authors, and usability of the web site. Starting from two results in the literature the problem of whether or not to introduce a weighting function has been investigated. This methodology has been validated on a specialized information content, i.e., sore throats, due to the large interest such a topic enjoys with target users. The World Wide Web was accessed using a meta-search system merging several search engines. A statistical analysis was made to compare the proposed methodology with the obtained ranks of the sample web pages. The statistical analysis confirms that the variables examined (per item and sub factor) show substantially similar ranks and are capable of contributing to the evaluation of the main quality macro factors. A comparison between the aggregation functions in the proposed methodology (non-weighted averages) and the weighting functions, derived from the literature, allowed us to verify the suitability of the method. The proposed methodology suggests a simple approach which can quickly award an overall quality score for medical web sites oriented to non-professionals.
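The abstract's central methodological question is whether to aggregate the three macro factors with a plain average or a weighting function. A minimal sketch of that comparison, using the abstract's factor names but hypothetical scores and weights:

```python
# Sketch of the aggregation choice discussed in the abstract: non-weighted
# average versus weighted average over the three quality macro factors.
# Scores and weights are hypothetical illustrations.

def unweighted(scores):
    return sum(scores.values()) / len(scores)

def weighted(scores, weights):
    return sum(scores[k] * weights[k] for k in scores) / sum(weights.values())

page = {"medical_contents": 8.0, "accountability": 6.0, "usability": 7.0}
weights = {"medical_contents": 0.5, "accountability": 0.3, "usability": 0.2}

print(unweighted(page))                    # 7.0
print(round(weighted(page, weights), 2))   # 7.2
```

When the two aggregations produce substantially similar rankings across a sample of pages, as the abstract reports, the simpler non-weighted average is the defensible choice.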

  14. A case of instantaneous rigor?

    Science.gov (United States)

    Pirch, J; Schulz, Y; Klintschar, M

    2013-09-01

    The question of whether instantaneous rigor mortis (IR), the hypothetical sudden stiffening of the muscles upon death, actually exists has been controversially debated over the last 150 years. While modern German forensic literature rejects this concept, the contemporary British literature is more willing to embrace it. We present the case of a young woman who suffered from diabetes and who was found dead in an upright standing position with back and shoulders leaned against a punchbag and a cupboard. Rigor mortis was fully established; livor mortis was pronounced and consistent with the position in which the body was found. After autopsy and toxicological analysis, it was concluded that death most probably occurred due to a ketoacidotic coma, with markedly increased values of glucose and lactate in the cerebrospinal fluid as well as acetone in blood and urine. Whereas the position of the body is most unusual, a detailed analysis revealed that it is a stable position even without rigor mortis. Therefore, this case does not further support the controversial concept of IR.

  15. #eVALUate: Monetizing Service Acquisition Trade-offs Using the QUALITY-INFUSED Price Methodology

    Science.gov (United States)

    2016-04-01

    they determine their source selection methodology along the best-value spectrum, ranging from lowest price technically acceptable (LPTA) to full...sidering factors determined to be germane to service value to the agency (Finkenstadt, 2015). Once an offeror’s prices are determined to be fair and... determine whether the quality rating system would affect the quality trade-off. In this case, the highest priced yet highest rated offeror was selected

  16. Guidance on assessing the methodological and reporting quality of toxicologically relevant studies: A scoping review.

    Science.gov (United States)

    Samuel, Gbeminiyi O; Hoffmann, Sebastian; Wright, Robert A; Lalu, Manoj Mathew; Patlewicz, Grace; Becker, Richard A; DeGeorge, George L; Fergusson, Dean; Hartung, Thomas; Lewis, R Jeffrey; Stephens, Martin L

    2016-01-01

    Assessments of methodological and reporting quality are critical to adequately judging the credibility of a study's conclusions and to gauging its potential reproducibility. To aid those seeking to assess the methodological or reporting quality of studies relevant to toxicology, we conducted a scoping review of the available guidance with respect to four types of studies: in vivo and in vitro, (quantitative) structure-activity relationships ([Q]SARs), physico-chemical, and human observational studies. Our aims were to identify the available guidance in this diverse literature, briefly summarize each document, and distill the common elements of these documents for each study type. In general, we found considerable guidance for in vivo and human studies, but only one paper addressed in vitro studies exclusively. The guidance for (Q)SAR studies and physico-chemical studies was scant but authoritative. There was substantial overlap across guidance documents in the proposed criteria for both methodological and reporting quality. Some guidance documents address toxicology research directly, whereas others address preclinical research generally or clinical research and therefore may not be fully applicable to the toxicology context without some translation. Another challenge is the degree to which assessments of methodological quality in toxicology should focus on risk of bias - as in clinical medicine and healthcare - or be broadened to include other quality measures, such as confirming the identity of test substances prior to exposure. Our review is intended primarily for those in toxicology and risk assessment seeking an entry point into the extensive and diverse literature on methodological and reporting quality applicable to their work. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  17. Monitoring muscle optical scattering properties during rigor mortis

    Science.gov (United States)

    Xia, J.; Ranasinghesagara, J.; Ku, C. W.; Yao, G.

    2007-09-01

    Sarcomere is the fundamental functional unit in skeletal muscle for force generation. In addition, sarcomere structure is also an important factor that affects the eating quality of muscle food, the meat. The sarcomere structure is altered significantly during rigor mortis, which is the critical stage involved in transforming muscle to meat. In this paper, we investigated optical scattering changes during the rigor process in Sternomandibularis muscles. The measured optical scattering parameters were analyzed along with the simultaneously measured passive tension, pH value, and histology analysis. We found that the temporal changes of optical scattering, passive tension, pH value and fiber microstructures were closely correlated during the rigor process. These results suggested that sarcomere structure changes during rigor mortis can be monitored and characterized by optical scattering, which may find practical applications in predicting meat quality.

  18. The methodological quality of economic evaluation studies in obstetrics and gynecology: a systematic review

    NARCIS (Netherlands)

    Vijgen, Sylvia M. C.; Opmeer, Brent C.; Mol, Ben Willem J.

    2013-01-01

    We evaluated the methodological quality of economic evaluation studies in the field of obstetrics and gynecology published in the last decade. A MEDLINE search was performed to find economic evaluation studies in obstetrics and gynecology from the years 1997 through 2009. We included full economic

  19. Development of Management Quality Assessment Methodology in the Public Sector: Problems and Contradictions

    Directory of Open Access Journals (Sweden)

    Olga Vladimirovna Kozhevina

    2015-09-01

    Full Text Available The development of a management quality assessment methodology for the public sector is a relevant scientific and practical problem in economic research. Utilizing the results of an assessment based on the authors' methodology allows us to rate public sector organizations, to justify decisions on reorganization and privatization, and to monitor changes in the level of management quality in public sector organizations. The study determined the place of the quality of the control processes of a public sector organization in the system of “Quality of public administration — the effective operation of the public sector organization,” revealed the contradictions associated with the assessment of management quality, established the conditions for effective functioning of public sector organizations, and developed a mechanism of comprehensive assessment and an algorithm for constructing and evaluating models of management quality; the criteria for assessing management quality in public sector organizations, including economic, budgetary, social and public, informational, innovation, and institutional criteria, are empirically grounded. By utilizing the proposed algorithm, an assessment model of quality management in public sector organizations, including financial, economic, social, innovation, informational, and institutional indicators, is developed. For each indicator of quality management, the coefficients of importance in the management quality assessment model, as well as comprehensive and partial evaluation indicators, are determined on the basis of expert evaluations. The main conclusion of the article is that management quality assessment for public sector organizations should be based not only on the indicators achieved in the dynamics and utilized for analyzing the effectiveness of management, but should also take into account the reference levels for the values of these

  20. Reporting and methodologic quality of Cochrane Neonatal review group systematic reviews

    Directory of Open Access Journals (Sweden)

    Al Faleh Khalid

    2009-06-01

    Full Text Available Abstract Background The Cochrane Neonatal Review Group (CNRG) has achieved a lot with limited resources in producing high-quality systematic reviews to assist clinicians in evidence-based decision-making. A formal assessment of published CNRG systematic reviews has not been undertaken; we sought to provide a comprehensive assessment of the quality of systematic reviews (both methodologic and reporting quality) published in the CNRG. Methods We selected a random sample of published CNRG systematic reviews. Items of the QUOROM statement were utilized to assess quality of reporting, while items and total scores of the Oxman-Guyatt Overview Quality Assessment Questionnaire (OQAQ) were used to assess methodologic quality. Two reviewers independently extracted data and assessed quality. A Student t-test was used to compare quality scores pre- and post-publication of the QUOROM statement. Results Sixty-one systematic reviews were assessed. Overall, the included reviews had good quality with minor flaws based on OQAQ total scores (mean, 4.5 [0.9]; 95% CI, 4.27–4.77). However, room for improvement was noted in some areas, such as the title, abstract reporting, the a priori plan for heterogeneity assessment and how to handle heterogeneity in case it exists, and assessment of publication bias. In addition, reporting of agreement among reviewers, documentation of trial flow, and discussion of possible biases were addressed in the review process. Reviews published after the QUOROM statement had significantly higher quality scores. Conclusion The systematic reviews published in the CNRG are generally of good quality with minor flaws. However, efforts should be made to improve the quality of reports. Readers must continue to assess the quality of published reports on an individual basis prior to implementing the recommendations.

  1. Non-pharmacological sleep interventions for youth with chronic health conditions: a critical review of the methodological quality of the evidence.

    Science.gov (United States)

    Brown, Cary A; Kuo, Melissa; Phillips, Leah; Berry, Robyn; Tan, Maria

    2013-07-01

    Restorative sleep is clearly linked with well-being in youth with chronic health conditions. This review addresses the methodological quality of non-pharmacological sleep intervention (NPSI) research for youth with chronic health conditions. The Guidelines for Critical Review (GCR) and the Effective Public Health Practice Project Quality Assessment Tool (EPHPP) were used in the review. The search yielded 31 behavioural and 10 non-behavioural NPSIs for review. Most studies had fewer than 10 participants. Autism spectrum disorders, attention deficit/hyperactivity disorders, Down syndrome, intellectual disabilities, and visual impairments were the conditions that most studies focused upon. The global EPHPP scores indicated most reviewed studies were of weak quality. Only 7 studies were rated as moderate; none were strong. Studies rated as weak quality frequently had recruitment issues; non-blinded participants/parents and/or researchers; and used outcome measures without sound psychometric properties. Little conclusive evidence exists for NPSIs in this population. However, NPSIs are widely used and these preliminary studies demonstrate promising outcomes. There have not been any published reports of negative outcomes that would preclude application of the different NPSIs on a case-by-case basis guided by clinical judgement. These findings support the need for more rigorous, applied research. • Methodological Quality of Sleep Research • Disordered sleep (DS) in youth with chronic health conditions is pervasive and is important to rehabilitation therapists because DS contributes to significant functional problems across psychological, physical and emotional domains. • Rehabilitation therapists and other healthcare providers receive little education about disordered sleep and are largely unaware of the range of assessment and non-pharmacological intervention strategies that exist. An evidence-based website of pediatric sleep resources can be found at http

  2. Reconciling the Rigor-Relevance Dilemma in Intellectual Capital Research

    Science.gov (United States)

    Andriessen, Daniel

    2004-01-01

    This paper raises the issue of research methodology for intellectual capital and other types of management research by focusing on the dilemma of rigour versus relevance. The more traditional explanatory approach to research often leads to rigorous results that are not of much help to solve practical problems. This paper describes an alternative…

  3. Rigor in Qualitative Supply Chain Management Research

    DEFF Research Database (Denmark)

    Goffin, Keith; Raja, Jawwad; Claes, Björn

    2012-01-01

    Purpose – The purpose of this paper is to share the authors' experiences of using the repertory grid technique in two supply chain management studies. The paper aims to demonstrate how the two studies provided insights into how qualitative techniques such as the repertory grid can be made more rigorous than in the past, and how results can be generated that are inaccessible using quantitative methods. Design/methodology/approach – This paper presents two studies undertaken using the repertory grid technique to illustrate its application in supply chain management research. Findings – The paper … reliability, and theoretical saturation. Originality/value – It is the authors' contention that the addition of the repertory grid technique to the toolset of methods used by logistics and supply chain management researchers can only enhance insights and the building of robust theories. Qualitative studies …

  4. Application of Taguchi methodology to improve the functional quality of a mechanical device

    International Nuclear Information System (INIS)

    Regeai, Awatef Omar

    2005-01-01

    Manufacturing and quality control are recognized branches of engineering management. Special attention has been paid to improving the tools and methods for the purpose of improving product quality and finding solutions for any obstacles and/or problems during the production process. Taguchi methodology is one of the most powerful techniques for improving product and manufacturing process quality at low cost. It is a strategic and practical method that aims to assist managers and industrial engineers to tackle manufacturing quality problems in a systematic and structured manner. The potential benefit of Taguchi methodology lies in its ease of use, its emphasis on reducing variability to give more economical products, and hence its accessibility to the engineering fraternity for solving real-life quality problems. This study applies Taguchi methodology to improve the functional quality of a locally made chain gear through a proposed heat treatment process. The hardness of steel is generally a function not of its composition only, but rather of its heat treatment. The study investigates the effects of various heat treatment parameters, including ramp rate of heating, normalizing holding time, normalizing temperature, annealing holding time, annealing temperature, hardening holding time, hardening temperature, quenching media, tempering temperature, and tempering holding time, upon the hardness, which is a measure of resistance to plastic deformation. Both the analysis of means (ANOM) and the signal-to-noise ratio (S/N) have been used to determine the optimal condition of the process. A significant improvement of the functional quality characteristic (hardness) of more than 32% was obtained. The scanning electron microscopy technique was used in this study to obtain visual evidence of the quality and continuous improvement of the heat-treated samples. (author)
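Since hardness is a larger-the-better response, the standard Taguchi signal-to-noise ratio for it is S/N = -10·log10((1/n)·Σ 1/yᵢ²). A hedged sketch of that statistic applied to hypothetical hardness readings (the formula is the standard Taguchi larger-the-better statistic; the data and setting labels are illustrative, not the study's measurements):

```python
# Taguchi larger-the-better signal-to-noise ratio:
#   S/N = -10 * log10((1/n) * sum(1 / y_i^2))
# Applied to hypothetical hardness readings from two heat-treatment settings.

import math

def sn_larger_is_better(values):
    n = len(values)
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in values) / n)

setting_a = [52.0, 55.0, 53.0]  # hypothetical hardness readings (HRC)
setting_b = [60.0, 62.0, 61.0]

print(round(sn_larger_is_better(setting_a), 2))
print(round(sn_larger_is_better(setting_b), 2))
```

The setting with the higher S/N is preferred: it is both harder on average and less variable, which is exactly the robustness criterion ANOM-style Taguchi analysis selects on.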

  5. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    Science.gov (United States)

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
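The control-chart analysis the abstract describes preserves the time order of observations instead of pooling them. As a minimal sketch of one common QI tool, an individuals (XmR) chart, where limits are the mean ± 2.66 times the average moving range (the 2.66 constant is the standard XmR factor; the monthly length-of-stay values are hypothetical, not data from the study):

```python
# Sketch of individuals (XmR) control-chart limits, a common QI analytic tool.
# Limits: mean ± 2.66 * average moving range. Data are hypothetical monthly
# mean ED lengths of stay in minutes, not values from the cited trial.

def xmr_limits(data):
    mean = sum(data) / len(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

los = [120, 118, 125, 122, 119, 121, 117, 123]  # hypothetical monthly means
lcl, center, ucl = xmr_limits(los)
print(round(lcl, 1), round(center, 1), round(ucl, 1))

# A point outside (lcl, ucl), or a sustained run on one side of the center
# line, signals a special cause such as a change in triage practice.
out_of_control = [x for x in los if not lcl <= x <= ucl]
print(out_of_control)  # []
```

Because each new point is judged as it arrives, a genuine shift can be detected mid-study, which is the early-signal advantage the comparative analysis above reports.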

  6. Methodological quality of systematic reviews analyzing the use of laser therapy in restorative dentistry.

    Science.gov (United States)

    Salmos, Janaina; Gerbi, Marleny E M M; Braz, Rodivan; Andrade, Emanuel S S; Vasconcelos, Belmiro C E; Bessa-Nogueira, Ricardo V

    2010-01-01

    The purpose of this study was to identify systematic reviews (SRs) that compared laser with other dental restorative procedures and to evaluate their methodological quality. A search strategy was developed and implemented for MEDLINE, the Cochrane Library, LILACS, and the Brazilian Dentistry Bibliography (1966-2007). Inclusion criteria were: the article had to be an SR (+/- meta-analysis); the primary focus was the use of laser in restorative dentistry; published in English, Spanish, Portuguese, Italian, or German. Two investigators independently selected and evaluated the SRs. The overview quality assessment questionnaire (OQAQ) was used to evaluate methodological quality, and the results were averaged. There were 145 references identified, of which seven were SRs that met the inclusion criteria (kappa=0.81). Of the SRs, 71.4% appraised lasers in dental caries diagnosis. The mean overall OQAQ score was 4.4 [95% confidence interval (CI) 2.4-6.5]. Of the SRs, 57.1% had major flaws. The methodological quality of these SRs is therefore low, and clinicians should critically appraise them prior to considering their recommendations to guide patient care.

  7. Systematic Review of the Application of Lean and Six Sigma Quality Improvement Methodologies in Radiology.

    Science.gov (United States)

    Amaratunga, Thelina; Dobranowski, Julian

    2016-09-01

    Preventable yet clinically significant rates of medical error remain systemic, while health care spending is at a historic high. Industry-based quality improvement (QI) methodologies show potential for utility in health care and radiology because they use an empirical approach to reduce variability and improve workflow. The aim of this review was to systematically assess the literature with regard to the use and efficacy of Lean and Six Sigma (the most popular of the industrial QI methodologies) within radiology. MEDLINE, the Allied & Complementary Medicine Database, Embase Classic + Embase, Health and Psychosocial Instruments, and the Ovid HealthStar database, alongside the Cochrane Library databases, were searched in June 2015. Empirical studies in peer-reviewed journals were included if they assessed the use of Lean, Six Sigma, or Lean Six Sigma with regard to their ability to improve a variety of quality metrics in a radiology-centered clinical setting. Of the 278 articles returned, 23 studies were suitable for inclusion. Of these, 10 assessed Six Sigma, 7 assessed Lean, and 6 assessed Lean Six Sigma. The diverse range of measured outcomes can be organized into 7 common aims: cost savings, reducing appointment wait time, reducing in-department wait time, increasing patient volume, reducing cycle time, reducing defects, and increasing staff and patient safety and satisfaction. All of the included studies demonstrated improvements across a variety of outcomes. However, there were high rates of systematic bias and imprecision as per the Grading of Recommendations Assessment, Development and Evaluation guidelines. Lean and Six Sigma QI methodologies have the potential to reduce error and costs and improve quality within radiology. However, there is a pressing need to conduct high-quality studies in order to realize the true potential of these QI methodologies in health care and radiology. Recommendations on how to improve the quality of the literature are proposed.

  8. [Types of medical registries - definitions, methodological aspects and quality of the scientific work with registries].

    Science.gov (United States)

    Mathis-Edenhofer, Stefan; Piso, Brigitte

    2011-12-01

    This work presents a comprehensive list of registry definitions, including broader and narrower ones. When the definitions are compared, different methodological issues can be identified. Some of these issues are common to all registry types; some can be assigned more easily to a specific registry type. Instruments for evaluating the quality of registries reflect many of the mentioned aspects. Generally, and especially for registries with a descriptive or exploratory research dimension, it is important to consider their intended purpose and the extent to which it was achieved. This includes, for instance, whether the purpose and the methodology are coordinated. From the start of registration, an initiator should be aware - based on the purpose - of the methodological dimension of the registry. This helps in selecting the correct type of registry, applying the appropriate guidance and, ultimately, justifying the effort (cost-benefit ratio).

  9. The implementation methodology of Total Quality Management in health services, as a best practice operation.

    Directory of Open Access Journals (Sweden)

    Theodora Malamou

    2016-09-01

    Total Quality Management (TQM) in health services is a modern management philosophy for improving the quality and efficiency of the organization as a whole, with the involvement of all employees at all levels. According to the research data, the concept of quality is divided into technical quality, interpersonal quality, and hotel infrastructure, and focuses on patient satisfaction. The critical success factors of TQM for organizational business excellence in a continuously changing competitive environment are management commitment, customer focus, constant communication with employees, encouragement and reward, education and scientific training, continuous improvement of service quality, interdependent relationships with suppliers, active employee participation, creation of representative indicators, targets and benchmarking, continuous outcome assessment, and continuous review of program procedures. The purpose of this article is, through a review of the Greek and international literature, to introduce the methodology of a TQM project in health services as everyday best practice, with emphasis on quality of service. According to the literature review, TQM contributes to improving the quality of health services and to cultivating team spirit and cooperation between health professionals and leadership, with a view to satisfying all stakeholders. TQM is a purely anthropocentric theory of organization and administration. A comprehensive effort is needed to improve the quality of leadership and to introduce a quality culture among workers.

  10. Methodological quality of quantitative lesbian, gay, bisexual, and transgender nursing research from 2000 to 2010.

    Science.gov (United States)

    Johnson, Michael; Smyer, Tish; Yucha, Carolyn

    2012-01-01

    The purpose of this study was to evaluate the methodological quality of quantitative lesbian, gay, bisexual, and transgender nursing research from 2000 to 2010. Using a key word search in Cumulative Index to Nursing and Allied Health Literature, 188 studies were identified and 40 met the criteria, which included descriptive, experimental, quasi-experimental, or observational (case control, cohort, and cross-sectional) design. The methodological quality of these studies was similar to that reported for medical and nursing educational research. The foci of these lesbian, gay, bisexual, and transgender studies were biased toward human immunodeficiency virus, acquired immunodeficiency syndrome, and sexually transmitted diseases, and 58.5% of the funded research was related to human immunodeficiency virus or acquired immunodeficiency syndrome. To provide evidence-based health care to these populations, an understanding of the current state of research is crucial.

  11. Proposal of a methodology for quality control in thermoluminescent dosimetry laboratory

    International Nuclear Information System (INIS)

    Feital, Joao Carlos da S.; Almeida, Claudio Domingues de; Bezerra, Marcos A.

    2005-01-01

    Taking into account that thermoluminescence dosimetry requires adequate selection procedures as well as accurate TLD readings, this paper presents results of a methodology that can be applied as part of quality control programs in thermoluminescence dosimetry laboratories. For the experiment, a set of 200 TLDs (LiF-100) was used, from which 9 were selected, together with a standard Cs-137 source, a PTW oven, a Harshaw TL reader (model 5500) operating under the 'WinREMS' software, and a Sr-90/Y-90 Bicron irradiator. In the procedure, the selected dosimeters were irradiated and read 28 times over 18 months; then, using one of the properties of the standard deviation, values up to 14% were found, for a confidence level of 95%. The results found, together with the bibliographic data on the responses (arbitrary readings) of the crystals used in TLDs, show that this methodology can be applied in quality control programs. (author)
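    The acceptance criterion described above (repeated readings within 14% at a 95% confidence level) can be sketched as a simple dispersion check. This is an illustrative reconstruction, not the laboratory's actual software; the function name, the example readings, and the 2-sigma approximation of the 95% level are assumptions.

```python
import statistics

def tld_acceptable(readings, tolerance=0.14):
    """Dispersion check for repeated readings of one dosimeter.

    readings: repeated TL readings (arbitrary units) of the same TLD
    tolerance: maximum allowed 2*sigma/mean ratio (0.14 = 14%), with
    2 sigma approximating the 95% confidence level for normal data.
    Returns (passes, observed ratio).
    """
    mean = statistics.mean(readings)
    sd = statistics.stdev(readings)   # sample standard deviation
    ratio = 2.0 * sd / mean
    return ratio <= tolerance, ratio

# Invented example: a stable dosimeter read eight times
ok, ratio = tld_acceptable([100, 102, 98, 101, 99, 103, 97, 100])
```

A dosimeter whose 2-sigma dispersion exceeds the tolerance would be excluded from the working set, which is the point of the selection procedure the abstract describes.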

  12. Author-paper affiliation network architecture influences the methodological quality of systematic reviews and meta-analyses of psoriasis.

    Directory of Open Access Journals (Sweden)

    Juan Luis Sanz-Cabanillas

    Moderate-to-severe psoriasis is associated with significant comorbidity, an impaired quality of life, and increased medical costs, including those associated with treatments. Systematic reviews (SRs) and meta-analyses (MAs) of randomized clinical trials are considered two of the best approaches to the summarization of high-quality evidence. However, methodological bias can reduce the validity of conclusions from these types of studies and subsequently impair the quality of decision making. As co-authorship is among the most well-documented forms of research collaboration, the present study aimed to explore whether authors' collaboration methods might influence the methodological quality of SRs and MAs of psoriasis. Methodological quality was assessed by two raters who extracted information from full articles. After calculating total and per-item Assessment of Multiple Systematic Reviews (AMSTAR) scores, reviews were classified as low (0-4), medium (5-8), or high (9-11) quality. Article metadata and journal-related bibliometric indices were also obtained. A total of 741 authors from 520 different institutions and 32 countries published 220 reviews that were classified as high (17.2%), moderate (55%), or low (27.7%) methodological quality. The high methodological quality subnetwork was larger but had a lower connection density than the low and moderate methodological quality subnetworks; specifically, the former contained relatively fewer nodes (authors and reviews), reviews by authors, and collaborators per author. Furthermore, the high methodological quality subnetwork was highly compartmentalized, with several modules representing few poorly interconnected communities. In conclusion, structural differences in the author-paper affiliation network may influence the methodological quality of SRs and MAs on psoriasis. As the author-paper affiliation network structure affects study quality in this research field, authors who maintain an appropriate balance

  13. Report on Use of a Methodology for Commissioning and Quality Assurance of a VMAT System

    OpenAIRE

    Mayo, Charles; Fong de los Santos, Luis; Kruse, Jon; Blackwell, Charles R.; McLemore, Luke B.; Pafundi, Deanna; Stoker, Joshua; Herman, Michael

    2013-01-01

    INTRODUCTION: Results of the use of a methodology for VMAT commissioning and quality assurance, utilizing both control point tests and dosimetric measurements, are presented. METHODS AND MATERIALS: A generalizable, phantom-based measurement approach is used to characterize the accuracy of the measurement system. Correction for the angular response of the measurement system and inclusion of couch structures are used to characterize the full range of gantry angles desirable for clinical plans. A dose-based daily Q...

  14. APPLICATION OF LOT QUALITY ASSURANCE SAMPLING FOR ASSESSING DISEASE CONTROL PROGRAMMES - EXAMINATION OF SOME METHODOLOGICAL ISSUES

    OpenAIRE

    T. R. RAMESH RAO

    2011-01-01

    Lot Quality Assurance Sampling (LQAS), a statistical tool from industrial settings, has been in use since 1980 for monitoring and evaluation of programs on disease control, immunization status among children, and health workers' performance in health systems. While conducting LQAS in the field, there are occasions when, even after due care in design, practical and methodological issues must be addressed before it is recommended for implementation and intervention. LQAS is applied under the assumpti...
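    LQAS rests on a simple binomial decision rule: sample n units from a "lot" (e.g. a supervision area) and accept it if a threshold number of successes is reached. A minimal sketch follows; the classic n = 19, d = 13 design for an 80% upper / 50% lower coverage standard is a widely cited textbook example, not a value taken from this paper.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_classify(successes, n, d):
    """Accept the lot (e.g. 'coverage adequate') if at least d of the
    n sampled units are successes; otherwise reject it."""
    return "accept" if successes >= d else "reject"

def misclassification_risks(n, d, p_upper, p_lower):
    """alpha: probability of rejecting a lot that truly meets the upper
    standard; beta: probability of accepting one at the lower standard."""
    alpha = binom_cdf(d - 1, n, p_upper)       # P(successes < d | p_upper)
    beta = 1.0 - binom_cdf(d - 1, n, p_lower)  # P(successes >= d | p_lower)
    return alpha, beta

# Classic field design: sample 19, accept on 13+, standards 80%/50%
alpha, beta = misclassification_risks(19, 13, 0.80, 0.50)
```

With this design both misclassification probabilities stay below 10%, which is why samples of 19 appear so often in LQAS field surveys.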

  15. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  16. A methodology to improve higher education quality using the quality function deployment and analytic hierarchy process

    NARCIS (Netherlands)

    Raharjo, H.; Xie, M.; Goh, T.N.; Brombacher, A.C.

    2007-01-01

    In order to formulate an effective strategic plan in a customer-driven education context, it is important to recognize who the customers are and what they want. Using Quality Function Deployment (QFD), this information can be translated into strategies to achieve customer satisfaction. Since the

  17. Systematic Review of the Methodology Quality in Lung Cancer Screening Guidelines

    Directory of Open Access Journals (Sweden)

    Jiang LI

    2016-10-01

    Background and objective: Lung cancer is the most common malignancy, and screening can decrease its mortality. A high-quality screening guideline is necessary and important for effective work. Our study reviews and evaluates the basic characteristics and methodological quality of the current global lung cancer screening guidelines so as to provide useful information for domestic studies in the future. Methods: Electronic searches were done in English and Chinese databases including PubMed, the Cochrane Library, Web of Science, Embase, CNKI, CBM, Wanfang, and some official cancer websites. Articles were screened according to predefined inclusion and exclusion criteria by two researchers. The quality of the guidelines was assessed with AGREE II. Results: In total, 11 guidelines with methodology were included. The guidelines were issued mainly by the USA (81%); Canada and China developed one each. As for quality, the average score across all guidelines was 80 for "scope and purpose", 52 for "stakeholder involvement", 50 for "rigour of development", 76 for "clarity of presentation", 43 for "applicability", and 59 for "editorial independence". The highest average scores were found in 2013 and 2015. The Canadian guideline had higher quality in all six domains. Seven guidelines were evaluated as A level. Conclusion: The number of clinical guidelines shows an increasing trend. Most guidelines were issued by developed countries with a heavy disease burden. Multi-country contribution to one guideline is another trend. Evidence-based methodology is globally accepted in guideline development.
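    The AGREE II domain scores cited above are standardized percentages. The published scaling formula is (obtained score − minimum possible) / (maximum possible − minimum possible) × 100, where each item is rated 1-7 by each appraiser. A minimal sketch (the example ratings are invented):

```python
def agree2_scaled_score(ratings):
    """AGREE II scaled domain score as a percentage.

    ratings: one list of 1-7 item ratings per appraiser for a domain,
    e.g. [[5, 6, 6], [5, 5, 6]] for two appraisers and three items.
    """
    n_appraisers = len(ratings)
    n_items = len(ratings[0])
    obtained = sum(sum(appraiser) for appraiser in ratings)
    max_possible = 7 * n_items * n_appraisers
    min_possible = 1 * n_items * n_appraisers
    return 100.0 * (obtained - min_possible) / (max_possible - min_possible)

# Invented example: two appraisers rating a three-item domain
score = agree2_scaled_score([[5, 6, 6], [5, 5, 6]])  # 75.0
```

All-minimum ratings scale to 0 and all-maximum ratings to 100, which makes domain scores comparable across domains with different item counts.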

  18. METHODOLOGY OF DETERMINATION OF QUALITY INDEX OF MAINTENANCE SERVICE SYSTEM OF POWER EQUIPMENT OF TRACTION SUBSTATIONS

    Directory of Open Access Journals (Sweden)

    O.O. Matusevych

    2016-03-01

    Purpose. The purpose of this paper is the development of a methodology for defining the quality of the maintenance and repair (M&R) system for the power equipment of traction substations (TS) of electrified railways operating under conditions of uncertainty, based on expert information. Methodology. The basic tenets of the theory of fuzzy sets, together with point, linguistic, and interval estimates of experts, were applied to solve this problem. Results. Analysis of the existing diversity of approaches to the development of modern methods of improving M&R allows us to conclude that improvement in the quality of the system is achieved by solving individual problems that increase the operational reliability of the power equipment of traction substations in three main interrelated areas: technical, economic, and organizational. The basis of the quality evaluation system is the initial data and a specially developed formalized document for expert quality evaluation of the electrical equipment of traction substations. The level of quality of the maintenance system is determined on the basis of point, linguistic, and interval estimates of experts, expressed in quantitative and/or qualitative form. Possible options for expert data presentation and the corresponding quantitative methods of calculating the integral index of quality improvement of the M&R system of traction substations are described. The methodology and the method of assessing the quality of the M&R system of TS allow a quick response to changing operating conditions of the power equipment of traction substations, and determination of the most effective M&R strategies for the electrical equipment of TS under conditions of uncertainty in the functioning of the power supply division. Originality.
The method of a systematic approach to improving the quality of the M&R system for the power equipment of traction substations under conditions of uncertainty based on expert

  19. Relationships between abstract features and methodological quality explained variations of social media activity derived from systematic reviews about psoriasis interventions.

    Science.gov (United States)

    Ruano, J; Aguilar-Luque, M; Isla-Tejera, B; Alcalde-Mellado, P; Gay-Mimbrera, J; Hernandez-Romero, José Luis; Sanz-Cabanillas, J L; Maestre-López, B; González-Padilla, M; Carmona-Fernández, P J; Gómez-García, F; García-Nieto, A Vélez

    2018-05-24

    The aim of this study was to describe the relationships among abstract structure, readability, and completeness, and how these features may influence social media activity and bibliometric results, considering systematic reviews (SRs) about interventions in psoriasis classified by methodological quality. Systematic literature searches about psoriasis interventions were undertaken on relevant databases. For each review, methodological quality was evaluated using the Assessing the Methodological Quality of Systematic Reviews (AMSTAR) tool. Abstract length, structure, readability, and quality and completeness of reporting were analyzed. Social media activity, which considers Twitter and Facebook mention counts, as well as Mendeley readers and Google Scholar citations, was obtained for each article. Analyses were conducted to describe any potential influence of abstract characteristics on a review's social media diffusion. We classified 139 intervention SRs as displaying high/moderate/low methodological quality. We observed that the abstract readability of SRs has remained high for the last 20 years, although there are some differences based on their methodological quality. Free-format abstracts were most sensitive to increases in text readability compared with more structured abstracts (IMRAD or 8-headings), yielding opposite effects on their quality and completeness depending on the methodological quality: a worsening in low-quality reviews and an improvement in high-quality ones. Both readability indices and PRISMA for Abstracts total scores showed an inverse relationship with social media activity and bibliometric results in high methodological quality reviews but not in those of lower quality.
    Our results suggest that increasing abstract readability should be especially considered when writing free-format summaries of high-quality reviews, because it correlates with an improvement in their completeness and quality, and this may help to achieve broader
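    The readability indices mentioned above can be illustrated with the Flesch Reading Ease formula, FRE = 206.835 − 1.015 × (words/sentences) − 84.6 × (syllables/words). The sketch below uses a crude vowel-group syllable counter, which is an approximation only; the published analysis would have used validated readability tools.

```python
import re

def count_syllables(word):
    """Crude syllable count: number of vowel groups (approximation)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Flesch Reading Ease; higher scores indicate easier text.

    FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * len(words) / len(sentences)
            - 84.6 * syllables / len(words))

easy = flesch_reading_ease("The cat sat on the mat. It was warm.")
```

Short sentences of monosyllables score as very easy, while dense polysyllabic prose (typical of structured scientific abstracts) scores far lower, which is the effect the study above quantifies.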

  20. Factor Analysis in Assessing the Research Methodology Quality of Systematic Reviews

    Directory of Open Access Journals (Sweden)

    Andrada Elena URDA-CÎMPEAN

    2011-12-01

    Introduction: Many high-quality systematic reviews available from medical journals, databases, and other electronic sources differ in quality and provide different answers to the same question. The literature recommends the use of a checklist-type approach, which overcomes many of the problems associated with measurements. Aim: This study proposes to identify, in a checklist-type approach, the factors (from a methodological point of view) most commonly used in assessing the quality of systematic reviews, and then to mirror the actual state of medical writing. We want to analyze the factors' occurrence and/or their development in the text and in the abstract of systematic reviews published in 2011. Methods: The present study randomly selected only free full-text systematic reviews published in 2011, found in PubMed and in the Cochrane Database. The most commonly used factors were identified in the PRISMA statement and in quality measurement tools. Results: The evaluated systematic reviews mentioned or developed several of the factors studied. Only 78% of the papers surveyed used the correct IMRAD format, and 59% of them mentioned the sample size used. The correspondence between the content of a paper and its abstract amounts to proportions of 54.63% and 51.85% for the two sets of factors, which can lead to poor appreciation of an article when only abstracts are read. Conclusions: Researchers do not properly take into consideration the assessment tools used for quality evaluation when writing scientific articles. They should place more value on the methodological factors that help assess systematic review quality, while journals are the only party that can enforce quality standards in medical writing.

  1. The implementation of the Quality Costs Methodology in the Public Transport Enterprise in Macedonia

    Directory of Open Access Journals (Sweden)

    Elizabeta Mitreva

    2017-02-01

    The implementation of the TQM (Total Quality Management) strategy in public transport enterprises in Macedonia means improving the quality of services through the examination of business processes, not just in terms of defining, improving, and designing the processes, but also in terms of improving productivity and optimizing the costs of quality. The purpose of this study is to point out the importance of determining the quality of transport services and, in particular, of the methods and techniques for measuring the optimization of business processes. The analysis of quality costs when providing transport services can help managers understand the impact of poor quality on financial results and the bad image it gives the enterprise. In this study, we proposed and applied a model for better performance and higher efficiency of the transport enterprise, through the optimization of business processes, a change in the corporate culture, and use of the complete business potential. The need for this methodology arose from an analysis made in the company of whether it performs any analysis of its costs of quality. The benefits from the utilization of this model will not only increase the business performance of the transport enterprise; the model will also serve as a driving force for continuous improvements to the satisfaction of all stakeholders.

  2. The Methodological Approaches to Formation of the Internal System of Quality of Education of Higher Education Institution

    Directory of Open Access Journals (Sweden)

    Doronin Stepan A.

    2017-12-01

    The necessity of creating a national conception for the formation of the internal quality assurance system of higher education institutions, with the obligatory introduction of a methodological approach into its structure, is substantiated; a variant of its terminological provision, clarifying the content of the concepts of «quality of education», «internal system of quality of education», and «methodological approach», is proposed; the approach is positioned within the system of other standard scientific instruments (method, methodology, program, algorithm); arguments for orienting the methodology of formation of the internal quality assurance system of higher education institutions towards the socio-cultural model of the paradigm are provided; and a hierarchical classification of methodological approaches, distinguishing the philosophical, scientific, concrete-scientific, and technological levels and characterizing their purpose and contents, is presented.

  3. Quality at the source (QATS) system design under six sigma methodology

    Energy Technology Data Exchange (ETDEWEB)

    Aguirre, F; Ballasteros, I; Maricalva, J [Empresa Nacional del Uranio, S.A. (ENUSA), Nuclear Fuel Manufacturing Plant, Juzbado, Salamanca (Spain)

    2000-07-01

    One of the main objectives in the manufacturing of fuel assemblies is to fulfill customer expectations with a product whose reliability is assured during its stay in the NPP. By means of the QATS system design under the 6-Sigma methodology, all customer requirements are included in the product specifications and drawings. Product characteristics and process variables are classified, and process capability is evaluated. All this information makes it possible to identify CTQ (Critical to Quality) product characteristics and process variables, and to define a quality system (QATS) based on process and on-line characteristics control handled by the manufacturing workers. In the end, this system ensures continuous product quality improvement and a strong commitment to the customer requirements. (author)
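    The process-capability evaluation mentioned above is conventionally expressed through the Cp and Cpk indices. The following is an illustrative sketch, not ENUSA's actual QATS implementation; the sample data, the specification limits, and the 1.33 CTQ cutoff noted in the comment are assumptions.

```python
import statistics

def process_capability(samples, lsl, usl):
    """Cp and Cpk for a measured product characteristic.

    Cp  = (USL - LSL) / (6 * sigma)              # potential capability
    Cpk = min(USL - mu, mu - LSL) / (3 * sigma)  # penalizes off-center processes
    Characteristics with low Cpk (a common cutoff is 1.33) would be
    flagged as Critical to Quality and put under tighter on-line control.
    """
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Invented example: a dimensional characteristic with spec 9.0 +/- 0.3
measurements = [9.05, 8.95, 9.10, 9.00, 8.90, 9.05, 9.00, 8.95]
cp, cpk = process_capability(measurements, lsl=8.7, usl=9.3)
```

For a perfectly centered process Cp equals Cpk; a gap between them signals that the mean has drifted toward one specification limit.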

  4. Quality at the source (QATS) system design under six sigma methodology

    International Nuclear Information System (INIS)

    Aguirre, F.; Ballasteros, I.; Maricalva, J.

    2000-01-01

    One of the main objectives in the manufacturing of fuel assemblies is to fulfill customer expectations with a product whose reliability is assured during its stay in the NPP. By means of the QATS system design under the 6-Sigma methodology, all customer requirements are included in the product specifications and drawings. Product characteristics and process variables are classified, and process capability is evaluated. All this information makes it possible to identify CTQ (Critical to Quality) product characteristics and process variables, and to define a quality system (QATS) based on process and on-line characteristics control handled by the manufacturing workers. In the end, this system ensures continuous product quality improvement and a strong commitment to the customer requirements. (author)

  5. Development of a methodology for automated assessment of the quality of digitized images in mammography

    International Nuclear Information System (INIS)

    Santana, Priscila do Carmo

    2010-01-01

    The process of evaluating the quality of radiographic images in general, and mammography in particular, can be much more accurate, practical, and fast with the help of computer analysis tools. The purpose of this study is to develop a computational methodology to automate the process of assessing the quality of mammography images through digital image processing (PDI) techniques, using an existing image processing environment (ImageJ). With the application of PDI techniques it was possible to extract geometric and radiometric characteristics of the evaluated images. The evaluated parameters include spatial resolution, high-contrast detail, low-contrast threshold, linear detail of low contrast, tumor masses, contrast ratio, and background optical density. The results obtained with this method were compared with the results of the visual evaluations performed by the Health Surveillance of Minas Gerais. This comparison demonstrated that the automated methodology is a promising alternative for reducing or eliminating the subjectivity in the visual assessment methodology currently in use. (author)
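    Two of the parameters listed above, contrast ratio and noise-related detectability, can be illustrated with simple region-of-interest (ROI) statistics. This is a generic sketch, not the authors' ImageJ methodology; the ROI pixel values are invented.

```python
import statistics

def contrast_ratio(detail_pixels, background_pixels):
    """Relative difference between the mean pixel value inside a
    detail ROI (e.g. a low-contrast disc in a phantom image) and the
    mean of the surrounding background ROI."""
    d = statistics.mean(detail_pixels)
    b = statistics.mean(background_pixels)
    return abs(d - b) / b

def contrast_to_noise(detail_pixels, background_pixels):
    """CNR: mean signal difference divided by the background noise
    (sample standard deviation of the background ROI)."""
    d = statistics.mean(detail_pixels)
    b = statistics.mean(background_pixels)
    return abs(d - b) / statistics.stdev(background_pixels)

# Invented ROI pixel values from a hypothetical digitized phantom image
cr = contrast_ratio([120, 122, 118, 121], [100, 101, 99, 100])
```

Computing such metrics over fixed ROIs removes the observer from the loop, which is the subjectivity the abstract aims to eliminate.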

  6. Methodology for Air Quality Forecast Downscaling from Regional- to Street-Scale

    Science.gov (United States)

    Baklanov, Alexander; Nuterman, Roman; Mahura, Alexander; Amstrup, Bjarne; Hansen Saas, Bent; Havskov Sørensen, Jens; Lorenzen, Thomas; Weismann, Jakob

    2010-05-01

    The most serious air pollution events occur in cities where high population density combines with air pollution, e.g. from vehicles. The pollutants can lead to serious human health problems, including asthma, irritation of the lungs, bronchitis, pneumonia, decreased resistance to respiratory infections, and premature death. In particular, air pollution is associated with increases in cardiovascular disease and lung cancer. In 2000, the WHO estimated that between 2.5% and 11% of total annual deaths are caused by exposure to air pollution. However, European-scale air quality models are not suited for local forecasts, as their grid cells are typically of the order of 5 to 10 km and they generally lack a detailed representation of urban effects. Two suites are used in the framework of the EC FP7 project MACC (Monitoring of Atmosphere Composition and Climate) to demonstrate how downscaling from the European MACC ensemble to local-scale air quality forecasts will be carried out: one illustrates capabilities for the city of Copenhagen (Denmark); the second focuses on the city of Bucharest (Romania). This work is devoted to the first suite and addresses methodological aspects of downscaling from the regional (European/Danish) to the urban scale (Copenhagen), and from the urban scale down to the street scale. The first results of downscaling according to the proposed methodology are presented. The potential for downscaling European air quality forecasts by operating urban and street-level forecast models is evaluated. This will provide strong support for the continuous improvement of the regional forecast modelling systems for air quality in Europe, and underline clear perspectives for future regional air quality core and downstream services for end-users. At the end of the MACC project, requirements on how to downscale European air quality forecasts to the city and street levels with different approaches will be formulated.

  7. The Holy Trinity of Methodological Rigor: A Skeptical View

    Science.gov (United States)

    Coryn, Chris L. S.

    2007-01-01

    The author discusses validation hierarchies grounded in the tradition of quantitative research that generally consists of the criteria of validity, reliability and objectivity and compares this with similar criteria developed by the qualitative tradition, described as trustworthiness, dependability and confirmability. Although these quantitative…

  8. Methodology for oversizing marginal quality riprap for erosion control at uranium mill tailings sites

    International Nuclear Information System (INIS)

    Staub, W.P.; Abt, S.R.

    1987-01-01

    Properly selected and oversized local sources of riprap may provide superior erosion protection compared with revegetation at a number of uranium mill tailings sites in arid regions of the United States. Whereas highly durable rock is appropriate for protecting diversion channels to the height of the 5-year flood, marginal quality rock may be adequate for protecting infrequently flooded side slopes of diversion channels, tailings embankments, and caps. Marginal quality rock may require oversizing to guarantee that design size specifications are met at the end of the performance period (200 to 1000 years). This paper discusses a methodology for oversizing marginal quality rock. Results of cyclic freezing and thawing tests are used to determine oversizing requirements as functions of the performance period and environment. Test results show that marginal quality rock may be used in frequently saturated areas, but in some cases oversizing will be substantial and in other cases marginal quality rock may be disqualified. Oversizing of marginal quality rock appears to be a practical reality in occasionally saturated areas (between the 5-year and 100-year floods). Furthermore, oversizing will generally not be required on slopes above the 100-year flood level. 6 refs., 4 tabs
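    One way to turn cyclic freeze-thaw test results into an oversizing factor is to extrapolate the per-cycle mass loss over the performance period and inflate the initial rock size by the inverse cube root of the surviving mass fraction, since size scales roughly with mass^(1/3). The sketch below is illustrative only; the loss rates, the cycle count, and the 50% disqualification threshold are assumptions, not values from this paper.

```python
def oversize_factor(loss_per_cycle, cycles_per_year, years):
    """Oversizing factor for marginal-quality riprap.

    loss_per_cycle: fractional mass loss per lab freeze-thaw cycle
    cycles_per_year: estimated effective freeze-thaw cycles in the field
    years: performance period (e.g. 200 to 1000 years)

    Remaining mass fraction after the period is (1 - loss)**cycles;
    D50 scales roughly with mass**(1/3), so the initial size is
    inflated by the inverse cube root of the surviving fraction.
    """
    n_cycles = cycles_per_year * years
    surviving = (1.0 - loss_per_cycle) ** n_cycles
    if surviving <= 0.5:  # assumed cutoff: rock too degradable, disqualify
        raise ValueError("rock degrades too fast to oversize; disqualify")
    return surviving ** (-1.0 / 3.0)

# Invented rates: 0.005% loss per cycle, 10 cycles/year, 200-year period
factor = oversize_factor(0.00005, 10, 200)
```

The model also captures the paper's two other outcomes: durable rock needs only modest oversizing, while rapidly degrading rock is disqualified rather than oversized.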

  9. The reporting characteristics and methodological quality of Cochrane reviews about health policy research.

    Science.gov (United States)

    Xiu-xia, Li; Ya, Zheng; Yao-long, Chen; Ke-hu, Yang; Zong-jiu, Zhang

    2015-04-01

    The systematic review has increasingly become a popular tool for researching health policy. However, due to the complexity and diversity of health policy research, it has also encountered more challenges. We examined Cochrane reviews on health policy research as a representative sample, providing the first examination of their epidemiological and descriptive characteristics as well as the compliance of their methodological quality with the AMSTAR tool. 99 reviews were included by the inclusion criteria, of which 73% concerned Implementation Strategies, 15% Financial Arrangements, and 12% Governance Arrangements; they involved Public Health (34%), Theoretical Exploration (18%), Hospital Management (17%), Medical Insurance (12%), Pharmaceutical Policy (9%), Community Health (7%), and Rural Health (2%). Only 39% conducted meta-analysis, 49% were reported as updates, and none was rated as of low methodological quality. Our research reveals that the quantity and quality of the evidence should be improved, especially for Financial Arrangements and Governance Arrangements involving Rural Health, Health Care Reform, Health Equity, etc. The reliability of AMSTAR also needs to be tested over a larger range in this field. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  10. Methodology for Assessing the Quality of Agribusiness Activity Based on the Environmentally Responsible Approach

    Directory of Open Access Journals (Sweden)

    Anna Antonovna Anfinogentova

    2017-06-01

    The article is devoted to the research and development of methods for evaluating the quality of the activity of agro-industrial enterprises in the regional economy using an ecological approach. The hypothesis of the study is that the activity of economic entities (including those of agribusiness) must be assessed not only in the context of economic efficiency and effectiveness, but also in the context of environmental ethics and environmental aggression. As initial data, we used indicators of economic statistics of Russian agrarian-oriented regions, as well as data from management reporting for a sample of enterprises in three regions (the Belgorod and Moscow regions and Krasnodar Territory). The article offers an economic-mathematical approach for measuring the level of environmental responsibility of agro-industrial enterprises, based on the basic formula of the Mandelbrot set and the Hurst statistical indicator. Our scientific contribution is the development of a modified methodology for assessing the quality of the activity of agro-industrial enterprises using a parameter characterizing the level of environmental ethics and environmental aggression of these entities. The main result of the study is the approbation of the method, which has shown its practical applicability and relative coherence with certain indicators of regional ecological statistics. The proposed method is characterized by the integration of different mathematical approaches and serves as an adaptive assessment tool that can be used to assess the quality of the activity of agro-industrial enterprises as well as enterprises in other industries and fields of the economy. In further work, the authors plan to develop methodological approaches to assessing the quality of agro-industrial products, with the main attention paid to the ecological and social components of quality.
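    The Hurst statistical indicator mentioned above is commonly estimated with the rescaled-range (R/S) method: H is the slope of log(R/S) against log(window size), with H ≈ 0.5 indicating a random walk, H > 0.5 persistence, and H < 0.5 anti-persistence. The sketch below is a generic textbook estimator, not the authors' formulation.

```python
import math
import statistics

def hurst_rescaled_range(series):
    """Estimate the Hurst exponent H via the rescaled-range statistic.

    For each window size, the series is split into windows, the R/S
    statistic (range of the cumulative deviation profile divided by
    the window's standard deviation) is averaged, and H is taken as the
    least-squares slope of log(R/S) against log(window size).
    """
    def rs(chunk):
        mean = statistics.fmean(chunk)
        cum, profile = 0.0, []
        for x in chunk:
            cum += x - mean
            profile.append(cum)
        r = max(profile) - min(profile)  # range of cumulative deviations
        s = statistics.pstdev(chunk)     # within-window dispersion
        return r / s if s > 0 else 0.0

    n = len(series)
    xs, ys = [], []
    for size in (n // 4, n // 2, n):
        windows = [series[i:i + size] for i in range(0, n - size + 1, size)]
        xs.append(math.log(size))
        ys.append(math.log(statistics.fmean(rs(w) for w in windows)))
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# A strongly trending (persistent) series yields H close to 1
h_trend = hurst_rescaled_range(list(range(64)))
```

For an alternating series such as [0, 1, 0, 1, …] the estimate drops toward 0, the anti-persistent extreme, which is how such an indicator can separate stable from erratic time series.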

  11. Methodological review of the quality of reach out and read: does it "work"?

    Science.gov (United States)

    Yeager Pelatti, Christina; Pentimonti, Jill M; Justice, Laura M

    2014-04-01

    A considerable percentage of American children and adults fail to learn adequate literacy skills and read below a third grade level. Shared book reading is perhaps the single most important activity to prepare young children for success in reading. The primary objective of this manuscript was to critically review the methodological quality of Reach Out and Read (ROR), a clinically based literacy program/intervention that teaches parents strategies to incorporate while sharing books with children as a method of preventing reading difficulties and academic struggles. A PubMed search was conducted. Articles that met three criteria were considered. First, the study must be clinically based and include parent contact with a pediatrician. Second, parental counseling ("anticipatory guidance") about the importance of parent-child book reading must be included. Third, only experimental or quasi-experimental studies were included; no additional criteria were used. Published articles from any year and peer-reviewed journal were considered. Study quality was determined using a modified version of the Downs and Black (1998) checklist assessing four categories: (1) Reporting, (2) External Validity, (3) Internal Validity-Bias, and (4) Internal Validity-Confounding. We were also interested in whether quality differed based on study design, children's age, sample size, and study outcome. Eleven studies met the inclusion criteria. The overall quality of evidence was variable across all studies; the Reporting and External Validity categories were relatively strong, while methodological concerns were found in the area of internal validity. Quality scores differed on the four study characteristics. Implications related to clinical practice and future studies are discussed.

  12. Methodological quality of meta-analyses on treatments for chronic obstructive pulmonary disease: a cross-sectional study using the AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool.

    Science.gov (United States)

    Ho, Robin S T; Wu, Xinyin; Yuan, Jinqiu; Liu, Siya; Lai, Xin; Wong, Samuel Y S; Chung, Vincent C H

    2015-01-08

    Meta-analysis (MA) of randomised trials is considered to be one of the best approaches for summarising high-quality evidence on the efficacy and safety of treatments. However, methodological flaws in MAs can reduce the validity of conclusions, subsequently impairing the quality of decision making. To assess the methodological quality of MAs on COPD treatments, we performed a cross-sectional study on MAs of COPD trials. MAs published during 2000-2013 were sampled from the Cochrane Database of Systematic Reviews and the Database of Abstracts of Reviews of Effects. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. Seventy-nine MAs were sampled. Only 18% considered the scientific quality of primary studies when formulating conclusions and 49% used appropriate meta-analytic methods to combine findings. The problems were particularly acute among MAs on pharmacological treatments. In 48% of MAs the authors did not report conflicts of interest. Fifty-eight percent reported harmful effects of treatment. Publication bias was not assessed in 65% of MAs, and only 10% had searched non-English databases. The methodological quality of the included MAs was disappointing. Consideration of scientific quality when formulating conclusions should be made explicit. Future MAs should improve on reporting of conflicts of interest and harms, assessment of publication bias, prevention of language bias and use of appropriate meta-analytic methods.

  13. THE QUALITY IMPROVEMENT OF PRIMER PACKAGING PROCESS USING SIX SIGMA METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Prima Ditahardiyani

    2008-01-01

    Full Text Available The implementation of Six Sigma has become a common theme in many organizations. This paper presents the Six Sigma methodology and its implementation in a primer packaging process of Cranberry drink. The DMAIC (Define, Measure, Analyze, Improve and Control) approach is used to analyze and improve the primer packaging process, which had high variability and defective output. After the improvement, the results showed an increased sigma level. However, the increase is not significant and world-standard quality has not yet been achieved. Therefore, the implementation of Six Sigma in the primer packaging process of Cranberry drink still leaves room for further research.

  14. The methodological quality of systematic reviews comparing temporomandibular joint disorder surgical and non-surgical treatment

    Directory of Open Access Journals (Sweden)

    Vasconcelos Belmiro CE

    2008-09-01

    Full Text Available Abstract Background Temporomandibular joint disorders (TMJD) are multifactorial, complex clinical problems affecting approximately 60–70% of the general population, with considerable controversy about the most effective treatment. For example, some reports claim success rates of 70% and 83% for non-surgical and surgical treatment, whereas other reports claim success rates of 40% to 70% for self-improvement without treatment. Therefore, the purpose of this study was to (1) identify systematic reviews comparing temporomandibular joint disorder surgical and non-surgical treatment, (2) evaluate their methodological quality, and (3) evaluate the evidence grade within the systematic reviews. Methods A search strategy was developed and implemented for the MEDLINE, Cochrane Library, LILACS, and Brazilian Dentistry Bibliography databases. Inclusion criteria were: systematic reviews (± meta-analysis) comparing surgical and non-surgical TMJD treatment, published in English, Spanish, Portuguese, Italian, or German between 1966 and 2007 (up to July). Exclusion criteria were: in vitro or animal studies; narrative reviews, editorials, or editorial letters; and articles published in other languages. Two investigators independently selected and evaluated the systematic reviews. Three different instruments (AMSTAR, OQAQ and CASP) were used to evaluate methodological quality, and the results averaged. The GRADE instrument was used to evaluate the evidence grade within the reviews. Results The search strategy identified 211 reports, of which 2 were systematic reviews meeting the inclusion criteria. The first review met 23.5 ± 6.0% and the second met 77.5 ± 12.8% of the methodological quality criteria (mean ± SD). In these systematic reviews, between 9 and 15% of the trials were graded as high quality, and 2 and 8% of the total number of patients were involved in these studies. Conclusion The results indicate that in spite of the widespread impact of TMJD, and the multitude of

  15. Reporting and methodological quality of survival analysis in articles published in Chinese oncology journals.

    Science.gov (United States)

    Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying

    2017-12-01

    Survival analysis methods have gained widespread use in the field of oncology. For reliable results, the quality of the methodological process and of reporting is crucial. This review provides the first examination of the methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. The aims were to examine the methodological and reporting quality of survival analysis, to identify common deficiencies, to suggest desirable precautions in the analysis, and to offer related advice for authors, readers, and editors. A total of 242 survival analysis articles were included for evaluation from the 1492 articles published in 4 leading Chinese oncology journals in 2013. Articles were evaluated according to 16 established items for proper use and reporting of survival analysis. The application rates of Kaplan-Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used a parametric method for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 articles were under 80% (the lowest was 75.25%) and 55 articles were 100%. The report rates of all types of survival endpoint were lower than 10%. Eleven of the 100 articles that reported a loss to follow-up stated how it was treated in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks for the proportional hazards assumption; no report of testing for interactions and collinearity between independent variables; and no report of the calculation method for sample size. Thirty-six articles (32.74%) reported the methods of independent variable selection. The above defects could lead to potentially inaccurate results.
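For readers unfamiliar with the Kaplan-Meier method that 91.74% of the surveyed articles applied, a minimal illustrative sketch of the estimator (a generic implementation with hypothetical toy data, not taken from the reviewed articles) is:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of S(t) from right-censored data.

    times  -- follow-up time for each subject
    events -- 1 if the event (e.g. death) was observed, 0 if censored
    Returns a list of (event_time, survival_probability) steps.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = at = 0
        # group all subjects sharing this follow-up time
        while i < len(data) and data[i][0] == t:
            at += 1
            d += data[i][1]
            i += 1
        if d:  # the curve drops only at observed event times
            s *= 1.0 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= at
    return curve

# hypothetical toy cohort: 6 subjects, two of them censored
times = [1, 2, 3, 4, 4, 5]
events = [1, 1, 0, 1, 1, 0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))  # prints: 1 0.833 / 2 0.667 / 4 0.222
```

Censored subjects (events = 0) leave the risk set without dropping the curve, which is exactly why unreported follow-up and undefined survival times, as flagged in the review above, undermine the estimate.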

  16. Water Quality Research Program: Development of Unstructured Grid Linkage Methodology and Software for CE-QUAL-ICM

    National Research Council Canada - National Science Library

    Chapman, Raymond

    1997-01-01

    This study was conducted for the purpose of developing a methodology and associated software for linking hydrodynamic output from the RMA10 finite element model to the CE-QUAL-ICM finite volume water quality model...

  17. Realizing rigor in the mathematics classroom

    CERN Document Server

    Hull, Ted H (Henry); Balka, Don S

    2014-01-01

    Rigor put within reach! Rigor: the Common Core has made it policy, and this first-of-its-kind guide takes math teachers and leaders through the process of making it reality. Using the Proficiency Matrix as a framework, the authors offer proven strategies and practical tools for successful implementation of the CCSS mathematical practices, with rigor as a central objective. You'll learn how to: define rigor in the context of each mathematical practice; identify and overcome potential issues, including differentiating instruction and using data

  18. The HAZOP methodology applied to the study of the quality and the productivity

    International Nuclear Information System (INIS)

    Angel G, J.C.

    1996-01-01

    This article describes an adaptation of the HAZOP method, used in risk management, to the study and solution of problems related to the quality and productivity of raw materials, processes, products and services. The described methodology is based on defining 'intentions', or objectives, for each part of the process, sub-process, product or service, with the purpose of finding 'deviations', or quality and productivity problems, with the use of guide words. Each deviation should be analyzed to determine its causes and consequences, with the purpose of defining the pertinent corrective actions. The work of interdisciplinary groups is proposed as an unavoidable requirement, as is the will of their members to make things better every day

  19. Quality assurance and quality control methodologies used within the Austrian UV monitoring network

    International Nuclear Information System (INIS)

    Mario, B.

    2004-01-01

    The Austrian UVB monitoring network has been operational since 1997. Nine detectors for measuring erythemally weighted solar UV irradiance are distributed over Austria in order to cover the main populated areas as well as different levels of altitude. The detectors are calibrated to indicate the UV Index, the internationally agreed unit for erythemally weighted solar UV irradiance. Calibration is carried out in the laboratory to determine the spectral sensitivity of each detector, and under the sun for absolute comparison with a well-calibrated, double-monochromator spectroradiometer. For the conversion from detector-weighted units to erythemally weighted units a lookup table is used, which is calculated using a radiative transfer model and which reflects the dependence of the conversion on the solar zenith angle and the total ozone content of the atmosphere. The uncertainty of the calibration is about ±7%, dominated by the uncertainty of the calibration lamp for the spectroradiometer (±4%). The long-term stability of this type of detector has been found to be unsatisfactory; therefore, all detectors are completely recalibrated every year. Variations of the calibration factors of up to ±10% are found. Thus, during routine operation, several measures are taken for quality control. The measured data are compared to results of model calculations with a radiative transfer model, where clear sky and an aerosol-free atmosphere are assumed. At each site, the UV data are also compared with data from a co-located pyranometer measuring total solar irradiance. These two radiation quantities are well correlated, especially on clear days and when the ozone content is taken into account. If suspicious measurements are found for one detector in the network, a well-calibrated travelling reference detector of the same type is set up side-by-side, allowing the identification of relative differences of ∼3%. If necessary, a recalibration is carried out. As the main aim

  20. Evaluation of the coat quality of sustained release pellets by individual pellet dissolution methodology.

    Science.gov (United States)

    Xu, Min; Liew, Celine Valeria; Heng, Paul Wan Sia

    2015-01-15

    This study explored the application of the 400-DS dissolution apparatus 7 for an individual pellet dissolution methodology using a design-of-experiments approach, and compared its capability with that of USP dissolution apparatus 1 and 2 for differentiating the coat quality of sustained release pellets. Drug-loaded pellets were prepared by extrusion-spheronization from powder blends comprising 50% w/w metformin, 25% w/w microcrystalline cellulose and 25% w/w lactose, and then coated with ethyl cellulose to produce sustained release pellets with 8% and 10% w/w coat weight gains. Various pellet properties were investigated, including the cumulative drug release behaviours of ensemble and individual pellets. When USP dissolution apparatus 1 and 2 were used for the drug release study of the sustained release pellets, floating and clumping of pellets were observed and confounded the release profiles of the ensemble pellets. Hence, the release profiles obtained did not characterize the actual drug release from individual pellets, and the applicability of USP dissolution apparatus 1 and 2 for evaluating the coat quality of sustained release pellets was limited. The cumulative release profile of an individual pellet using the 400-DS dissolution apparatus 7 was found to be more precise at distinguishing differences in the applied coat quality. The dip speed and dip interval of the reciprocating holder were critical operational parameters of the 400-DS dissolution apparatus 7 that affected the drug release rate of a sustained release pellet during the individual dissolution study.
The individual dissolution methodology using the 400-DS dissolution apparatus 7 is a promising technique to evaluate individual pellet coat quality without the influence of confounding factors such as the pellet floating and clumping observed during drug release tests with dissolution apparatus 1 and 2, as well as to facilitate the elucidation of the actual drug release mechanism conferred by the applied sustained release coat.

  1. A new transmission methodology for quality assurance in radiotherapy based on radiochromic film measurements

    Science.gov (United States)

    do Amaral, Leonardo L.; Pavoni, Juliana F.; Sampaio, Francisco; Netto, Thomaz Ghilardi

    2015-01-01

    Although individual quality assurance (QA) is recommended for complex techniques in radiotherapy (RT) treatment, the possibility of errors in dose delivery during therapeutic application has been verified. Therefore, it is fundamentally important to conduct in vivo QA during treatment. This work presents an in vivo transmission quality control methodology, using radiochromic film (RCF) coupled to the linear accelerator (linac) accessory holder. This QA methodology compares the dose distribution measured by the film in the linac accessory holder with the dose distribution expected by the treatment planning software. The calculated dose distribution is obtained in the coronal and central plane of a phantom with the same dimensions as the acrylic support used for positioning the film, but at a source-to-detector distance (SDD) of 100 cm, as a result of transferring the IMRT plan in question with all the fields positioned with the gantry vertical, that is, perpendicular to the phantom. To validate this procedure, first a Monte Carlo simulation using the PENELOPE code was done to evaluate the differences between the dose distributions measured by the film at SDDs of 56.8 cm and 100 cm. After that, several simple dose distribution tests were evaluated using the proposed methodology, and finally a study using IMRT treatments was done. In the Monte Carlo simulation, the mean percentage of points approved in the gamma function comparing the dose distributions acquired at the two SDDs was 99.92% ± 0.14%. In the simple dose distribution tests, the mean percentage of points approved in the gamma function was 99.85% ± 0.26% and the mean percentage difference in the normalization point doses was −1.41%. The transmission methodology was approved in 24 of 25 IMRT test irradiations. Based on these results, it can be concluded that the proposed methodology using RCFs can be applied for in vivo QA in RT treatments. PACS numbers: 87.55.Qr, 87.55.km, 87.55.N‐
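The gamma-function comparison reported in the abstract above combines a dose-difference criterion with a distance-to-agreement criterion. As a simplified, hypothetical illustration (a 1-D global gamma analysis on synthetic flat profiles; the authors' actual analysis is 2-D and uses their measured distributions):

```python
def gamma_pass_rate(ref, eval_, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """1-D global gamma analysis (e.g. 3%/3 mm) of two dose profiles.

    ref, eval_  -- dose samples on the same regular grid
    spacing_mm  -- grid spacing in millimetres
    Returns the fraction of reference points with gamma <= 1.
    """
    d_max = max(ref)  # global normalisation dose
    passed = 0
    for i, dr in enumerate(ref):
        # gamma^2 = min over evaluated points of (dose term)^2 + (distance term)^2
        gamma_sq = min(
            ((de - dr) / (dose_tol * d_max)) ** 2
            + ((j - i) * spacing_mm / dist_tol_mm) ** 2
            for j, de in enumerate(eval_)
        )
        if gamma_sq <= 1.0:
            passed += 1
    return passed / len(ref)

# identical flat profiles pass everywhere; a uniform +5% dose error
# cannot be rescued by the 3 mm distance search, so every point fails
flat = [100.0] * 41
print(gamma_pass_rate(flat, flat, spacing_mm=1.0))                      # 1.0
print(gamma_pass_rate(flat, [d * 1.05 for d in flat], spacing_mm=1.0))  # 0.0
```

The "percentage of points approved" figures quoted in the abstract (e.g. 99.92% ± 0.14%) are pass rates of exactly this kind of test, computed over the full measured dose distributions.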

  2. Methodological Quality of Systematic Reviews Published in the Urological Literature from 1998 to 2012.

    Science.gov (United States)

    Corbyons, Katherine; Han, Julia; Neuberger, Molly M; Dahm, Philipp

    2015-11-01

    Systematic reviews synthesize the current best evidence to address a clinical question. Given the growing emphasis on evidence-based clinical practice, systematic reviews are being increasingly sought after and published. We previously reported limitations in the methodological quality of 57 individual systematic reviews published from 1998 to 2008. We provide an update to our previous study, adding systematic reviews published from 2009 to 2012. We systematically searched PubMed® and hand searched the table of contents of 4 major urological journals to identify systematic reviews related to questions of prevention and therapy. Two independent reviewers with prior formal evidence-based medicine training assessed the methodological quality using the validated 11-point AMSTAR (A Measurement Tool to Assess Systematic Reviews) instrument. We performed predefined statistical hypothesis testing for differences by publication period (1998 to 2008 vs 2009 to 2012) and journal of publication. We performed statistical testing using SPSS®, version 23.0 with a 2-sided α of 0.05 using the Student t-test, ANOVA and the chi-square test. A total of 113 systematic reviews published from 2009 to 2012 met study inclusion criteria. The most common topics were oncology (44 reviews or 38.9%), voiding dysfunction (26 or 23.0%) and stones/endourology (13 or 11.5%). The largest contributor was European Urology (46 reviews or 40.7%), followed by BJU International (31 or 27.4%) and The Journal of Urology® (22 or 19.5%). The mean ± SD AMSTAR score for the 2009 to 2012 period was 5.3 ± 2.3 compared to 4.8 ± 2.0 for 1998 to 2008 with a mean difference of 0.5 (95% CI 0.2 to 1.2, p = 0.133). While the number of systematic reviews published in the urological literature has increased substantially, the methodological quality of these studies remains suboptimal. Systematic review authors and editors should make every effort to adhere to well established methodological standards to enhance
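The comparison of mean AMSTAR scores between the two publication periods can be sanity-checked from the summary statistics the abstract reports (n = 57, mean 4.8, SD 2.0 vs n = 113, mean 5.3, SD 2.3). The sketch below uses a Welch t statistic with a normal approximation for the p-value; the abstract's p = 0.133 presumably comes from an exact t-distribution computation in SPSS, so a small discrepancy is expected, but both land in the same non-significant region.

```python
import math

def welch_t_from_summary(m1, sd1, n1, m2, sd2, n2):
    """Welch two-sample t statistic and a normal-approximation p-value,
    computed from summary statistics (means, SDs, group sizes) alone."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)  # standard error of the difference
    t = (m2 - m1) / se
    # two-sided p via the normal approximation (reasonable for large df)
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(t) / math.sqrt(2.0))))
    return t, p

# summary statistics as reported in the abstract above
t, p = welch_t_from_summary(4.8, 2.0, 57, 5.3, 2.3, 113)
print(round(t, 2), round(p, 3))
```

This kind of back-of-envelope check from reported means and SDs is itself a useful habit when appraising the statistics sections of systematic reviews.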

  3. A novel methodology for interpreting air quality measurements from urban streets using CFD modelling

    Science.gov (United States)

    Solazzo, Efisio; Vardoulakis, Sotiris; Cai, Xiaoming

    2011-09-01

    In this study, a novel computational fluid dynamics (CFD) based methodology has been developed to interpret long-term averaged measurements of pollutant concentrations collected at roadside locations. The methodology is applied to the analysis of pollutant dispersion in Stratford Road (SR), a busy street canyon in Birmingham (UK), where a one-year sampling campaign was carried out between August 2005 and July 2006. Firstly, a number of dispersion scenarios are defined by combining sets of synoptic wind velocity and direction. Assuming neutral atmospheric stability, CFD simulations are conducted for all the scenarios, applying the standard k-ɛ turbulence model, with the aim of creating a database of normalised pollutant concentrations at specific locations within the street. Modelled concentrations for all wind scenarios were compared with hourly observed NOx data. In order to compare with long-term averaged measurements, a weighted average of the CFD-calculated concentration fields was derived, with the weighting coefficients proportional to the frequency of each scenario observed during the examined period (either monthly or annually). In summary, the methodology consists of (i) identifying the main dispersion scenarios for the street based on wind speed and direction data, (ii) creating a database of CFD-calculated concentration fields for the identified dispersion scenarios, and (iii) combining the CFD results based on the frequency of occurrence of each dispersion scenario during the examined period. The methodology has been applied to calculate monthly and annually averaged benzene concentrations at several locations within the street canyon so that a direct comparison with observations could be made. The results of this study indicate that, within the simplifying assumption of non-buoyant flow, CFD modelling can aid understanding of long-term air quality measurements, and help assess the representativeness of monitoring locations for population
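Step (iii) of the methodology, combining per-scenario CFD fields with frequency-proportional weights, reduces to a weighted average. A minimal sketch with hypothetical scenario names and receptor values (the scenario labels, hours, and concentrations below are invented for illustration, not taken from the study):

```python
def long_term_average(fields, frequencies):
    """Combine per-scenario CFD concentration fields into a long-term mean.

    fields       -- {scenario: [concentration at each receptor]}
    frequencies  -- {scenario: hours (or counts) observed in the period}
    The weight of each scenario is proportional to how often it occurred.
    """
    total = sum(frequencies.values())
    n = len(next(iter(fields.values())))
    avg = [0.0] * n
    for scenario, conc in fields.items():
        w = frequencies[scenario] / total  # frequency-proportional weight
        for k, c in enumerate(conc):
            avg[k] += w * c
    return avg

# hypothetical example: two wind scenarios, three receptors in the street
fields = {"SW_light": [40.0, 60.0, 55.0], "NE_strong": [20.0, 25.0, 30.0]}
freq = {"SW_light": 300, "NE_strong": 100}  # hours per scenario in the period
print(long_term_average(fields, freq))  # [35.0, 51.25, 48.75]
```

Changing the frequency table from monthly to annual counts yields the monthly or annually averaged concentrations that the study compares against observations.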

  4. A methodology for texture feature-based quality assessment in nucleus segmentation of histopathology image

    Directory of Open Access Journals (Sweden)

    Si Wen

    2017-01-01

    Full Text Available Context: Image segmentation pipelines are often sensitive to algorithm input parameters. Algorithm parameters optimized for one set of images do not necessarily produce good-quality segmentation results for other images. Even within an image, some regions may not be well segmented due to a number of factors, including multiple pieces of tissue with distinct characteristics, differences in staining of the tissue, normal versus tumor regions, and tumor heterogeneity. Evaluation of the quality of segmentation results is an important step in image analysis. It is very labor intensive to do quality assessment manually with large image datasets, because a whole-slide tissue image may have hundreds of thousands of nuclei. Semi-automatic mechanisms are needed to assist researchers and application developers in efficiently detecting image regions with bad segmentations. Aims: Our goal is to develop and evaluate a machine-learning-based semi-automated workflow to assess the quality of nucleus segmentation results in a large set of whole-slide tissue images. Methods: We propose a quality control methodology in which machine-learning algorithms are trained with image intensity and texture features to produce a classification model. This model is applied to image patches in a whole-slide tissue image to predict the quality of nucleus segmentation in each patch. The training step of our methodology involves the selection and labeling of regions by a pathologist in a set of images to create the training dataset. The image regions are partitioned into patches. A set of intensity and texture features is computed for each patch. A classifier is trained with the features and the labels assigned by the pathologist. At the end of this process, a classification model is generated. The classification step applies the classification model to unlabeled test images. Each test image is partitioned into patches. The classification model is applied to each patch to predict the patch

  5. Use of Proteomic Methodology in Optimization of Processing and Quality Control of Food of Animal Origin

    Directory of Open Access Journals (Sweden)

    Dajana Gašo-Sokač

    2011-01-01

    Full Text Available Food of animal origin, namely meat, seafood, milk and milk products, is the main protein source in human nutrition. These types of food are very complex mixtures that contain proteins and other components, and proteomic techniques enable simultaneous study of several hundred up to several thousand proteins. The use of proteomic methodology for quality control and quality assessment in production as well as for the optimization and development of new manufacturing processes is presented. Newly developed, faster and more selective methods for sample preparation followed by more sensitive mass spectrometry for identification of less abundant proteins are discussed. These techniques will help to understand variations in production, and to find markers for food quality criteria. Furthermore, biologically active peptides in food of animal origin have recently been the focus of proteomic and peptidomic investigations. Isolation and production of biologically active proteins and peptides, including the low abundance ones, will also be a focus of future research. The use of proteomics, peptidomics and metabonomics for the determination of product quality and the detection of adulterations in meat production, seafood identification and in the production of milk and milk products is also discussed.

  6. Evaluation of the quality of the environmental participation: A methodological proposal

    International Nuclear Information System (INIS)

    Zuluaga M, Clara; Carmona M, Sergio Ivan

    2004-01-01

    The advances in the way to sustainability are inseparable to the achievement in the citizenship construction, because the citizen condition is only realized in the proactive compromise with the territorial themes, the environmental management effectiveness requests high quality in their participative processes; therefore, pertinent tools are required to know and to appraise these processes. The goodness of these tools proceeds of their functionality in the knowledge of the participation quality purpose, that to environmental participative processes, is conceived in terms of legitimacy, representatively, democratization of the environmental knowledge, social cohesion, capacity of interlocution, and incidence in the decision making, with the coherent conceptual structure of these facets shapes the theoretical-methodological scaffolding that permits their joining in attributes, variables and indicators relatives to the characteristics of the participative planning processes, proper to account of the environmental participation quality. With the appraisal of the environmental participation quality in planning processes, supported in the integration of the constitutive attributes, is possible to obtain the index that facilitates their diagnostic and improvement

  7. Development of a quality management system for borehole investigations. (1) Quality assurance and quality control methodology for hydraulic packer testing

    International Nuclear Information System (INIS)

    Takeuchi, Shinji; Kunimaru, Takanori; Ota, Kunio; Frieg, Bernd

    2011-01-01

    A quality assurance and quality control (QA/QC) system for hydraulic packer tests has been established based on the surface-based investigations at JAEA's underground research laboratories in Mizunami and Horonobe. The established QA/QC system covers field investigations (data acquisition) and data analysis. For the field investigations, the adopted procedure is: selection of a test section based on detailed fluid logging and checking against a tally list; inspection of test tools such as pressure transducers and shut-in valves; test method selection using a 'sequential hydraulic test' to decide the appropriate method; and finally data quality confirmation by pressure changes and derivatives on log-log plots during testing. Test event logs should also be kept during testing for traceability. For the test data analysis, a quick analysis for rough estimation of hydraulic parameters and a detailed analysis using type curve and/or numerical analyses are conducted stepwise. The established QA/QC system has been applied to recent borehole investigations and its efficiency has been confirmed. (author)

  8. CONTENTS OF THE METHODOLOGICAL AND TECHNOLOGICAL SUPPORT OF THE EDUCATION QUALITY MANAGEMENT INFORMATION SYSTEM FOR FUTURE ECONOMISTS

    Directory of Open Access Journals (Sweden)

    Kostiantyn S. Khoruzhyi

    2014-12-01

    Full Text Available In the article, the content and nature of organizational activities within the scope of methodological and technological support of the education quality management information system (EQMIS) for future economists are described. The content of the organizational activities for the implementation of methodological and technological support of EQMIS for future economists includes four stages (preparatory, instructional/adaptational, methodological/basic, and experimental/evaluational) and contains a set of methodological and technological measures for each of the stages of the EQMIS implementation. A study of the pedagogical impact of the proposed methodology of using EQMIS in the formation of professional competence of economics students was also conducted. The main stages, methods and sequence of implementation arrangements for the methodological and technological support of EQMIS are defined.

  9. Abstract analysis method facilitates filtering low-methodological quality and high-bias risk systematic reviews on psoriasis interventions.

    Science.gov (United States)

    Gómez-García, Francisco; Ruano, Juan; Aguilar-Luque, Macarena; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesús; Hernández-Romero, José Luis; Sanz-Cabanillas, Juan Luis; Maestre-López, Beatriz; González-Padilla, Marcelino; Carmona-Fernández, Pedro J; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz

    2017-12-29

    Article summaries' information and structure may influence researchers'/clinicians' decisions to conduct deeper full-text analyses. Specifically, abstracts of systematic reviews (SRs) and meta-analyses (MAs) should provide structured summaries for quick assessment. This study explored a method for determining the methodological quality and bias risk of full-text reviews using abstract information alone. Systematic literature searches for SRs and/or MAs about psoriasis were undertaken on MEDLINE, EMBASE, and the Cochrane database. For each review, abstract-reporting completeness, full-text methodological quality, and bias risk were evaluated using the Preferred Reporting Items for Systematic Reviews and Meta-analyses for abstracts (PRISMA-A), Assessing the Methodological Quality of Systematic Reviews (AMSTAR), and ROBIS tools, respectively. Article-, author-, and journal-derived metadata were systematically extracted from eligible studies using a piloted template, and explanatory variables concerning abstract-reporting quality were assessed using univariate and multivariate regression models. Two classification models concerning SRs' methodological quality and bias risk were developed based on per-item and total PRISMA-A scores and decision-tree algorithms. This work was supported, in part, by project ICI1400136 (JR). No funding was received from any pharmaceutical company. This study analysed 139 SRs on psoriasis interventions. On average, they featured 56.7% of PRISMA-A items. The mean total PRISMA-A score was significantly higher for high-methodological-quality SRs than for moderate- and low-methodological-quality reviews. SRs with low bias risk showed higher total PRISMA-A values than reviews with high bias risk. In the final model, only 'authors per review > 6' (OR: 1.098; 95%CI: 1.012-1.194), 'academic source of funding' (OR: 3.630; 95%CI: 1.788-7.542), and 'PRISMA-endorsed journal' (OR: 4.370; 95%CI: 1.785-10.98) predicted PRISMA-A variability. 
Reviews with a
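
    The decision-tree classification described above can be illustrated with a minimal sketch. The cut-offs and data below are hypothetical, invented for the illustration; the study derived its actual rules from per-item and total PRISMA-A scores, which are not reproduced here.

```python
# Hypothetical illustration of classifying a review's methodological quality
# from its total PRISMA-A score. The cut-offs below are invented for this
# sketch and are NOT the thresholds learned in the study.

def classify_quality(total_prisma_a: float) -> str:
    """Map a total PRISMA-A score (assumed 0-12 scale) to a quality grade."""
    if total_prisma_a >= 9:    # assumed cut-off
        return "high"
    if total_prisma_a >= 6:    # assumed cut-off
        return "moderate"
    return "low"

print([classify_quality(s) for s in (10.5, 7.0, 4.5)])  # ['high', 'moderate', 'low']
```

    A real decision tree would also branch on individual PRISMA-A items; this sketch keeps only the total-score idea.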

  10. [What is the methodological quality of articles on therapeutic procedures published in Cirugía Española?].

    Science.gov (United States)

    Manterola, Carlos; Busquets, Juli; Pascual, Marta; Grande, Luis

    2006-02-01

    The aim of this study was to determine the methodological quality of articles on therapeutic procedures published in Cirugía Española and to study its association with publication year, center, and subject matter. A bibliometric study that included all articles on therapeutic procedures published in Cirugía Española between 2001 and 2004 was performed. All kinds of clinical designs were considered, excluding editorials, review articles, letters to the editor, and experimental studies. The variables analyzed were: year of publication, center, design, and methodological quality. Methodological quality was determined by a valid and reliable scale. Descriptive statistics (means, standard deviations, and medians) and analytical statistics (Pearson's chi-squared, nonparametric tests, ANOVA, and Bonferroni tests) were used. A total of 244 articles were studied (197 case series [81%], 28 cohort studies [12%], 17 clinical trials [7%], 1 cross-sectional study and 1 case-control study [0.8%]). The studies were performed mainly in Catalonia and Murcia (22% and 16%, respectively). The most frequent subject areas were soft tissue and hepatobiliopancreatic surgery (23% and 19%, respectively). The mean and median of the methodological quality score calculated for the entire series were 10.2 +/- 3.9 points and 9.5 points, respectively. Methodological quality significantly increased by publication year (p < 0.001). An association between methodological quality and subject area was observed, but no association was detected with the center performing the study. The methodological quality of articles on therapeutic procedures published in Cirugía Española between 2001 and 2004 is low. However, a statistically significant trend toward improvement was observed.

  11. Comparison of methodologic quality and study/report characteristics between quantitative clinical nursing and nursing education research articles.

    Science.gov (United States)

    Schneider, Barbara St Pierre; Nicholas, Jennifer; Kurrus, Jeffrey E

    2013-01-01

    To compare the methodologic quality and study/report characteristics between quantitative clinical nursing and nursing education research articles. The methodologic quality of quantitative nursing education research needs to advance to a higher level. Clinical research can provide guidance for nursing education to reach this level. One hundred quantitative clinical research articles from high-impact journals published in 2007 and 37 education research articles from high-impact journals published in 2006-2007 were chosen for analysis. Clinical articles had significantly higher quality scores than education articles in three domains: number of institutions studied, type of data, and outcomes. The findings indicate three ways in which nursing education researchers can strengthen the methodologic quality of their quantitative research. With this approach, greater funding may be secured for advancing the science of nursing education.

  12. Inkjet printed large-area flexible circuits: a simple methodology for optimizing the printing quality

    Science.gov (United States)

    Cheng, Tao; Wu, Youwei; Shen, Xiaoqin; Lai, Wenyong; Huang, Wei

    2018-01-01

    In this work, a simple methodology was developed to enhance the patterning resolution of inkjet printing, involving process optimization as well as substrate modification and treatment. The line width of the inkjet-printed silver lines was successfully reduced to 1/3 of the original value using this methodology. Large-area flexible circuits with delicate patterns and good morphology were thus fabricated. The resultant flexible circuits showed excellent electrical conductivity, with a sheet resistance as low as 4.5 Ω/□, and strong tolerance to mechanical bending. The simple methodology is also applicable to substrates with varying wettability, which suggests a general strategy for enhancing the printing quality of inkjet printing for manufacturing high-performance large-area flexible electronics. Project supported by the National Key Basic Research Program of China (Nos. 2014CB648300, 2017YFB0404501), the National Natural Science Foundation of China (Nos. 21422402, 21674050), the Natural Science Foundation of Jiangsu Province (Nos. BK20140060, BK20130037, BK20140865, BM2012010), the Program for Jiangsu Specially-Appointed Professors (No. RK030STP15001), the Program for New Century Excellent Talents in University (No. NCET-13-0872), the NUPT "1311 Project" and Scientific Foundation (Nos. NY213119, NY213169), the Synergetic Innovation Center for Organic Electronics and Information Displays, the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD), the Leading Talent of Technological Innovation of National Ten-Thousands Talents Program of China, the Excellent Scientific and Technological Innovative Teams of Jiangsu Higher Education Institutions (No. TJ217038), the Program for Graduate Students Research and Innovation of Jiangsu Province (No. KYZZ16-0253), and the 333 Project of Jiangsu Province (Nos. BRA2017402, BRA2015374).

  13. Report on use of a methodology for commissioning and quality assurance of a VMAT system.

    Directory of Open Access Journals (Sweden)

    Charles Mayo

    Full Text Available INTRODUCTION: Results of a methodology for VMAT commissioning and quality assurance, utilizing both control point tests and dosimetric measurements, are presented. METHODS AND MATERIALS: A generalizable phantom measurement approach is used to characterize the accuracy of the measurement system. Correction for the angular response of the measurement system and inclusion of couch structures are used to characterize the full range of gantry angles desirable for clinical plans. A dose-based daily QA measurement approach is defined. RESULTS: Agreement in the static vs. VMAT picket-fence control point test was better than 0.5 mm. Control point tests varying gantry rotation speed, leaf speed, and dose rate demonstrated agreement with predicted values better than 1%. The angular dependence of the MatriXX array varied over a range of 0.94-1.06 with respect to the calibration condition. Phantom measurements demonstrated that central-axis dose accuracy for un-modulated four-field box plans was ≥2.5% vs. 1% with and without angular correction, respectively, with better results for VMAT (0.4%) than IMRT (1.6%) plans. Daily QA results demonstrated average agreement of all three chambers within 0.4% over a 9-month period, with no false positives at a 3% threshold. DISCUSSION: The methodology described is simple in design and characterizes both the inherent limitations of the measurement system and the dose-based measurements that may be directly related to patient plan QA.

  14. Investigation Of Infrared Drying Behaviour Of Spinach Leaves Using ANN Methodology And Dried Product Quality

    Directory of Open Access Journals (Sweden)

    Sarimeseli Ayse

    2015-12-01

    Full Text Available Effects of infrared power output and sample mass on the drying behaviour, colour parameters, ascorbic acid degradation, rehydration characteristics, and some sensory scores of spinach leaves were investigated. Within the ranges of infrared power output (300–500 W) and sample amount (15–60 g), the moisture content of the leaves was reduced from 6.0 to 0.1 ± 0.01 kg water/kg dry base. Drying times of the spinach leaves varied between 3.5–10 min for a constant sample amount and 4–16.5 min for a constant power output. The experimental drying data obtained were successfully modelled using an artificial neural network methodology. Some changes were recorded in the quality parameters of the dried leaves, and acceptable sensory scores for the dried leaves were observed under all of the experimental conditions.

  15. Methodological quality of systematic reviews in subfertility: a comparison of two different approaches.

    Directory of Open Access Journals (Sweden)

    Ivor Popovich

    Full Text Available BACKGROUND: Systematic reviews are used widely to guide health care decisions. Several tools have been created to assess systematic review quality. The measurement tool for assessing the methodological quality of systematic reviews known as the AMSTAR tool applies a yes/no score to eleven relevant domains of review methodology. This tool has been reworked so that each domain is scored based on a four-point scale, producing R-AMSTAR. METHODS AND FINDINGS: We aimed to compare the AMSTAR and R-AMSTAR tools in assessing systematic reviews in the field of assisted reproduction for subfertility. All published systematic reviews on assisted reproductive technology, with the latest search for studies taking place from 2007-2011, were considered. Reviews that contained no included studies or considered diagnostic outcomes were excluded. Thirty Cochrane and thirty non-Cochrane reviews were randomly selected from a search of relevant databases. Both tools were then applied to all sixty reviews. The results were converted to percentage scores and all reviews were graded and ranked on this basis. AMSTAR produced a much wider variation in percentage scores and achieved higher inter-rater reliability than R-AMSTAR according to kappa statistics. The average rating for Cochrane reviews was consistent between the two tools (88.3% for R-AMSTAR versus 83.6% for AMSTAR) but inconsistent for non-Cochrane reviews (63.9% R-AMSTAR vs. 38.5% AMSTAR). In comparing the rankings generated by the two tools, Cochrane reviews changed an average of 4.2 places, compared to 2.9 for non-Cochrane reviews. CONCLUSION: R-AMSTAR provided greater guidance in the assessment of domains and produced quantitative results. However, there were many problems with the construction of its criteria, and AMSTAR was much easier to apply consistently. We recommend that AMSTAR incorporates the findings of this study and produces additional guidance for its application in order to improve its reliability and
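
    The score-to-percentage conversion that makes the two checklists comparable can be sketched as follows. This is a minimal illustration with invented data; the assumed ranges are 0-11 "yes" answers for AMSTAR and 11-44 total points for R-AMSTAR (11 domains scored 1-4).

```python
# Convert AMSTAR and R-AMSTAR totals to percentage scores so reviews can be
# graded and ranked on a common scale. The score ranges are assumptions of
# this sketch, not taken from the study's data.

def amstar_pct(yes_count):
    """AMSTAR: number of 'yes' answers out of 11 domains."""
    return 100.0 * yes_count / 11

def r_amstar_pct(total):
    """R-AMSTAR: 11 domains scored 1-4, so totals span 11-44."""
    return 100.0 * (total - 11) / (44 - 11)

# invented (AMSTAR yes-count, R-AMSTAR total) pairs for three reviews
reviews = {"A": (9, 40), "B": (5, 28), "C": (7, 33)}
by_amstar = sorted(reviews, key=lambda r: amstar_pct(reviews[r][0]), reverse=True)
by_ramstar = sorted(reviews, key=lambda r: r_amstar_pct(reviews[r][1]), reverse=True)
print(by_amstar, by_ramstar)  # rank changes between tools can then be compared
```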

  16. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    Science.gov (United States)

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool provides the possibility of finding the steps in an analytical procedure with the greatest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  17. Rigorous Science: a How-To Guide

    Directory of Open Access Journals (Sweden)

    Arturo Casadevall

    2016-11-01

    Full Text Available Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education.

  18. Reducing DNACPR complaints to zero: designing and implementing a treatment escalation plan using quality improvement methodology.

    Science.gov (United States)

    Shermon, Elizabeth; Munglani, Laura; Oram, Sarah; William, Linda; Abel, Julian

    2017-01-01

    Do Not Attempt Resuscitation (DNAR) decisions have traditionally formed the basis of ceiling-of-care discussions. However, poor-quality discussions can lead to high patient and relative dissatisfaction, generating hospital complaints. Treatment escalation plans (TEPs) aim to highlight the wider remit of treatment options, with a focus on effective communication. We aimed to improve TEP discussions and documentation at Weston General Hospital by introducing a standardised form. We aimed to develop a TEP document to reduce resuscitation-related complaints by improving communication and documentation. Qualitative and quantitative data were collected over 2 years and used to develop plan-do-study-act (PDSA) cycles using quality improvement methodology. The main barriers to improvement included time constraints and clinicians' resistance. Analysis of patient liaison services data showed a progressive reduction in complaints regarding resuscitation, with no complaints having been received for the final six months of the project. Through use of a standardised form including treatment prompts, the quality of discussions and plans improved. Qualitative feedback demonstrated increased patient and relative satisfaction. In addition, junior doctors reported that the plans are helpful when making out-of-hours decisions. Development of a user-friendly form to document patient-guided TEPs helped junior doctors to lead advance care planning discussions. The use of PDSA cycles demonstrated improvement in the quality of the forms, which in turn improved communication, documentation, and satisfaction. Future developments could include the involvement of specialist teams to ensure TEP forms remain relevant to all clinical areas. In addition, with widespread use of the TEP forms, the traditional tick-box DNAR could be replaced to focus on patient-led care planning.

  19. [Efficiency versus quality in the NHS, in Portugal: methodologies for evaluation].

    Science.gov (United States)

    Giraldes, Maria do Rosário

    2008-01-01

    To proceed to the evaluation of efficiency and quality in the NHS, based on methodologies for the evaluation of management, benchmarking indicators, and process and outcome indicators. The 1980s and 1990s saw the proliferation of all forms of process indicators as a way to control health services. It is no coincidence that the increase in managed care has been accompanied by an explosion of process indicators, as has happened in the health system of the USA. More recently, attention has turned away from measures of performance that capture the process (what has been done) toward those that capture outcomes (what the result was). Quality indicators have been developed in Europe, first for use in hospitals, but also for use in primary health care. Conceptually, the justification for the introduction of process indicators comes from the principle that their use will reinforce a change in the quality of the procedures, which will give rise to better outcomes at the population level as well as resource savings. Comparison of outcome and process indicators in health care shows that process indicators have the advantage of being more sensitive than outcome indicators to differences in quality. Optimizing health care quality has the objective of establishing a quantitative relationship between the quality of health services and cost-effectiveness, of identifying quality and benchmarking indicators, and of implementing plans to measure the quality of health care. In a study of a group of senior GPs in the UK, with the objective of determining which process indicators best reflect the quality of services in primary health care, a Delphi method was used. Only seven indicators were chosen by 75% of the respondents: the percentage of eligible patients receiving cervical screening; the percentage of generic prescribing; the percentage of eligible patients receiving childhood immunization; the percentage of eligible

  20. Clinical audit, a valuable tool to improve quality of care: General methodology and applications in nephrology

    Science.gov (United States)

    Esposito, Pasquale; Dal Canton, Antonio

    2014-01-01

    Evaluation and improvement of the quality of care provided to patients are of crucial importance in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment, and clinical audit. The clinical audit consists of measuring a clinical outcome or a process against well-defined standards, set on the principles of evidence-based medicine, in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia, and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, overall the present evidence is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principles of clinical audit, focusing on experiences performed in nephrology settings. PMID:25374819

  1. Methodological and Statistical Quality in Research Evaluating Nutritional Attitudes in Sports.

    Science.gov (United States)

    Kouvelioti, Rozalia; Vagenas, George

    2015-12-01

    The assessment of dietary attitudes and behaviors provides information of interest to sports nutritionists. Although there has been little analysis of the quality of research undertaken in this field, there is evidence of a number of flaws and methodological concerns in some of the studies in the available literature. This review undertook a systematic assessment of the attributes of research assessing the nutritional knowledge and attitudes of athletes and coaches. Sixty questionnaire-based studies were identified by a search of official databases using specific key terms, with subsequent analysis against specific inclusion-exclusion criteria. These studies were then analyzed using 33 research quality criteria related to the methods, questionnaires, and statistics used. We found that many studies did not provide information on critical issues such as research hypotheses (92%), the gaining of ethics approval (50%) or informed consent (35%), or acknowledgment of limitations in the implementation of studies or interpretation of data (72%). Many of the samples were nonprobabilistic (85%) and rather small (42%). Many questionnaires were of unknown origin (30%), validity (72%), and reliability (70%) and resulted in low (≤ 60%) response rates (38%). Pilot testing was not undertaken in 67% of the studies. Few studies dealt with sample size (2%), power (3%), assumptions (7%), confidence intervals (3%), or effect sizes (3%). Addressing these problems and deficits may enhance future research in this field.

  2. Multi-attribute Evaluation of Website Quality in E-business Using an Integrated Fuzzy AHP-TOPSIS Methodology

    Directory of Open Access Journals (Sweden)

    Tolga Kaya

    2010-09-01

    Full Text Available Success of an e-business company is strongly associated with the relative quality of its website compared to that of its competitors. The purpose of this study is to propose a multi-attribute e-business website quality evaluation methodology based on a modified fuzzy TOPSIS approach. In the proposed methodology, weights of the evaluation criteria are generated by a fuzzy AHP procedure. In performance evaluation problems, the judgments of the experts are often vague in form. As fuzzy logic can successfully deal with this kind of uncertainty in human preferences, both the classical TOPSIS and classical AHP procedures are implemented under a fuzzy environment. The proposed fuzzy AHP-TOPSIS methodology has successfully been applied to a multi-attribute website quality evaluation problem in the Turkish e-business market. Nine sub-criteria under four main categories are used in the evaluation of the most popular e-business websites of Turkey. A sensitivity analysis is also provided.

  3. Social control of the quality of public services: Theory, methodology and results of empirical research

    Directory of Open Access Journals (Sweden)

    Evgeny A. Kapoguzov

    2017-06-01

    Full Text Available The article reveals the theoretical and methodological aspects of the problem of social control in relation to the possibility of its implementation in the production of public services. The interdisciplinary nature of the discourse on social control is presented, the evolution of ideas about it within social science concepts is traced, and its relationship with related categories, in particular "public control" and "civil control", is revealed. The evolution of the category "institutionalization" is also traced, and the lack of unambiguity in its interpretation is shown. The normative value of the institutionalization of social practices in institutional design is presented, in particular with regard to improving the provision of public services. The barriers to the institutionalization of social control over the quality of public services (resource, informational, and institutional) are characterized. The results of a mass survey of consumers of public services, conducted in December 2016 in the Multifunctional Center (MFC) of the city of Omsk, are presented. Unlike other surveys and publications that only assess the level of customer satisfaction and do not give a detailed explanation of consumers' attitudes to ongoing institutional changes, this paper presents an analysis of consumer attitudes and beliefs regarding meaningful attributes of the quality of public services on the one hand, and various institutional alternatives for influencing the quality of public services on the other. According to the results of the mass survey, low readiness for social action was established, owing to high transaction costs, rational ignorance, and the free-rider problem. The possibility of institutionalizing the practice of social action, and of orienting consumers toward the creation of a specialized organization for the protection of consumer rights in the production of public services, is discussed.

  4. Development of a calibration methodology for instruments used to interventional radiology quality control

    International Nuclear Information System (INIS)

    Miranda, Jurema Aparecida de

    2009-01-01

    Interventional radiology is the technique where X-radiation images are used as a tool in the conduction of diagnostic and/or therapeutic procedures. The exposure times are long for both diagnostic and therapeutic procedures, which may cause serious injuries to the patient and also contributes to the dose of the clinical staff. In Brazil there are not yet well-established rules to determine the doses and to perform the dosimetry of fluoroscopic beams. There is great interest in this study in relation to the beam quality, the half-value layer, and other parameters. In this work a Medicor Neo Diagnomax clinical X-ray generator, in fluoroscopy mode, was used to develop a calibration methodology for instruments used in interventional radiology quality control. One plane-parallel PTW ionization chamber was used as a monitor. The ionization chambers recommended for fluoroscopy measurements were evaluated and calibrated against the IPEN Calibration Laboratory reference ionization chamber. The RQR3, RQR5, and RQR7 radiation qualities, as well as the fluoroscopy-specific RQC3, RQC5, and RQC7 qualities, were established following the IEC 61267 standard. All beam characteristics were determined. An ionization chamber positioning system and acrylic phantoms for the determination of entrance and exit doses were developed and constructed. The results show air kerma rates of 4.5×10⁻³, 1.2×10⁻², and 1.9×10⁻² Gy/min for RQC3, RQC5, and RQC7, respectively. Tests with and without collimation just after the monitor chamber were carried out, and the results showed differences of +5.5%, +0.6%, and +0.8%, confirming the importance of the use of collimation in these interventional procedures. (author)

  5. THE ARTHRITIS AND MUSCULOSKELETAL QUALITY IMPROVEMENT PROGRAM (AMQUIP): A BREAKTHROUGH SERIES METHODOLOGY PROJECT

    Directory of Open Access Journals (Sweden)

    MASTURA I

    2008-01-01

    Full Text Available The Australian government funded the National Primary Care Collaborative (NPCC) program with funding of $14.6 million over three years. One of the pilot projects was the Arthritis and Musculoskeletal Quality Improvement Program (AMQuIP). The study aims to optimize general practitioners' (GPs') management of patients with osteoarthritis (OA) of the hip and knee by identifying gaps between their current practice and best practice. The Breakthrough Series Collaborative methodology with several Plan-Do-Study-Act (PDSA) cycles was employed. Participants comprised 12 GPs/practices from two Victorian Divisions of General Practice (one rural, one metropolitan), with 10 patients per GP/practice. GPs/practices attended an orientation and three learning workshops and a videoconference. GPs/practices completed PDSA cycles between workshops and reported results at workshops. GPs/practices reported use of guidelines, changes in patient management, and changes in practice management/systems. All recruited patients completed the SF-12v2 Health Survey and the WOMAC OA Index Questionnaire twice. Follow-up activities, including focus groups and face-to-face interviews, were held six months after the final workshop. All GPs/practices used the guidelines/key messages, introduced "new" management strategies to patients, and made positive changes to their practice management/systems. Patients reported positive changes and outcomes. By using a structured methodology and evidence-based guidelines/key messages, GPs can introduce new patient management strategies, and by identifying gaps in practice management systems, positive changes can be achieved.

  6. Social exclusion in academia through biases in methodological quality evaluation: On the situation of women in science and philosophy.

    Science.gov (United States)

    Leuschner, Anna

    2015-12-01

    Empirical studies show that academia is socially exclusive. I argue that this social exclusion works, at least partly, through the systematic methodological disqualification of contributions from members of underrepresented social groups. As methodological quality criteria are underdetermined, their interpretation and weighting can be biased in relation to gender, race, social background, etc. Such biased quality evaluation can take place on a local or global level. The current situation of women in academic philosophy illuminates this. I conclude that only mechanical solutions can effectively change the situation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist.

    Science.gov (United States)

    Terwee, Caroline B; Mokkink, Lidwine B; Knol, Dirk L; Ostelo, Raymond W J G; Bouter, Lex M; de Vet, Henrica C W

    2012-05-01

    The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist to calculate quality scores per measurement property when using the checklist in systematic reviews of measurement properties. The scoring system was developed based on discussions among experts and testing of the scoring system on 46 articles from a systematic review. Four response options were defined for each COSMIN item (excellent, good, fair, and poor). A quality score per measurement property is obtained by taking the lowest rating of any item in a box ("worst score counts"). Specific criteria for excellent, good, fair, and poor quality for each COSMIN item are described. In defining the criteria, the "worst score counts" algorithm was taken into consideration. This means that only fatal flaws were defined as poor quality. The scores of the 46 articles show how the scoring system can be used to provide an overview of the methodological quality of studies included in a systematic review of measurement properties. Based on experience in testing this scoring system on 46 articles, the COSMIN checklist with the proposed scoring system seems to be a useful tool for assessing the methodological quality of studies included in systematic reviews of measurement properties.
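
    The "worst score counts" algorithm described above is simple enough to state in code; the ratings and items below are illustrative:

```python
# COSMIN-style box scoring: the quality score for a measurement-property box
# is the lowest rating of any item in that box ("worst score counts").
RANK = {"poor": 0, "fair": 1, "good": 2, "excellent": 3}

def box_score(item_ratings):
    """Return the worst of the four-level item ratings in one box."""
    return min(item_ratings, key=RANK.__getitem__)

print(box_score(["excellent", "good", "fair", "good"]))  # fair
```

    A single "poor" item (a fatal flaw) therefore makes the whole box score "poor", regardless of how well the other items were rated.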

  8. Experimental evaluation of rigor mortis. V. Effect of various temperatures on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T

    1981-01-01

    Objective measurements were carried out to study the evolution of rigor mortis in rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increasing temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.

  9. Methodological and Reporting Quality of Comparative Studies Evaluating Health-Related Quality of Life of Colorectal Cancer Patients and Controls: A Systematic Review.

    Science.gov (United States)

    Wong, Carlos K H; Guo, Vivian Y W; Chen, Jing; Lam, Cindy L K

    2016-11-01

    Health-related quality of life is an important outcome measure in patients with colorectal cancer. Comparison with normative data has been increasingly undertaken to assess the additional impact of colorectal cancer on health-related quality of life. This review aimed to critically appraise the methodological details and reporting characteristics of comparative studies evaluating differences in health-related quality of life between patients and controls. A systematic search of English-language literature published between January 1985 and May 2014 was conducted through a database search of PubMed, Web of Science, Embase, and Medline. Comparative studies reporting health-related quality-of-life outcomes among patients who have colorectal cancer and controls were selected. Methodological and reporting quality per comparison study was evaluated based on an 11-item methodological checklist proposed by Efficace in 2003 and a set of criteria predetermined by the reviewers. Thirty-one comparative studies involving >10,000 patients and >10,000 controls were included. Twenty-three studies (74.2%) originated from European countries, with the largest number from the Netherlands (n = 6). Twenty-eight studies (90.3%) compared the health-related quality of life of patients with normative data published elsewhere, whereas the remaining studies recruited a group of patients who had colorectal cancer and a group of control patients within the same studies. The European Organisation for Research and Treatment of Cancer Quality-of-Life Questionnaire Core 30 was the most extensively used instrument (n = 16; 51.6%). Eight studies (25.8%) were classified as "probably robust" for clinical decision making according to the Efficace standard methodological checklist. Our further quality assessment revealed the lack of score differences reported (61.3%), contemporary comparisons (36.7%), statistical significance tested (38.7%), and matching of control group (58.1%), possibly leading to

  10. Comparison of methodological quality rating of systematic reviews on neuropathic pain using AMSTAR and R-AMSTAR.

    Science.gov (United States)

    Dosenovic, Svjetlana; Jelicic Kadic, Antonia; Vucic, Katarina; Markovina, Nikolina; Pieper, Dawid; Puljak, Livia

    2018-05-08

    Systematic reviews (SRs) in the field of neuropathic pain (NeuP) are increasingly important for decision-making. However, methodological flaws in SRs can reduce the validity of conclusions. Hence, it is important to assess the methodological quality of NeuP SRs critically. Additionally, it remains unclear which assessment tool should be used. We studied the methodological quality of SRs published in the field of NeuP and compared two assessment tools. We systematically searched 5 electronic databases to identify SRs of randomized controlled trials of interventions for NeuP available up to March 2015. Two independent reviewers assessed the methodological quality of the studies using the Assessment of Multiple Systematic Reviews (AMSTAR) and the revised AMSTAR (R-AMSTAR) tools. The scores were converted to percentiles and ranked into 4 grades to allow comparison between the two checklists. Gwet's AC1 coefficient was used for interrater reliability assessment. The 97 included SRs had a wide range of methodological quality scores (AMSTAR median (IQR): 6 (5-8) vs. R-AMSTAR median (IQR): 30 (26-35)). The overall agreement score between the 2 raters was 0.62 (95% CI 0.39-0.86) for AMSTAR and 0.62 (95% CI 0.53-0.70) for R-AMSTAR. The 31 Cochrane systematic reviews (CSRs) were consistently ranked higher than the 66 non-Cochrane systematic reviews (NCSRs). The analysis of individual domains showed the best compliance in a comprehensive literature search (item 3) on both checklists. The results for the domain that was the least compliant differed: conflict of interest (item 11) was the item most poorly reported on AMSTAR vs. publication bias assessment (item 10) on R-AMSTAR. A high positive correlation between the total AMSTAR and R-AMSTAR scores for all SRs, as well as for CSRs and NCSRs, was observed. The methodological quality of analyzed SRs in the field of NeuP was not optimal, and CSRs had a higher quality than NCSRs. 
Both AMSTAR and R-AMSTAR tools produced comparable
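    The record above quantifies interrater reliability with Gwet's AC1 coefficient. As an illustrative sketch only (not the authors' code; the function name is ours), a minimal two-rater, binary-item version of AC1 can be written as:

```python
def gwet_ac1(ratings_a, ratings_b):
    """Gwet's AC1 agreement coefficient for two raters on binary (0/1) items.

    Chance agreement is p_e = 2 * pi * (1 - pi), where pi is the mean
    proportion of items classified as 1 across the two raters.
    """
    n = len(ratings_a)
    if n == 0 or n != len(ratings_b):
        raise ValueError("need two equal-length, non-empty rating lists")
    # Observed agreement: fraction of items both raters scored identically.
    p_a = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Average prevalence of category 1 across the two raters.
    pi = (sum(ratings_a) + sum(ratings_b)) / (2 * n)
    p_e = 2 * pi * (1 - pi)
    return (p_a - p_e) / (1 - p_e)
```

    For example, ratings [1, 1, 1, 0] versus [1, 1, 0, 0] agree on 3 of 4 items and yield AC1 ≈ 0.53.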

  11. Quality control methodology for high-throughput protein-protein interaction screening.

    Science.gov (United States)

    Vazquez, Alexei; Rual, Jean-François; Venkatesan, Kavitha

    2011-01-01

    Protein-protein interactions are key to many aspects of the cell, including its cytoskeletal structure, the signaling processes in which it is involved, or its metabolism. Failure to form protein complexes or signaling cascades may sometimes translate into pathologic conditions such as cancer or neurodegenerative diseases. The set of all protein interactions between the proteins encoded by an organism constitutes its protein interaction network, representing a scaffold for biological function. Knowing the protein interaction network of an organism, combined with other sources of biological information, can unravel fundamental biological circuits and may help better understand the molecular basis of human diseases. The protein interaction network of an organism can be mapped by combining data obtained from both low-throughput screens, i.e., "one gene at a time" experiments, and high-throughput screens, i.e., screens designed to interrogate large sets of proteins at once. In either case, quality controls are required to deal with the inherently imperfect nature of experimental assays. In this chapter, we discuss experimental and statistical methodologies to quantify error rates in high-throughput protein-protein interaction screens.
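    One common way to calibrate such a screen is to benchmark its hits against a positive reference set of well-established interactions and a random reference set of pairs assumed not to interact. A hedged sketch of that bookkeeping (illustrative only; names and data are hypothetical, not the chapter's protocol):

```python
def screen_error_rates(detected, positive_ref, random_ref):
    """Estimate quality metrics for a high-throughput interaction screen.

    detected:     set of protein pairs scored positive by the assay
    positive_ref: reference set of well-established true interactions
    random_ref:   random pairs, assumed (mostly) non-interacting

    Returns (sensitivity, false_positive_rate) measured against the
    two benchmark sets.
    """
    if not positive_ref or not random_ref:
        raise ValueError("both reference sets must be non-empty")
    sensitivity = len(detected & positive_ref) / len(positive_ref)
    false_positive_rate = len(detected & random_ref) / len(random_ref)
    return sensitivity, false_positive_rate
```

    In practice the pairs would be stored in a canonical order (e.g. sorted tuples) so that ("A", "B") and ("B", "A") count as the same interaction.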

  12. Evaluation of methodologies for assessing the overall diet: dietary quality scores and dietary pattern analysis.

    Science.gov (United States)

    Ocké, Marga C

    2013-05-01

    This paper aims to describe different approaches for studying the overall diet with advantages and limitations. Studies of the overall diet have emerged because the relationship between dietary intake and health is very complex with all kinds of interactions. These cannot be captured well by studying single dietary components. Three main approaches to study the overall diet can be distinguished. The first method is researcher-defined scores or indices of diet quality. These are usually based on guidelines for a healthy diet or on diets known to be healthy. The second approach, using principal component or cluster analysis, is driven by the underlying dietary data. In principal component analysis, scales are derived based on the underlying relationships between food groups, whereas in cluster analysis, subgroups of the population are created with people that cluster together based on their dietary intake. A third approach includes methods that are driven by a combination of biological pathways and the underlying dietary data. Reduced rank regression defines linear combinations of food intakes that maximally explain nutrient intakes or intermediate markers of disease. Decision tree analysis identifies subgroups of a population whose members share dietary characteristics that influence (intermediate markers of) disease. It is concluded that all approaches have advantages and limitations and essentially answer different questions. The third approach is still more in an exploration phase, but seems to have great potential with complementary value. More insight into the utility of conducting studies on the overall diet can be gained if more attention is given to methodological issues.
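    The second, data-driven approach above derives dietary patterns from the correlations among food groups. A minimal sketch of principal component scoring on a food-group intake matrix (illustrative only; the function name and shapes are ours, not the paper's method):

```python
import numpy as np

def dietary_pattern_scores(intakes, n_components=2):
    """Derive data-driven dietary pattern scores via PCA on food groups.

    intakes: (n_subjects, n_food_groups) array of intake amounts.
    Returns (scores, loadings): per-subject pattern scores and the
    food-group loadings that define each pattern.
    """
    x = np.asarray(intakes, dtype=float)
    # Standardize each food group so patterns are not dominated by
    # groups with large absolute intakes.
    std = x.std(axis=0)
    std[std == 0] = 1.0  # guard against constant columns
    x = (x - x.mean(axis=0)) / std
    # SVD of the standardized matrix yields the principal components.
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    loadings = vt[:n_components]      # (n_components, n_food_groups)
    scores = x @ loadings.T           # (n_subjects, n_components)
    return scores, loadings
```

    Subjects with high scores on a component eat more of the food groups with large positive loadings on it; cluster analysis would instead assign each subject to one discrete dietary subgroup.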

  13. "Rigor mortis" in a live patient.

    Science.gov (United States)

    Chakravarthy, Murali

    2010-03-01

    Rigor mortis is conventionally a postmortem change. Its occurrence suggests that death occurred at least a few hours earlier. The authors report a case of "rigor mortis" in a live patient after cardiac surgery. The factors likely to have predisposed to such premortem muscle stiffening in the reported patient are an intense low cardiac output state, the use of unusually high doses of inotropic and vasopressor agents, and likely sepsis. Such an event may be of importance when determining the time of death in individuals such as the one described in this report. It may also suggest that patients with muscle stiffening require careful examination prior to the declaration of death. This report is being published to point out the controversies that might arise from muscle stiffening, which should not always be termed rigor mortis and/or postmortem.

  14. 40 CFR Appendix D to Part 132 - Great Lakes Water Quality Initiative Methodology for the Development of Wildlife Criteria

    Science.gov (United States)

    2010-07-01

    ... Methodology for the Development of Wildlife Criteria D Appendix D to Part 132 Protection of Environment... Development of Wildlife Criteria Great Lakes States and Tribes shall adopt provisions consistent with (as protective as) this appendix. I. Introduction A. A Great Lakes Water Quality Wildlife Criterion (GLWC) is the...

  15. Review and evaluation of the methodological quality of the existing guidelines and recommendations for inherited neurometabolic disorders

    DEFF Research Database (Denmark)

    Cassis, Linda; Cortès-Saladelafont, Elisenda; Molero-Luis, Marta

    2015-01-01

    and timely treatments are often pivotal for the favorable course of the disease. Thus, the elaboration of new evidence-based recommendations for iNMD diagnosis and management is increasingly requested by health care professionals and patients, even though the methodological quality of existing guidelines...

  16. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist

    NARCIS (Netherlands)

    Terwee, C.B.; Mokkink, L.B.; Knol, D.L.; Ostelo, R.W.J.G.; Bouter, L.M.; de Vet, H.C.W.

    2012-01-01

    Background: The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a

  17. [Needs assessment to improve the applicability and methodological quality of a German S3 guideline].

    Science.gov (United States)

    Burckhardt, Marion; Hoffmann, Cristina; Nink-Grebe, Brigitte; Sänger, Sylvia

    2018-04-01

    Clinical practice guidelines can change the practice in healthcare only if their recommendations are implemented in a comprehensive way. The German S3 guideline "Local Therapy of Chronic Wounds in Patients with Peripheral Vascular Disease, Chronic Venous Insufficiency, and Diabetes" will be updated in 2017. The emphasis here is on the guideline's validity, user-friendliness and implementation into practice. Therefore, the aim was to identify the improvements required in regard to the guideline's methods and content presentation. The methodological approach used was the critical appraisal of the guideline according to established quality criteria and an additional stakeholder survey. Both were conducted between August and November 2016. The guideline and its related documents were reviewed independently by two researchers according to the criteria of the "Appraisal of Guidelines for Research and Evaluation" (AGREE-II). Published reviews and peer reviews by external experts and organisations were also taken into account. For the stakeholder survey, a questionnaire with open questions was distributed by e-mail and via the Internet to health professionals and organisations involved in the care of patients with leg ulcers in Germany. The questions were aimed at amendments and new topics based on the stakeholders' experience in inpatient and outpatient care. In addition, the survey focused on gathering suggestions to improve the applicability of the guideline. Suggested new topics and amendments were summarised thematically. The stakeholders' suggestions to improve the applicability, the results of the critical appraisal and the relevant aspects of the external reviews were then summarised according to the AGREE-II domains and presented in a cause and effect diagram. 17 questionnaires (out of 864 sent out by e-mail) were returned. Due to high practice relevance, the stakeholders suggested an expansion of the inclusion criteria to patients with infected wounds and

  18. An ultramicroscopic study on rigor mortis.

    Science.gov (United States)

    Suzuki, T

    1976-01-01

    Gastrocnemius muscles taken from decapitated mice at various intervals after death, and from mice killed by 2,4-dinitrophenol or mono-iodoacetic acid injection to induce rigor mortis soon after death, were observed by electron microscopy. The prominent appearance of many fine cross striations in the myofibrils (occurring about every 400 Å) was considered to be characteristic of rigor mortis. These striations were caused by minute granules studded along the surfaces of both thick and thin filaments; they appeared to be the bridges connecting the two kinds of filaments, and accounted for the hardness and rigidity of the muscle.

  19. The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System

    Science.gov (United States)

    Mixon, Jason; Stuart, Jerry

    2009-01-01

    In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominant vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…

  20. Quality of methodological reporting of randomized clinical trials of sodium-glucose cotransporter-2 (SGLT2) inhibitors

    Directory of Open Access Journals (Sweden)

    Hadeel Alfahmi

    2017-01-01

    Sodium-glucose cotransporter-2 (SGLT2) inhibitors are a new class of medicines approved recently for the treatment of type 2 diabetes. To improve the quality of randomized clinical trial (RCT) reports, the Consolidated Standards of Reporting Trials (CONSORT) statement for methodological features was created. To achieve our objective in this study, we assessed the quality of methodological reporting of RCTs of SGLT2 inhibitors according to the 2010 CONSORT statement. We reviewed and analyzed the methodology of SGLT2 inhibitor RCTs that were approved by the Food & Drug Administration (FDA). Of the 27 trials, participants, eligibility criteria, and additional analyses were reported in 100% of the trials. In addition, trial design, interventions, and statistical methods were reported in 96.3% of the trials. Outcomes were reported in 93.6% of the trials. Settings were reported in 85.2% of the trials. Blinding and sample size were reported in 66.7% and 59.3% of the trials, respectively. Sequence allocation and the type of randomization were reported in 63% and 74.1% of the trials, respectively. Beyond those, a few methodological items were inadequately reported in the trials. Allocation concealment was inadequate in most of the trials: it was reported in only 11.1% of them. The majority of RCTs showed high adherence to more than half of the methodological items of the 2010 CONSORT statement.

  1. Trends: Rigor Mortis in the Arts.

    Science.gov (United States)

    Blodget, Alden S.

    1991-01-01

    Outlines how past art education provided a refuge for students from the rigors of other academic subjects. Observes that in recent years art education has become "discipline based." Argues that art educators need to reaffirm their commitment to a humanistic way of knowing. (KM)

  2. Photoconductivity of amorphous silicon-rigorous modelling

    International Nuclear Information System (INIS)

    Brada, P.; Schauer, F.

    1991-01-01

    It is our great pleasure to express our gratitude to Prof. Grigorovici, the pioneer of the exciting field of the amorphous state, with this modest contribution to the area. This paper presents an outline of a rigorous modelling program for the steady-state photoconductivity in amorphous silicon and related materials. (Author)

  3. Development of a methodology for the analysis of the crystalline quality of single crystals

    International Nuclear Information System (INIS)

    Metairon, Sabrina

    1999-01-01

    This work aims to establish a methodology for the analysis of the crystalline quality of single crystals. It is shown that, from neutron diffraction tridimensional rocking curves, it is possible to determine the intrinsic widths at half maximum of the crystalline domains of a crystal, as well as the relative intensities of such domains and the angular distances between them. The construction of contour maps, on the basis of the tridimensional curves, makes the determination of the above characteristics easier. For the development of the method, tridimensional rocking curves (I x ω x χ) were obtained with neutrons from a barium lithium fluoride (BaLiF3) and an aluminum crystal. The intensity I was obtained as rocking curves around the ω axis, with the angle χ varying in a convenient interval. The individual (I x ω) and (I x χ) curves, which constitute the tridimensional rocking curve, were fitted by Gaussians and, subsequently, the instrumental broadenings in directions ω and χ were deconvoluted from them. The instrumental broadenings were obtained with perfect-type lithium fluoride (LiF) single crystals in the form of rocking curves around the ω and χ axes. Due to an enhanced Lorentz factor in direction χ, the scale in this direction was 'shrunk' by a correction factor in order to make the widths at half maximum of domains equivalent to those found in direction ω. The contour map constructed with the deconvoluted rocking curves for BaLiF3 showed the existence of a 'proximity effect' that occurs when the widths at half maximum of domains have values near the value of the instrumental broadening. The contour map constructed with the deconvoluted rocking curves for aluminum showed five domains of the mosaic type. Such domains were characterized with respect to width at half maximum, relative intensity, and the distance between them. (author)
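    For Gaussian profiles, the deconvolution of instrumental broadening described in this record reduces to subtraction of widths in quadrature. A minimal sketch under that Gaussian assumption (illustrative only, not the author's code):

```python
import math

def intrinsic_fwhm(measured_fwhm, instrumental_fwhm):
    """Remove instrumental broadening from a measured rocking-curve width.

    Assumes both the intrinsic and instrumental profiles are Gaussian,
    so their full widths at half maximum add in quadrature:
    measured**2 = intrinsic**2 + instrumental**2.
    """
    if measured_fwhm < instrumental_fwhm:
        raise ValueError("measured width cannot be below the instrumental width")
    return math.sqrt(measured_fwhm**2 - instrumental_fwhm**2)
```

    For example, a measured width of 0.50° with an instrumental width of 0.30° gives an intrinsic width of 0.40°. Note that when the two widths are close, as in the 'proximity effect' described above, the result becomes very sensitive to small measurement errors.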

  4. Using Lean Six Sigma Methodology to Improve Quality of the Anesthesia Supply Chain in a Pediatric Hospital.

    Science.gov (United States)

    Roberts, Renée J; Wilson, Ashley E; Quezado, Zenaide

    2017-03-01

    Six Sigma and Lean methodologies are effective quality improvement tools in many health care settings. We applied the DMAIC methodology (define, measure, analyze, improve, control) to address deficiencies in our pediatric anesthesia supply chain. We defined supply chain problems by mapping existing processes and soliciting comments from those involved. We used daily distance walked by anesthesia technicians and number of callouts for missing supplies as measurements that we analyzed before and after implementing improvements (anesthesia cart redesign). We showed improvement in the metrics after those interventions were implemented, and those improvements were sustained and thus controlled 1 year after implementation.

  5. Social cognition interventions for people with schizophrenia: a systematic review focussing on methodological quality and intervention modality.

    Science.gov (United States)

    Grant, Nina; Lawrence, Megan; Preti, Antonio; Wykes, Til; Cella, Matteo

    2017-08-01

    People with a diagnosis of schizophrenia have significant social and functional difficulties. Social cognition has been found to influence these outcomes, and in recent years interventions targeting this domain have been developed. This paper reviews the existing literature on social cognition interventions for people with a diagnosis of schizophrenia, focussing on: i) comparing focussed (i.e. targeting only one social cognitive domain) and global interventions, and ii) the methodological quality of the studies. A systematic search was conducted on PubMed and PsycInfo. Studies were included if they were randomised controlled trials, participants had a diagnosis of schizophrenia or schizoaffective disorder, and the intervention targeted at least one of four social cognition domains (i.e. theory of mind, affect recognition, social perception and attribution bias). All papers were assessed for methodological quality. Information on the intervention, control condition, study methodology and the main findings from each study was extracted and critically summarised. Data from 32 studies fulfilled the inclusion criteria, considering a total of 1440 participants. Taking part in social cognition interventions produced significant improvements in theory of mind and affect recognition compared to both passive and active control conditions. Results were less clear for social perception and attributional bias. Focussed and global interventions had similar results on outcomes. Overall study methodological quality was modest. There was very limited evidence showing that social cognition interventions result in functional outcome improvement. The evidence considered suggests that social cognition interventions may be a valuable approach for people with a diagnosis of schizophrenia. However, evidence quality is limited by measure heterogeneity, modest study methodology and short follow-up periods.
The findings point to a number of recommendations for future research, including measurement standardisation

  6. MODERN CONCEPTS OF THE SIX SIGMA METHODOLOGY FOR IMPROVING THE QUALITY

    Directory of Open Access Journals (Sweden)

    MARGARITA JANESKA

    2018-02-01

    Product quality is generally accepted as being crucial in today's industrial business. The traditional aspects of product quality are connected to product design (translating customer demands into attractive features and technical specifications) and to the design and specification of high-performance production processes with low defect rates. Quality management is the general expression for all actions leading to quality. It is focused on improving customer satisfaction through continuous improvement of processes, including the removal of uncertain activities, and continuous improvement of the quality of processes, products and services. Quality management includes four key processes: quality planning, quality assurance, quality control and quality costs. The main accent in this paper is on quality control and the application of one of the quality control tools in order to improve it. Six Sigma is different from other quality improvement concepts in that its framework comprises many principles, tools and techniques, which, together with experience, are all integrated and translated into best practices. Bearing in mind that the goal of every company is to work efficiently and effectively in the long run, this paper focuses on Six Sigma as a way to continuously improve quality. Namely, this paper emphasizes the key features of the quality of products/services, the need for the application of Six Sigma for quality assurance, and a detailed list of tools and techniques that can be used during the implementation of Six Sigma.

  7. The use of evaluation data to select schools for a survey: challenges to the imagination and methodological rigor

    Directory of Open Access Journals (Sweden)

    Cynthia Paes de Carvalho

    2011-03-01

    The text describes and discusses the selection process of schools for a research project, which used the results of the external evaluations of the Brazilian school system from 2005, 2006 and 2007. Aiming to deepen the study of the organizations and institutional practices that produce schooling success, we decided to develop a survey at eight schools: four public, administrated by the municipality, and four private ones. The research focuses on the pedagogical practices and management processes at those schools of the municipality of Rio de Janeiro that are recognized for the excellence of their teaching, considering the results their students achieve in the public exams that evaluate their performance. Therefore, we analyzed the results of the various schools, turning the selection process itself into a stage of the research. Thereby, we intend to articulate macro- and micro-level information in order to make a relational analysis of the collected data feasible, and thus contribute to the methodological discussion in the field of sociology of education.

  8. Quality of reporting and of methodology of studies on interventions for trophic ulcers in leprosy: A systematic review

    Directory of Open Access Journals (Sweden)

    Forsetlund L

    2008-01-01

    Background: In the process of conducting a systematic review on interventions for skin lesions due to neuritis in leprosy, we assessed several primary papers with respect to the quality of reporting and the methods used in the studies. Awareness of what constitutes weak points in previously conducted studies may be used to improve the planning, conducting and reporting of future clinical trials. Aims: To assess the quality of reporting and of methodology in studies of interventions for skin lesions due to neuritis in leprosy. Methods: Items of importance for preventing selection bias, detection bias, attrition bias and performance bias were among the items assessed. The items for assessing methodological quality were used as a basis for making the checklist to assess the quality of reporting. Results: Out of the 854 references that we inspected, eight studies were included on the basis of the inclusion criteria. The interventions tested were dressings, topical agents and footwear, and in all studies healing of ulcers was the main outcome measure. Reporting of both methods and results suffered from underreporting and disorganization. The most under-reported items were concealment of allocation, blinding of patients and outcome assessors, intention to treat, and validation of outcomes. Conclusion: There is an apparent need to improve the methodological quality as well as the quality of reporting of trials in leprosy ulcer treatment. The most important threat in existing studies is selection bias. For the reporting of future studies, journals could promote and encourage the use of the CONSORT statement checklist by expecting and requiring that authors adhere to it in their reporting.

  9. Accelerating Biomedical Discoveries through Rigor and Transparency.

    Science.gov (United States)

    Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D

    2017-07-01

    Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  10. Methodologic quality of meta-analyses and systematic reviews on the Mediterranean diet and cardiovascular disease outcomes: a review.

    Science.gov (United States)

    Huedo-Medina, Tania B; Garcia, Marissa; Bihuniak, Jessica D; Kenny, Anne; Kerstetter, Jane

    2016-03-01

    Several systematic reviews/meta-analyses published within the past 10 y have examined the associations of Mediterranean-style diets (MedSDs) on cardiovascular disease (CVD) risk. However, these reviews have not been evaluated for satisfying contemporary methodologic quality standards. This study evaluated the quality of recent systematic reviews/meta-analyses on MedSD and CVD risk outcomes by using an established methodologic quality scale. The relation between review quality and impact per publication value of the journal in which the article had been published was also evaluated. To assess compliance with current standards, we applied a modified version of the Assessment of Multiple Systematic Reviews (AMSTARMedSD) quality scale to systematic reviews/meta-analyses retrieved from electronic databases that had met our selection criteria: 1) used systematic or meta-analytic procedures to review the literature, 2) examined MedSD trials, and 3) had MedSD interventions independently or combined with other interventions. Reviews completely satisfied from 8% to 75% of the AMSTARMedSD items (mean ± SD: 31.2% ± 19.4%), with those published in higher-impact journals having greater quality scores. At a minimum, 60% of the 24 reviews did not disclose full search details or apply appropriate statistical methods to combine study findings. Only 5 of the reviews included participant or study characteristics in their analyses, and none evaluated MedSD diet characteristics. These data suggest that current meta-analyses/systematic reviews evaluating the effect of MedSD on CVD risk do not fully comply with contemporary methodologic quality standards. As a result, there are more research questions to answer to enhance our understanding of how MedSD affects CVD risk or how these effects may be modified by the participant or MedSD characteristics. To clarify the associations between MedSD and CVD risk, future meta-analyses and systematic reviews should not only follow methodologic

  11. Methodologies for quality control of soil particle size analysis

    Directory of Open Access Journals (Sweden)

    Vilson Antonio Klein

    2013-05-01

    The quality of particle size analysis is essential for correct soil textural classification and for defining the Brazilian agroclimatic zoning. The objectives were to evaluate methods of particle size analysis and to develop a spreadsheet for use in quality control of the analyses. We collected 50 soil samples with different particle size distributions in the RS and SC states of Brazil, which were analyzed by five laboratories in the two states using two quality control methodologies: one proposed by the Campinas Agronomic Institute (IAC) and the other by the Network of Soil and Plant Tissue Analysis Laboratories of the states of RS and SC (ROLAS). A greater number of disparities was observed for the clay and silt fractions regardless of the method. The ROLAS method had a higher number of mismatches (76% of samples), especially for samples with higher contents of those two fractions. The use of the median value in the ROLAS methodology is more rigorous for quality control of the analyses.

  12. Feasibility study and methodology to create a quality-evaluated database of primary care data

    Directory of Open Access Journals (Sweden)

    Alison Bourke

    2004-11-01

    Conclusions: In the group of practices studied, levels of recording were generally assessed to be of sufficient quality to enable a database of quality-evaluated, anonymised primary care records to be created.

  13. On the quality of global emission inventories. Approaches, methodologies, input data and uncertainties

    International Nuclear Information System (INIS)

    Olivier, J.G.J.

    2002-01-01

    Four key scientific questions will be investigated: (1) How does a user define the 'quality' of a global (or national) emission inventory? (Chapter 2); (2) What determines the quality of a global emission inventory? (Chapters 2 and 7); (3) How can inventory quality be achieved in practice and expressed in quantitative terms ('uncertainty')? (Chapters 3 to 6); and (4) What is the preferred approach for compiling a global emission inventory, given the practical limitations and the desired inventory quality? (Chapters 7 and 8)

  14. A staffing decision support methodology using a quality loss function : a cross-disciplinary quantitative study

    NARCIS (Netherlands)

    Mincsovics, G.Z.

    2009-01-01

    Background: Understanding the quality loss implications of short staffing is essential in maintaining service quality on a limited budget. Objectives: For elaborate financial control on staffing decisions, it is necessary to quantify the cost of the incidental quality loss that a given workload and

  15. Some problems in methodology of economic evaluation of radiation technique quality

    International Nuclear Information System (INIS)

    Kodyukov, V.M.; Purtova, M.I.; Sokolova, Z.I.; Smirnova, Z.M.

    1976-01-01

    The quality of radiation equipment (RE) should essentially be assessed when designing, standardizing, planning, and evaluating the cost and economy of RE. The basic factors are cited upon which subsequent economic assessments of quality levels were based. The paper also discusses the specifics involved in determining the principal quality factors for radioisotopic flaw-detection equipment and gamma-therapeutic instruments

  16. Quality Adjusted Life Years and Trade Off Exercises : exploring methodology and validity

    NARCIS (Netherlands)

    Verschuuren, Marieke

    2006-01-01

    Quality Adjusted Life Years (QALYs) are a popular outcome measure in cost-effectiveness analyses. QALYs are computed by multiplying follow-up or survival by a scaling factor reflecting health related quality of life, and as such capture quantity and quality gains simultaneously. Issues with regard
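
    The computation described above, multiplying each period of follow-up or survival by a utility weight reflecting health-related quality of life, can be sketched in a few lines. The durations and utility weights below are hypothetical illustrations, not data from the thesis.

```python
def qalys(periods):
    """Sum of Quality Adjusted Life Years over a sequence of health states.

    periods: list of (years, utility) tuples, where utility is the
    quality-of-life scaling factor in [0, 1] (1 = full health)."""
    return sum(years * utility for years, utility in periods)

# Two years in full health followed by three years at utility 0.7:
print(qalys([(2.0, 1.0), (3.0, 0.7)]))  # ≈ 4.1 QALYs
```

    This is why QALYs capture quantity and quality gains simultaneously: extending survival and improving the utility weight both raise the same single outcome number.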

  17. Randomized clinical trials in dentistry: Risks of bias, risks of random errors, reporting quality, and methodologic quality over the years 1955-2013.

    Directory of Open Access Journals (Sweden)

    Humam Saltaji

    Full Text Available To examine the risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions and the development of these aspects over time. We included 540 randomized clinical trials from 64 selected systematic reviews. We extracted, in duplicate, details from each of the selected randomized clinical trials with respect to publication and trial characteristics, reporting and methodologic characteristics, and Cochrane risk of bias domains. We analyzed data using logistic regression and Chi-square statistics. Sequence generation was assessed to be inadequate (at unclear or high risk of bias) in 68% (n = 367) of the trials, while allocation concealment was inadequate in the majority of trials (n = 464; 85.9%). Blinding of participants and blinding of the outcome assessment were judged to be inadequate in 28.5% (n = 154) and 40.5% (n = 219) of the trials, respectively. A sample size calculation before the initiation of the study was not performed/reported in 79.1% (n = 427) of the trials, while the sample size was assessed as adequate in only 17.6% (n = 95) of the trials. Two thirds of the trials were not described as double blinded (n = 358; 66.3%), while the method of blinding was appropriate in 53% (n = 286) of the trials. We identified a significant decrease over time (1955-2013) in the proportion of trials assessed as having inadequately addressed methodological quality items (P < 0.05 in 30 out of the 40 quality criteria) or as being inadequate (at high or unclear risk of bias) in five domains of the Cochrane risk of bias tool: sequence generation, allocation concealment, incomplete outcome data, other sources of bias, and overall risk of bias. The risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions have improved over time; however, further efforts that contribute to the development of more stringent

  18. Mobile and Web 2.0 interventions for weight management: an overview of review evidence and its methodological quality.

    Science.gov (United States)

    Bardus, Marco; Smith, Jane R; Samaha, Laya; Abraham, Charles

    2016-08-01

    The use of Internet and related technologies for promoting weight management (WM), physical activity (PA), or dietary-related behaviours has been examined in many articles and systematic reviews. This overview aims to summarize and assess the quality of the review evidence specifically focusing on mobile and Web 2.0 technologies, which are the most utilized, currently available technologies. Following a registered protocol (CRD42014010323), we searched 16 databases for articles published in English until 31 December 2014 discussing the use of either mobile or Web 2.0 technologies to promote WM or related behaviors, i.e. diet and physical activity (PA). Two reviewers independently selected reviews and assessed their methodological quality using the AMSTAR checklist. Citation matrices were used to determine the overlap among reviews. Forty-four eligible reviews were identified, 39 of which evaluated the effects of interventions using mobile or Web 2.0 technologies. Methodological quality was generally low with only 7 reviews (16%) meeting the highest standards. Suggestive evidence exists for positive effects of mobile technologies on weight-related outcomes and, to a lesser extent, PA. Evidence is inconclusive regarding Web 2.0 technologies. Reviews on mobile and Web 2.0 interventions for WM and related behaviors suggest that these technologies can, under certain circumstances, be effective, but conclusions are limited by poor review quality based on a heterogeneous evidence base. © The Author 2016. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  19. Applying the Tropos Methodology for Analysing Web Services Requirements and Reasoning about Qualities of Services

    NARCIS (Netherlands)

    Aiello, Marco; Giorgini, Paolo

    2004-01-01

    The shift in software engineering from the design, implementation and management of isolated software elements towards a network of autonomous interoperable services is calling for a shift in the way software is designed. We propose the use of the agent-oriented methodology Tropos for the analysis of

  20. A Comparison of the Methodological Quality of Articles in Computer Science Education Journals and Conference Proceedings

    Science.gov (United States)

    Randolph, Justus J.; Julnes, George; Bednarik, Roman; Sutinen, Erkki

    2007-01-01

    In this study we empirically investigate the claim that articles published in computer science education journals are more methodologically sound than articles published in computer science education conference proceedings. A random sample of 352 articles was selected from those articles published in major computer science education forums between…

  1. From everyday communicative figurations to rigorous audience news repertoires

    DEFF Research Database (Denmark)

    Kobbernagel, Christian; Schrøder, Kim Christian

    2016-01-01

    In the last couple of decades there has been an unprecedented explosion of news media platforms and formats, as a succession of digital and social media have joined the ranks of legacy media. We live in a ‘hybrid media system’ (Chadwick, 2013), in which people build their cross-media news repertoires from the ensemble of old and new media available. This article presents an innovative mixed-method approach with considerable explanatory power to the exploration of patterns of news media consumption. This approach tailors Q-methodology in the direction of a qualitative study of news consumption, in which a card sorting exercise serves to translate the participants’ news media preferences into a form that enables the researcher to undertake a rigorous factor-analytical construction of their news consumption repertoires. This interpretive, factor-analytical procedure, which results in the building...

  2. Exploration of the methodological quality and clinical usefulness of a cross-sectional sample of published guidance about exercise training and physical activity for the secondary prevention of coronary heart disease.

    Science.gov (United States)

    Abell, Bridget; Glasziou, Paul; Hoffmann, Tammy

    2017-06-13

    guidance types (mean score 33%). While a large number of guidance documents provide recommendations for exercise-based cardiac rehabilitation, most have limitations in either methodological quality or clinical usefulness. The lack of rigorously developed guidelines which also contain necessary detail about exercise training remains a substantial problem for clinicians.

  3. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: this edition contains new material relevant

  4. [Methodological quality of articles on therapeutic procedures published in Cirugía Española. Evaluation of the period 2005-2008].

    Science.gov (United States)

    Manterola, Carlos; Grande, Luís

    2010-04-01

    To determine the methodological quality of therapy articles published in Cirugía Española and to study its association with publication year, centre of origin and subject area. A literature study was conducted which included all therapy articles published between 2005 and 2008. All kinds of clinical designs were considered, excluding editorials, review articles, letters to the editor and experimental studies. Variables analysed included: year of publication, centre of origin, design, and methodological quality of articles. A valid and reliable scale was applied to determine methodological quality. A total of 243 articles [206 case series (84.8%), 27 cohort studies (11.1%), 9 clinical trials (3.7%) and 1 case-control study (0.4%)] were found. Studies came preferentially from Catalonia and Valencia (22.3% and 12.3%, respectively). The thematic areas most frequently found were hepato-bilio-pancreatic and colorectal surgery (20.0% and 16.6%, respectively). The mean and median of the methodological quality score calculated for the entire series were 9.5±4.3 points and 8 points, respectively. Associations between methodological quality and geographical area (p=0.0101), subject area (p=0.0267), and university origin (p=0.0369) were found. A significant increase in methodological quality by publication year was observed (p=0.0004). The methodological quality of therapy articles published in Cirugía Española between 2005 and 2008 is low, but a statistically significant increasing tendency was observed.

  5. Can formal collaborative methodologies improve quality in primary health care in New Zealand? Insights from the EQUIPPED Auckland Collaborative.

    Science.gov (United States)

    Palmer, Celia; Bycroft, Janine; Healey, Kate; Field, Adrian; Ghafel, Mazin

    2012-12-01

    Auckland District Health Board was one of four District Health Boards to trial the Breakthrough Series (BTS) methodology to improve the management of long-term conditions in New Zealand, with support from the Ministry of Health. To improve clinical outcomes, facilitate planned care and promote quality improvement within participating practices in Auckland. Implementation of the Collaborative followed the improvement model / Institute for Healthcare Improvement methodology. Three topic areas were selected: system redesign, cardio-vascular disease/diabetes, and self-management support. An expert advisory group and the Improvement Foundation Australia helped guide project development and implementation. Primary Health Organisation facilitators were trained in the methodology and 15 practice teams participated in the three learning workshops and action periods over 12 months. An independent evaluation study using both quantitative and qualitative methods was conducted. Improvements were recorded in cardiovascular disease risk assessment, practice-level systems of care, self-management systems and follow-up and coordination for patients. Qualitative research found improvements in coordination and teamwork, knowledge of practice populations and understanding of managing long-term conditions. The Collaborative process delivered some real improvements in the systems of care for people with long-term conditions and a change in culture among participating practices. The findings suggest that by strengthening facilitation processes, improving access to comprehensive population audit tools and lengthening the time frame, the process has the potential to make significant improvements in practice. Other organisations should consider this approach when investigating quality improvement programmes.

  6. Development of rigor mortis is not affected by muscle volume.

    Science.gov (United States)

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  7. Maps of sharpness: a methodology to present results of quality control for mammographic system

    International Nuclear Information System (INIS)

    Oliveira, Henrique Jesus Quintino de; Marques, Marcio Alexandre; Frere, Annie France; Schiable, Homero; Marques, Paulo M. Azevedo; Irita, Ricardo Toshiyuki

    1996-01-01

    A new method for evaluating radiologic imaging systems quality is presented. This method intends to relate the numeric results from quality control procedures to the magnitude of shadow and penumbra in the image from given objects. This evaluation is based on a computer simulation and it can be performed for any system and any object placed in any location of the radiation field

  8. Outcomes of Quality Assurance: A Discussion of Knowledge, Methodology and Validity

    Science.gov (United States)

    Stensaker, Bjorn

    2008-01-01

    A common characteristic in many quality assurance schemes around the world is their implicit and often narrowly formulated understanding of how organisational change is to take place as a result of the process. By identifying some of the underlying assumptions related to organisational change in current quality assurance schemes, the aim of this…

  9. Quality benchmarking methodology: Case study of finance and culture industries in Latvia

    Directory of Open Access Journals (Sweden)

    Ieva Zemīte

    2011-01-01

    Full Text Available Political, socio-economic and cultural changes that have taken place in the world during the last years have influenced all spheres. Constant improvements are necessary to survive in competitive and shrinking markets. This sets high quality standards for the service industries, so it is important to compare quality criteria to ascertain which practices achieve superior performance levels. At present, companies in Latvia do not carry out mutual benchmarking, and as a result do not know how they rank against their peers in terms of quality, nor do they see the benefits of sharing information and benchmarking. The purpose of this paper is to determine the criteria of qualitative benchmarking and to investigate the use of benchmarking quality in service industries, particularly the finance and culture sectors in Latvia, in order to determine the key driving factors of quality, to explore internal and foreign benchmarks, and to reveal the full potential of input reduction and efficiency growth for the aforementioned industries. Case study and other tools are used to define the readiness of a company for benchmarking. Certain key factors are examined for their impact on quality criteria. The results are based on research conducted in professional associations in the defined fields (insurance and theatre). Originality/value: this is the first study that adopts benchmarking models for measuring quality criteria and readiness for mutual comparison in the insurance and theatre industries in Latvia.

  10. Application of Six Sigma Using DMAIC Methodology in the Process of Product Quality Control in Metallurgical Operation

    Directory of Open Access Journals (Sweden)

    Girmanová Lenka

    2017-12-01

    Full Text Available Six Sigma DMAIC can be considered a guide for problem solving and product or process improvement, and the majority of companies start to implement Six Sigma using the DMAIC methodology. The paper deals with the application of Six Sigma using the DMAIC methodology in the process of product quality control. The case study is oriented to the field of metallurgical operations. The goal of the Six Sigma project was to ensure the required metallurgical product quality and to avoid an increase in internal costs associated with poor product quality. In this case study, a variety of tools and techniques were used, including a flow chart, histogram, Pareto diagram, analysis of FMEA (Failure Mode and Effect Analysis) data, a cause-and-effect diagram, and logical analysis. The Sigma level improved by approximately 13%. The achieved improvements have helped to reduce the quantity of defective products and the processing costs (technology for re-adjusting). Benefits resulting from the DMAIC implementation can be divided into three levels: qualitative, economic and safety.
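
    The sigma level mentioned in the abstract is conventionally derived from the process's defects per million opportunities (DPMO). A minimal sketch of that standard conversion, not code from the case study, using the customary 1.5-sigma short-term shift:

```python
from statistics import NormalDist

def sigma_level(defects, opportunities):
    """Short-term sigma level from defect counts, via DPMO and the
    inverse normal CDF, with the conventional 1.5-sigma shift."""
    dpmo = defects / opportunities * 1_000_000
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

# The classical table value: 66,807 DPMO corresponds to a 3-sigma process.
print(round(sigma_level(66_807, 1_000_000), 2))  # → 3.0
```

    A DMAIC project's "improvement in sigma level" is then simply this number computed before and after the improvement actions, from the observed defect counts.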

  11. A Draft Conceptual Framework of Relevant Theories to Inform Future Rigorous Research on Student Service-Learning Outcomes

    Science.gov (United States)

    Whitley, Meredith A.

    2014-01-01

    While the quality and quantity of research on service-learning has increased considerably over the past 20 years, researchers as well as governmental and funding agencies have called for more rigor in service-learning research. One key variable in improving rigor is using relevant existing theories to improve the research. The purpose of this…

  12. Quality Control in Screening for Infectious Diseases at Blood Banks. Rationale and Methodology.

    Science.gov (United States)

    Sáez-Alquezar, Amadeo; Albajar-Viñas, Pedro; Guimarães, André Valpassos; Corrêa, José Abol

    2015-11-01

    Quality control procedures are indispensable to ensure the reliability of the results provided by laboratories responsible for serological screening in blood banks. International recommendations on quality management systems classify as a top component the inclusion of two types of control: (a) internal quality control (IQC) and (b) external quality control (EQC). In EQC it is essential to have at least a monthly frequency of laboratory assessment. IQC, on the other hand, involves the daily use of low-reactivity control sera, which should be systematically added to every run carried out in the laboratory for each parameter. IQC analysis may reveal variations in the criteria for run acceptance and rejection, so it is of paramount importance that these criteria are defined in advance and, even more importantly, adhered to; this corresponds to the validation of the analytical runs of each test. Since 2010 this has been, for instance, the experience of the PNCQ*, which develops external quality control programmes on serology for blood banks. These programmes use samples of lyophilized sera well characterized for reactivity in the parameters used for the serological screening of blood donors. The programmes have used blind panels of six samples for monthly assessments. In the last 50 assessments, which involved 68 blood banks in Brazil, a significant number of instances of non-compliance were observed in all monthly assessments. These results provide strong support for the recommendation of systematic monthly assessments. (*) National Quality Control Programme (PNCQ).
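
    The run-validation idea described above, accepting or rejecting each analytical run against predefined criteria on a low-reactivity control serum, can be sketched with Westgard-style limits derived from the control's historical values. The control values and the specific rules shown are illustrative assumptions, not the PNCQ's actual criteria.

```python
from statistics import mean, stdev

def check_run(history, observed):
    """Validate one analytical run from the control serum reading.

    history:  previous readings of the same control serum
    observed: the control reading in the current run
    Applies simplified Westgard-style single-value rules."""
    m, s = mean(history), stdev(history)
    z = (observed - m) / s
    if abs(z) > 3:
        return "reject"    # 1-3s rule: run is invalid
    if abs(z) > 2:
        return "warning"   # 1-2s rule: inspect before accepting
    return "accept"

# Hypothetical historical optical-density ratios for the control serum:
history = [1.02, 0.98, 1.01, 0.99, 1.00, 1.03, 0.97, 1.00]
print(check_run(history, 1.01))  # accept
print(check_run(history, 1.10))  # reject
```

    In practice, laboratories combine several such rules across runs and control levels; the point here is only that acceptance criteria are fixed before the run and applied mechanically.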

  13. Statistics for mathematicians a rigorous first course

    CERN Document Server

    Panaretos, Victor M

    2016-01-01

    This textbook provides a coherent introduction to the main concepts and methods of one-parameter statistical inference. Intended for students of Mathematics taking their first course in Statistics, the focus is on Statistics for Mathematicians rather than on Mathematical Statistics. The goal is not to focus on the mathematical/theoretical aspects of the subject, but rather to provide an introduction to the subject tailored to the mindset and tastes of Mathematics students, who are sometimes turned off by the informal nature of Statistics courses. This book can be used as the basis for an elementary semester-long first course on Statistics with a firm sense of direction that does not sacrifice rigor. The deeper goal of the text is to attract the attention of promising Mathematics students.

  14. Interdepartmental interaction model on the extracurricular activities of students in the city of Surgut in the quality management system of the municipal state institution "Information and Methodological Center"

    OpenAIRE

    Loseva E. A.

    2018-01-01

    In this article the author considers an interdepartmental interaction model in the field of extracurricular activities of students within a quality management system. The topic is examined using the example of the municipal state institution "Information and Methodological Center".

  15. Application of kaizen methodology to foster departmental engagement in quality improvement.

    Science.gov (United States)

    Knechtges, Paul; Decker, Michael Christopher

    2014-12-01

    The Toyota Production System, also known as Lean, is a structured approach to continuous quality improvement that has been developed over the past 50 years to transform the automotive manufacturing process. In recent years, these techniques have been successfully applied to quality and safety improvement in the medical field. One of these techniques is kaizen, which is the Japanese word for "good change." The central tenet of kaizen is the quick analysis of the small, manageable components of a problem and the rapid implementation of a solution with ongoing, real-time reassessment. Kaizen adds a "human element": all stakeholders, not just management, must be involved in such change. Because of the small size of the changes involved in a kaizen event and its inherent focus on human factors and change management, a kaizen event can serve as a good introduction to continuous quality improvement for a radiology department. Copyright © 2014. Published by Elsevier Inc.

  16. [Quality of life - methodology and clinical practice aspects with a focus on ocular medicine].

    Science.gov (United States)

    Franke, G H; Gall, C

    2008-08-01

    Due to the demographic development in western industrialised countries, the proportion of visually impaired persons is likely to increase in the future. Currently there is a shift in scientific recognition from relative neglect of psychopathological distress in the visually impaired to greater attention to disease-related subjective impairments that are detectable with specific questionnaire measures. Visual acuity primarily determines the subjective rating of visual functioning, independent of the eye disease. Ophthalmic patients who show only mild symptoms from a medical point of view often suffer considerably diminished vision-related quality of life with respect to physical, functional, mental, and social aspects. Treatment effects have been shown using vision-related quality-of-life measures for different ophthalmic diseases, particularly cataract surgery. Assessment of vision-related quality of life provides a meaningful complement to objective data.

  17. The Challenge of Timely, Responsive and Rigorous Ethics Review of Disaster Research: Views of Research Ethics Committee Members.

    Directory of Open Access Journals (Sweden)

    Matthew Hunt

    Full Text Available Research conducted following natural disasters such as earthquakes, floods or hurricanes is crucial for improving relief interventions. Such research, however, poses ethical, methodological and logistical challenges for researchers. Oversight of disaster research also poses challenges for research ethics committees (RECs), in part due to the rapid turnaround needed to initiate research after a disaster. Currently, there is limited knowledge available about how RECs respond to and appraise disaster research. To address this knowledge gap, we investigated the experiences of REC members who had reviewed disaster research conducted in low- or middle-income countries. We used interpretive description methodology and conducted in-depth interviews with 15 respondents. Respondents were chairs, members, advisors, or coordinators from 13 RECs, including RECs affiliated with universities, governments, international organizations, a for-profit REC, and an ad hoc committee established during a disaster. Interviews were analyzed inductively using constant comparative techniques. Through this process, three elements were identified as characterizing effective and high-quality review: timeliness, responsiveness and rigorousness. To ensure timeliness, many RECs rely on adaptations of review procedures for urgent protocols. Respondents emphasized that responsive review requires awareness of and sensitivity to the particularities of disaster settings and disaster research. Rigorous review was linked with providing careful assessment of ethical considerations related to the research, as well as ensuring independence of the review process. Both the frequency of disasters and the conduct of disaster research are on the rise. Ensuring effective and high quality review of disaster research is crucial, yet challenges, including time pressures for urgent protocols, exist for achieving this goal. Adapting standard REC procedures may be necessary. However, steps should be

  18. Patient-reported Outcomes in Randomised Controlled Trials of Prostate Cancer: Methodological Quality and Impact on Clinical Decision Making

    Science.gov (United States)

    Efficace, Fabio; Feuerstein, Michael; Fayers, Peter; Cafaro, Valentina; Eastham, James; Pusic, Andrea; Blazeby, Jane

    2014-01-01

    Context: Patient-reported outcomes (PRO) data from randomised controlled trials (RCTs) are increasingly used to inform patient-centred care as well as clinical and health policy decisions. Objective: The main objective of this study was to investigate the methodological quality of PRO assessment in RCTs of prostate cancer (PCa) and to estimate the likely impact of these studies on clinical decision making. Evidence acquisition: A systematic literature search of studies was undertaken on main electronic databases to retrieve articles published between January 2004 and March 2012. RCTs were evaluated on a predetermined extraction form, including (1) basic trial demographics and clinical and PRO characteristics; (2) level of PRO reporting based on the recently published recommendations by the International Society for Quality of Life Research; and (3) bias, assessed using the Cochrane Risk of Bias tool. Studies were systematically analysed to evaluate their relevance for supporting clinical decision making. Evidence synthesis: Sixty-five RCTs enrolling a total of 22 071 patients were evaluated, with 31 (48%) in patients with nonmetastatic disease. When a PRO difference between treatments was found, it related in most cases to symptoms only (n = 29, 58%). Although the extent of missing data was generally documented (72% of RCTs), few reported details on statistical handling of this data (18%) and reasons for dropout (35%). Improvements in key methodological aspects over time were found. Thirteen (20%) RCTs were judged as likely to be robust in informing clinical decision making. Higher-quality PRO studies were generally associated with those RCTs that had higher internal validity. Conclusions: Including PRO in RCTs of PCa patients is critical for better evaluating the treatment effectiveness of new therapeutic approaches. Marked improvements in PRO quality reporting over time were found, and it is estimated that at least one-fifth of PRO RCTs have provided sufficient

  19. New methodology to implement quality control programs in medical imaging departments

    International Nuclear Information System (INIS)

    Furquim, Tania A.C.; Yanikian, Denise; Costa, Paulo R.

    1996-01-01

    The implementation of quality control programmes is studied in order to assure better performance in medical imaging departments. The necessity of continuous training of all the technicians involved is highlighted. The contribution of these professionals is emphasized as fundamental to the success of the project

  20. Quality approach in in vivo nuclear medicine - Certification V2010 - Methodological guide

    International Nuclear Information System (INIS)

    Abdelmoumene, Nafissa; Ferreol, Dominique; Blondet, Emmanuelle; Bonardel, Gerald; Bourrel, Francois; Broglia, Jean Marc; Guilabert, Nadine; Israel, Jean-Marc; Machacek, Catherine; Martineau, Antoine; Remy, Herve; Rousseliere, Francis; Abelmann, Caroline

    2013-01-01

    This document first presents the different components of in-vivo nuclear medicine activity: techniques (functional imaging, vectorised internal radiotherapy, cases outside the nuclear medicine department), team composition and missions, radiation protection regulations, and benefits and risks. It then addresses the quality approach: a quality management system defined according to a process-oriented approach, and documentation. It proposes a sheet to assess the implementation of the quality approach. This sheet contains 129 criteria related to management (strategy, activity steering and coordination), to support functions (management of human resources and abilities, management of radioactive sources and wastes, radiopharmacy within the nuclear medicine department, management of medical devices, information system), to patient care (management of appointments and patient identification, justification of imaging examinations, patient reception, patients presenting risks and particular situations, checks before administering the radiopharmaceutical drug, care for diagnostic purposes, care for therapeutic purposes), and to assessment, analysis and improvement (management of undesirable events associated with care, quality follow-up for continuous improvement)

  1. Quality evaluation of health information system's architectures developed using the HIS-DF methodology.

    Science.gov (United States)

    López, Diego M; Blobel, Bernd; Gonzalez, Carolina

    2010-01-01

    Requirement analysis, design, implementation, evaluation, use, and maintenance of semantically interoperable Health Information Systems (HIS) have to be based on eHealth standards. HIS-DF is a comprehensive approach to HIS architectural development based on standard information models and vocabulary. The empirical validity of HIS-DF has not been demonstrated so far. Through an empirical experiment, the paper demonstrates that, using HIS-DF and HL7 information models, the semantic quality of an HIS architecture can be improved compared to architectures developed using the traditional RUP process. Semantic quality of the architecture has been measured in terms of the model's completeness and validity metrics. The experimental results demonstrated an increase in completeness of 14.38% and an increase in validity of 16.63% when using HIS-DF and HL7 information models in a sample HIS development project. Quality assurance of the system architecture in earlier stages of HIS development suggests an increased quality of the final HIS, implying an indirect impact on patient care.

  2. Interrelationships between man, energy, and water quality: a new methodology for integrative analyses

    International Nuclear Information System (INIS)

    Kaplan, E.; Thode, H.C. Jr.

    1979-01-01

    The STORET/MSP option was used to obtain county-aggregated information on ambient water quality for sixty parameters during the period 1950 to 1978. Masks, extended EXTRACT specifications and bounds on allowable values limited the inclusion of erroneous data. Remark codes were required to aggregate STORET parameters to obtain increased numbers of observations. Numerous statistical analyses led to the conclusions that medians were more useful than means, that trimming on the number of observations was required to eliminate counties with extreme values, and that many parameters required logarithmic transformation to be useful in regional analyses. County-aggregated data for nineteen water quality parameters were examined in terms of their ability to describe qualitative chemical characteristics of water. Anion--cation balances as well as expected relationships between conductivity and other parameters were correctly accounted for. Factor analysis indicated the existence of three principal components describing patterns between metal ions, non-metal ions, and alkalinity-bicarbonate, respectively. These factors were used in place of the original complete set of water quality parameters in a structural equation approach describing relationships between variables of man's activities. It was found that counties with high industrial electric consumption, farming and mineral shipments tended to have increased levels of most water quality parameters. It was also found that simpler path diagrams may be indicated to reduce the effects of redundancy in adequately describing energy--water relationships.
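The preference for medians and logarithmic transformation noted above can be sketched in a few lines; the county readings below are invented for illustration, not STORET data:

```python
import math
import statistics

# Hypothetical conductivity readings (uS/cm) for one county,
# including one gross outlier of the kind that motivates trimming.
readings = [210.0, 195.0, 250.0, 4800.0, 230.0]

mean_raw = statistics.mean(readings)      # dragged upward by the outlier
median_raw = statistics.median(readings)  # robust to it

# A log10 transform compresses the right-skewed tail before
# regional analyses, as recommended for many parameters.
log_readings = [math.log10(x) for x in readings]
median_log = statistics.median(log_readings)

print(round(mean_raw, 1), round(median_raw, 1), round(median_log, 3))
```

With these values the mean is pulled to 1137.0 while the median stays at 230.0, which is the contrast the study reports.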

  3. Control Charts in Healthcare Quality Improvement: A Systematic Review on Adherence to Methodological Criteria

    NARCIS (Netherlands)

    Koetsier, A.; van der Veer, S. N.; Jager, K. J.; Peek, N.; de Keizer, N. F.

    2012-01-01

    Objectives: Use of Shewhart control charts in quality improvement (QI) initiatives is increasing. These charts are typically used in one or more phases of the Plan Do Study Act (PDSA) cycle to monitor summaries of process and outcome data, abstracted from clinical information systems, over time. We

  4. An evaluation of contaminated estuarine sites using sediment quality guidelines and ecological assessment methodologies.

    Science.gov (United States)

    Fulton, M; Key, P; Wirth, E; Leight, A K; Daugomah, J; Bearden, D; Sivertsen, S; Scott, G

    2006-10-01

    Toxic contaminants may enter estuarine ecosystems through a variety of pathways. When sediment contaminant levels become sufficiently high, they may impact resident biota. One approach to predict sediment-associated toxicity in estuarine ecosystems involves the use of sediment quality guidelines (ERMs, ERLs) and site-specific contaminant chemistry while a second approach utilizes site-specific ecological sampling to assess impacts at the population or community level. The goal of this study was to utilize an integrated approach including chemical contaminant analysis, sediment quality guidelines and grass shrimp population monitoring to evaluate the impact of contaminants from industrial sources. Three impacted sites and one reference site were selected for study. Grass shrimp populations were sampled using a push-netting approach. Sediment samples were collected at each site and analyzed for metals, polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs) and pesticides. Contaminant levels were then compared to sediment quality guidelines. In general, grass shrimp population densities at the sites decreased as the ERM quotients increased. Grass shrimp densities were significantly reduced at the impacted site that had an ERM exceedance for chromium and the highest Mean ERM quotient. Regression analysis indicated that sediment chromium concentrations were negatively correlated with grass shrimp density. Grass shrimp size was reduced at two sites with intermediate levels of contamination. These findings support the use of both sediment quality guidelines and site-specific population monitoring to evaluate the impacts of sediment-associated contaminants in estuarine systems.
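The ERM-quotient screening used above can be sketched as follows. The chromium guideline is the commonly cited Long et al. ERM value; the other guideline values and all measured concentrations are invented for illustration:

```python
# Guideline (ERM) and hypothetical measured concentrations, ug/g dry weight.
erm = {"chromium": 370.0, "lead": 218.0, "zinc": 410.0}
measured = {"chromium": 444.0, "lead": 109.0, "zinc": 205.0}

# Each quotient is measured concentration / guideline value.
quotients = {k: measured[k] / erm[k] for k in erm}
mean_erm_q = sum(quotients.values()) / len(quotients)

# An individual quotient > 1 marks an ERM exceedance, as for chromium here.
exceedances = [k for k, q in quotients.items() if q > 1.0]
print(round(mean_erm_q, 2), exceedances)
```

The mean ERM quotient is what the study correlates with grass shrimp density across sites.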

  5. Combining Statistical Methodologies in Water Quality Monitoring in a Hydrological Basin - Space and Time Approaches

    OpenAIRE

    Costa, Marco; A. Manuela Gonçalves

    2012-01-01

    This work discusses statistical approaches that combine multivariate statistical techniques and time series analysis in order to describe and model spatial patterns and temporal evolution, by observing hydrological series of water quality variables recorded in time and space. These approaches are illustrated with a data set collected in the River Ave hydrological basin, located in the Northwest region of Portugal.

  6. Data fusion methodologies for food and beverage authentication and quality assessment – A review

    International Nuclear Information System (INIS)

    Borràs, Eva; Ferré, Joan; Boqué, Ricard; Mestres, Montserrat; Aceña, Laura; Busto, Olga

    2015-01-01

    The ever increasing interest of consumers for safety, authenticity and quality of food commodities has driven the attention towards the analytical techniques used for analyzing these commodities. In recent years, rapid and reliable sensor, spectroscopic and chromatographic techniques have emerged that, together with multivariate and multiway chemometrics, have improved the whole control process by reducing the time of analysis and providing more informative results. In this progression of more and better information, the combination (fusion) of outputs of different instrumental techniques has emerged as a means for increasing the reliability of classification or prediction of foodstuff specifications as compared to using a single analytical technique. Although promising results have been obtained in food and beverage authentication and quality assessment, the combination of data from several techniques is not straightforward and represents an important challenge for chemometricians. This review provides a general overview of data fusion strategies that have been used in the field of food and beverage authentication and quality assessment. - Highlights: • Multivariate data fusion is used in food authentication and quality assessment. • Data fusion approaches and their applications are reviewed. • Data preprocessing, variable selection and feature extraction are considered. • Model selection and validation are also considered.
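One of the simplest fusion strategies such reviews cover is low-level fusion: autoscale each instrumental data block, then concatenate them sample-wise into a single matrix. A minimal sketch, with all block names and values invented:

```python
import statistics

def autoscale(block):
    """Mean-center and unit-variance scale each column of a data block."""
    cols = list(zip(*block))
    scaled_cols = []
    for col in cols:
        mu = statistics.mean(col)
        sd = statistics.pstdev(col)
        scaled_cols.append([(x - mu) / sd for x in col])
    return [list(row) for row in zip(*scaled_cols)]

# Three samples measured by two hypothetical techniques.
spectra = [[0.10, 0.30], [0.20, 0.10], [0.30, 0.20]]  # e.g. spectral features
chroma = [[5.0], [7.0], [9.0]]                        # e.g. a peak area

# Row-wise concatenation of the scaled blocks gives the fused matrix.
fused = [s + c for s, c in zip(autoscale(spectra), autoscale(chroma))]
print(len(fused), len(fused[0]))  # 3 samples x 3 fused variables
```

Autoscaling each block first keeps one technique's larger numeric range from dominating the fused model, which is part of the preprocessing challenge the review discusses.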

  7. Data fusion methodologies for food and beverage authentication and quality assessment – A review

    Energy Technology Data Exchange (ETDEWEB)

    Borràs, Eva [iSens Group, Department of Analytical Chemistry and Organic Chemistry, Universitat Rovira i Virgili, Campus Sescelades, 43007 Tarragona (Spain); Ferré, Joan, E-mail: joan.ferre@urv.cat [Chemometrics, Qualimetrics and Nanosensors Group, Department of Analytical Chemistry and Organic Chemistry, Universitat Rovira i Virgili, Campus Sescelades, 43007 Tarragona (Spain); Boqué, Ricard [Chemometrics, Qualimetrics and Nanosensors Group, Department of Analytical Chemistry and Organic Chemistry, Universitat Rovira i Virgili, Campus Sescelades, 43007 Tarragona (Spain); Mestres, Montserrat; Aceña, Laura; Busto, Olga [iSens Group, Department of Analytical Chemistry and Organic Chemistry, Universitat Rovira i Virgili, Campus Sescelades, 43007 Tarragona (Spain)

    2015-09-03

    The ever increasing interest of consumers for safety, authenticity and quality of food commodities has driven the attention towards the analytical techniques used for analyzing these commodities. In recent years, rapid and reliable sensor, spectroscopic and chromatographic techniques have emerged that, together with multivariate and multiway chemometrics, have improved the whole control process by reducing the time of analysis and providing more informative results. In this progression of more and better information, the combination (fusion) of outputs of different instrumental techniques has emerged as a means for increasing the reliability of classification or prediction of foodstuff specifications as compared to using a single analytical technique. Although promising results have been obtained in food and beverage authentication and quality assessment, the combination of data from several techniques is not straightforward and represents an important challenge for chemometricians. This review provides a general overview of data fusion strategies that have been used in the field of food and beverage authentication and quality assessment. - Highlights: • Multivariate data fusion is used in food authentication and quality assessment. • Data fusion approaches and their applications are reviewed. • Data preprocessing, variable selection and feature extraction are considered. • Model selection and validation are also considered.

  8. Improved Temperature Sounding and Quality Control Methodology Using AIRS/AMSU Data: The AIRS Science Team Version 5 Retrieval Algorithm

    Science.gov (United States)

    Susskind, Joel; Blaisdell, John M.; Iredell, Lena; Keita, Fricky

    2009-01-01

    This paper describes the AIRS Science Team Version 5 retrieval algorithm in terms of its three most significant improvements over the methodology used in the AIRS Science Team Version 4 retrieval algorithm. First, improved physics in Version 5 allows for the use of AIRS clear column radiances in the entire 4.3 micron CO2 absorption band in the retrieval of temperature profiles T(p) during both day and night. Tropospheric sounding 15 micron CO2 observations are now used primarily in the generation of clear column radiances R(sub i) for all channels. This new approach allows for the generation of more accurate values of R(sub i) and T(p) under most cloud conditions. Secondly, Version 5 contains a new methodology to provide accurate case-by-case error estimates for retrieved geophysical parameters and for channel-by-channel clear column radiances. Thresholds of these error estimates are used in a new approach for Quality Control. Finally, Version 5 also contains, for the first time, an approach to provide AIRS soundings in partially cloudy conditions that does not require the use of any microwave data. This new AIRS Only sounding methodology, referred to as AIRS Version 5 AO, was developed as a backup to AIRS Version 5 should the AMSU-A instrument fail. Results are shown comparing the relative performance of AIRS Version 4, Version 5, and Version 5 AO for a single day, January 25, 2003. The Goddard DISC is now generating and distributing products derived using the AIRS Science Team Version 5 retrieval algorithm. This paper also describes the Quality Control flags contained in the DISC AIRS/AMSU retrieval products and their intended use for scientific research purposes.
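The threshold-based Quality Control idea described above can be sketched generically: each retrieval carries a case-by-case error estimate that is compared against an acceptance threshold. The records, field names, and the threshold below are invented, not the operational Version 5 values:

```python
# Hypothetical retrievals, each with a case-by-case temperature error
# estimate in kelvin (invented values for illustration).
retrievals = [
    {"id": "a", "temp_error_k": 0.8},
    {"id": "b", "temp_error_k": 1.9},
    {"id": "c", "temp_error_k": 1.1},
]

THRESHOLD_K = 1.5  # hypothetical acceptance threshold

def qc_flag(rec):
    """Accept a retrieval only if its error estimate is within threshold."""
    return "good" if rec["temp_error_k"] <= THRESHOLD_K else "rejected"

flags = {r["id"]: qc_flag(r) for r in retrievals}
print(flags)
```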

  9. Using Indigenist and Indigenous methodologies to connect to deeper understandings of Aboriginal and Torres Strait Islander peoples' quality of life.

    Science.gov (United States)

    Kite, Elaine; Davy, Carol

    2015-12-01

    The lack of a common description makes measuring the concept of quality of life (QoL) a challenge. Whether QoL incorporates broader social features or is attributed to health conditions, the diverse range of descriptions applied by various disciplines has resulted in a concept that is multidimensional and vague. The variety of theoretical conceptualisations of QoL confounds and confuses even the most astute. Measuring QoL in Aboriginal and Torres Strait Islander populations is even more challenging. Instruments commonly developed and used to measure QoL are often derived from research methodologies shaped by Western cultural perspectives. Often they are simply translated for use among culturally and linguistically diverse Aboriginal and Torres Strait Islander peoples. This has implications for Aboriginal and Torres Strait Islander populations whose perceptions of health are derived from within their specific cultures, value systems and ways of knowing and being. Interconnections and relationships between themselves, their communities, their environment and the natural and spiritual worlds are complex. The way in which their QoL is currently measured indicates that very little attention is given to the diversity of Aboriginal and Torres Strait Islander peoples' beliefs or the ways in which those beliefs shape or give structure and meaning to their health and their lives. The use of Indigenist or Indigenous methodologies in defining what Aboriginal and Torres Strait Islander peoples believe gives quality to their lives is imperative. These methodologies have the potential to increase the congruency between their perceptions of QoL and instruments to measure it.

  10. Modelling the effects of transglutaminase and L-ascorbic acid on substandard quality wheat flour by response surface methodology

    Directory of Open Access Journals (Sweden)

    Šimurina Olivera D.

    2014-01-01

    In the last decade, extreme variations in climatic conditions have been observed which, in combination with inadequate agro-techniques, lead to a decreased quality of mercantile wheat and hence of flour. The application of improvers can optimise the quality of substandard wheat flour. This paper focuses on a systematic analysis of the individual and interaction effects of L-ascorbic acid and transglutaminase as dough-strengthening improvers. The effects were investigated using Response Surface Methodology. Transglutaminase had a much higher linear effect on the rheological and fermentative properties of dough from substandard flour than L-ascorbic acid. Both transglutaminase and L-ascorbic acid additions had a significant linear effect on the increase of bread specific volume. The effects of transglutaminase and ascorbic acid depend on the applied concentrations, and it is necessary to determine the optimal concentrations in order to achieve the maximum quality of the dough and bread. Optimal levels of the tested improvers were determined using appropriate statistical techniques based on the desirability function. It was found that the combination of 30 mg/kg of transglutaminase and 75.8 mg/kg of L-ascorbic acid achieved a positive synergistic effect on the rheological and fermentative properties of wheat dough, as well as on the textural properties and specific volume of bread made from substandard quality flour.
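The desirability-function optimisation referred to above can be sketched with a Derringer-style "larger is better" transform: each response is mapped to [0, 1] and the overall desirability is the geometric mean. The response names, target ranges, and predicted values below are invented for illustration:

```python
import math

def d_larger_is_better(y, low, high):
    """Desirability of a response to be maximised, linear between bounds."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return (y - low) / (high - low)

# Hypothetical predicted responses at one improver combination.
specific_volume = 3.2   # bread specific volume, want high (range 2.5..3.5)
dough_stability = 8.0   # dough stability in minutes, want high (range 5..10)

d1 = d_larger_is_better(specific_volume, 2.5, 3.5)
d2 = d_larger_is_better(dough_stability, 5.0, 10.0)
overall = math.sqrt(d1 * d2)  # geometric mean of the two desirabilities
print(round(d1, 2), round(d2, 2), round(overall, 3))
```

The optimiser then searches the improver concentrations for the combination that maximises this overall desirability.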

  11. Evaluation of chips quality by the analysis of two different harvesting methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Pari, L.; Civitarese, V.; Del Giudice, A. [Council for Research in Agriculture, Agricultural Engineering Research Unit, Rome (Italy)

    2010-07-01

    The Council for Research in Agriculture, Agricultural Engineering Research Unit (CRA-ING) in Rome, Italy has developed an innovative short-rotation forestry (SRF) harvesting system. The system involves a 2 step operation, notably the tree felling and inter-row windrowing performed by a feller windrower equipment; and subsequent chipping performed by a harvester equipped with a pick-up device. The low moisture content of the windrowed trees at harvesting time affects their physical qualities and mechanical strength throughout the chipping operation. The purpose of this study was to analyze the moisture losses of windrowed trees, in relation to the windrow location, on field storage and weather condition. In addition, the study characterized chips quality changes during on field storage by the 2 different harvesting systems. The innovative 2-step system was compared with the traditional 1-step harvesting system.

  12. Methodology for the construction of a physical phantom for quality control of images in digital radiography

    International Nuclear Information System (INIS)

    Santos, Tayline T.; Vieira, Jose Wilson; Oliveira, Alex Cristovao H. de; Lima, Fernando R. de Andrade

    2013-01-01

    The advancement of technology in recent years has led to the production of increasingly sophisticated devices, aiming to acquire medical images of a high technical level and also to facilitate the operational readiness of the equipment. To obtain data and verify the performance of a radiographic system for quality control purposes without exposing patients, and thus to ensure the most accurate diagnosis with the minimum dose, so-called phantoms are used. Phantoms are physical or computational models used to simulate the transport of ionizing radiation and its interactions in the tissues of the human body, and to evaluate the deposition of energy. They are made from materials that behave similarly to human tissues when exposed to ionizing radiation, the so-called tissue-equivalent materials. This paper describes the construction of a physical phantom that allows the execution of the main acceptance tests of the quality control protocols in digital radiography.

  13. Data fusion methodologies for food and beverage authentication and quality assessment - a review.

    Science.gov (United States)

    Borràs, Eva; Ferré, Joan; Boqué, Ricard; Mestres, Montserrat; Aceña, Laura; Busto, Olga

    2015-09-03

    The ever increasing interest of consumers for safety, authenticity and quality of food commodities has driven the attention towards the analytical techniques used for analyzing these commodities. In recent years, rapid and reliable sensor, spectroscopic and chromatographic techniques have emerged that, together with multivariate and multiway chemometrics, have improved the whole control process by reducing the time of analysis and providing more informative results. In this progression of more and better information, the combination (fusion) of outputs of different instrumental techniques has emerged as a means for increasing the reliability of classification or prediction of foodstuff specifications as compared to using a single analytical technique. Although promising results have been obtained in food and beverage authentication and quality assessment, the combination of data from several techniques is not straightforward and represents an important challenge for chemometricians. This review provides a general overview of data fusion strategies that have been used in the field of food and beverage authentication and quality assessment. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Applying Quality Indicators to Single-Case Research Designs Used in Special Education: A Systematic Review

    Science.gov (United States)

    Moeller, Jeremy D.; Dattilo, John; Rusch, Frank

    2015-01-01

    This study examined how specific guidelines and heuristics have been used to identify methodological rigor associated with single-case research designs based on quality indicators developed by Horner et al. Specifically, this article describes how literature reviews have applied Horner et al.'s quality indicators and evidence-based criteria.…

  15. Teaching Mathematical Word Problem Solving: The Quality of Evidence for Strategy Instruction Priming the Problem Structure

    Science.gov (United States)

    Jitendra, Asha K.; Petersen-Brown, Shawna; Lein, Amy E.; Zaslofsky, Anne F.; Kunkel, Amy K.; Jung, Pyung-Gang; Egan, Andrea M.

    2015-01-01

    This study examined the quality of the research base related to strategy instruction priming the underlying mathematical problem structure for students with learning disabilities and those at risk for mathematics difficulties. We evaluated the quality of methodological rigor of 18 group research studies using the criteria proposed by Gersten et…

  16. Quality Appraisal of Single-Subject Experimental Designs: An Overview and Comparison of Different Appraisal Tools

    Science.gov (United States)

    Wendt, Oliver; Miller, Bridget

    2012-01-01

    Critical appraisal of the research literature is an essential step in informing and implementing evidence-based practice. Quality appraisal tools that assess the methodological quality of experimental studies provide a means to identify the most rigorous research suitable for evidence-based decision-making. In single-subject experimental research,…

  17. Association between prospective registration and overall reporting and methodological quality of systematic reviews: a meta-epidemiological study.

    Science.gov (United States)

    Ge, Long; Tian, Jin-Hui; Li, Ya-Nan; Pan, Jia-Xue; Li, Ge; Wei, Dang; Xing, Xin; Pan, Bei; Chen, Yao-Long; Song, Fu-Jian; Yang, Ke-Hu

    2018-01-01

    The aim of this study was to investigate the differences in main characteristics, reporting and methodological quality between prospectively registered and nonregistered systematic reviews. PubMed was searched to identify systematic reviews of randomized controlled trials published in English in 2015. After title and abstract screening, potentially relevant reviews were divided into three groups: registered non-Cochrane reviews, Cochrane reviews, and nonregistered reviews. For each group, random number tables were generated in Microsoft Excel, and the first 50 eligible studies from each group were randomly selected. Data of interest were extracted from the systematic reviews. Regression analyses were conducted to explore the association between total Revised Assessment of Multiple Systematic Reviews (R-AMSTAR) or Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) scores and the selected characteristics of the systematic reviews. The conduct and reporting of the literature search in registered reviews were superior to those in nonregistered reviews. Differences in 9 of the 11 R-AMSTAR items were statistically significant between registered and nonregistered reviews. The total R-AMSTAR score of registered reviews was higher than that of nonregistered reviews [mean difference (MD) = 4.82, 95% confidence interval (CI): 3.70, 5.94]. A sensitivity analysis excluding the registration-related item gave a similar result (MD = 4.34, 95% CI: 3.28, 5.40). Total PRISMA scores of registered reviews were significantly higher than those of nonregistered reviews (all reviews: MD = 1.47, 95% CI: 0.64, 2.30; non-Cochrane reviews: MD = 1.49, 95% CI: 0.56, 2.42). However, the difference in the total PRISMA score was no longer statistically significant after excluding the item related to registration (item 5). Regression analyses showed similar results. Prospective registration may at least indirectly improve the overall methodological quality of systematic reviews, although its impact
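The mean differences with 95% confidence intervals reported above follow the standard two-group comparison under a normal approximation. A sketch with invented scores, not the study's data:

```python
import math
import statistics

# Hypothetical total quality scores for two groups of reviews.
registered = [32, 35, 31, 36, 33, 34]
nonregistered = [28, 30, 27, 29, 31, 26]

def mean_diff_ci(a, b, z=1.96):
    """Mean difference with a normal-approximation 95% CI."""
    md = statistics.mean(a) - statistics.mean(b)
    se = math.sqrt(statistics.variance(a) / len(a)
                   + statistics.variance(b) / len(b))
    return md, (md - z * se, md + z * se)

md, (lo, hi) = mean_diff_ci(registered, nonregistered)
print(round(md, 2), round(lo, 2), round(hi, 2))
```

A CI that excludes zero, as in the R-AMSTAR comparison quoted above, indicates a statistically significant difference at the 5% level.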

  18. Rigorous theory of molecular orientational nonlinear optics

    International Nuclear Information System (INIS)

    Kwak, Chong Hoon; Kim, Gun Yeup

    2015-01-01

    Classical statistical mechanics of the molecular optics theory proposed by Buckingham [A. D. Buckingham and J. A. Pople, Proc. Phys. Soc. A 68, 905 (1955)] has been extended to describe the field induced molecular orientational polarization effects on nonlinear optics. In this paper, we present the generalized molecular orientational nonlinear optical processes (MONLO) through the calculation of the classical orientational averaging using the Boltzmann type time-averaged orientational interaction energy in the randomly oriented molecular system under the influence of applied electric fields. The focal points of the calculation are (1) the derivation of rigorous tensorial components of the effective molecular hyperpolarizabilities, (2) the molecular orientational polarizations and the electronic polarizations including the well-known third-order dc polarization, dc electric field induced Kerr effect (dc Kerr effect), optical Kerr effect (OKE), dc electric field induced second harmonic generation (EFISH), degenerate four wave mixing (DFWM) and third harmonic generation (THG). We also present some of the new predictive MONLO processes. For second-order MONLO, second-order optical rectification (SOR), Pockels effect and difference frequency generation (DFG) are described in terms of the anisotropic coefficients of first hyperpolarizability. And, for third-order MONLO, third-order optical rectification (TOR), dc electric field induced difference frequency generation (EFIDFG) and pump-probe transmission are presented

  19. Palliative healthcare: cost reduction and quality enhancement using end-of-life survey methodology.

    Science.gov (United States)

    Falls, Christopher Edward

    2008-01-01

    American medical institutions throughout the 20th century prescribed high customer satisfaction, but when it came to death, largely ignored it. An accelerated accumulation of esoteric medical information and the application of this knowledge to effect new cures and longer lives instilled an unquestioning reverence for the medical community among the patient population. Diminishing marginal gains in life expectancy, escalating costs related to life-sustaining technologies, and a psychographic shift in the dominant consumer base have challenged this traditional reverence. Armed with unprecedented access to medical information, a more knowledgeable and assertive patient population has emerged in the 21st century to institute its own standards of what constitutes quality health care. In terms of end-of-life care, this has meant recognition that the emotional needs of the dying have been largely underserved by the current American medical model. Patients and their families are no longer willing to accept the traditional medical perspective of death as failure and have numerous international palliative care models that serve as benchmarks of success when it comes to quality of dying. When cure is a possibility, Americans will pursue it at all costs, but when it is not a possibility, they want honest communication and the opportunity to say good-bye to their loved ones. In the context of these emergent needs, life review is offered as a solution. The value proposition targets not only dying patients and their families, but also society as a whole.

  20. Verification of dosimetric methodology for auditing radiotherapy quality under non-reference condition in Hubei province

    International Nuclear Information System (INIS)

    Ma Xinxing; Luo Suming; He Zhijian; Zhou Wenshan

    2014-01-01

    Objective: To verify the reliability of TLD-based quality audits of radiotherapy dosimetry for medical electron accelerators under non-reference conditions, by monitoring the dose variations from electron beams with different field sizes and a 45° wedge, and the dose variations from photon beams with different field sizes and source-skin distances. Methods: Both TLDs and finger ionization chambers were placed at a depth of 10 cm in water to measure the absorbed dose from photon beams, and at the depth of maximum dose for electron beams, under non-reference conditions. TLDs were then mailed to the National Institute for Radiological Protection, China CDC, for further measurement. Results: Among the 70 measuring points for photon beams, 58 points showed results with a relative error of less than ±7.0% (IAEA's acceptable deviation: ±7.0%) between the TLD and finger ionization chamber measurements, a qualified-point percentage of 82.8%. After correction by the Ps value, 62 points were qualified and the percentage rose to 88.6%. All 24 measuring points for electron beams presented a relative error within ±5.0% (IAEA's acceptable deviation: ±5.0%) between the TLD and finger cylindrical ionization chamber measurements. Conclusions: TLD-based quality audit is convenient for determining radiotherapy dosimetric parameters of electron beams under non-reference conditions and can improve the accuracy of the measured parameters in connection with finger chambers. For electron beams of 5 MeV < E_0 < 10 MeV, the absorbed dose parameters measured by finger ionization chambers, combined with a TLD audit, can help obtain precise and reliable results. (authors)
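The pass/fail accounting above (e.g. 58 of 70 photon points within the ±7.0% tolerance, i.e. 82.8%) follows from a simple relative-error criterion, sketched here with invented dose pairs:

```python
def relative_error(tld_dose, chamber_dose):
    """Percent deviation of the TLD reading from the chamber reference."""
    return (tld_dose - chamber_dose) / chamber_dose * 100.0

def pass_rate(pairs, tolerance_pct):
    """Percentage of (TLD, chamber) dose pairs within the tolerance."""
    passed = sum(1 for t, c in pairs
                 if abs(relative_error(t, c)) <= tolerance_pct)
    return passed / len(pairs) * 100.0

# Hypothetical (TLD, chamber) absorbed-dose pairs in gray.
photon_points = [(2.05, 2.00), (1.82, 2.00), (1.98, 2.00), (2.10, 2.00)]
print(round(pass_rate(photon_points, 7.0), 1))  # % of points within +/-7.0%
```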

  1. Prevalence of high blood pressure in Brazilian adolescents and quality of the employed methodological procedures: systematic review

    Directory of Open Access Journals (Sweden)

    Marina Gabriella Pereira de Andrada Magalhães

    2013-12-01

    OBJECTIVE: To review the literature on studies that estimated the prevalence of high blood pressure (HBP) or systemic arterial hypertension (SAH) in Brazilian adolescents, considering the employed methodological procedures. METHODS: Bibliographical research of prevalence studies of HBP/SAH in adolescents from 1995 to 2010. The search was conducted in the electronic databases PubMed/Medline, Lilacs, SciELO, and Isi Adolec. The descriptors "hypertension", "BP", "teen", "students", "cross-sectional", "prevalence" and "Brazil" were used in Portuguese and English. Furthermore, a score ranging from 0 to 18, based on the Recommendations for Blood Pressure Measurement in Humans and Experimental Animals and the VI Brazilian Guidelines of Hypertension, was elaborated in order to analyze the procedures used to measure BP in the studies. RESULTS: Twenty-one articles were identified, mostly published in the last 10 years; 90.5% were school-based and conducted in the Southeast, Northeast and South regions. The prevalence of HBP/SAH ranged from 2.5 to 30.9%. The scores of the studies ranged from 0 to 16. A significant negative correlation (rho = -0.504; p = 0.020) was observed between the prevalence of HBP/SAH and the score for BP measurement quality. CONCLUSION: The great variability of HBP/SAH estimates appears to be influenced by the methodological procedures used in the studies.

  2. Artificial intelligence methodologies applied to quality control of the positioning services offered by the Red Andaluza de Posicionamiento (RAP network

    Directory of Open Access Journals (Sweden)

    Antonio José Gil

    2012-12-01

    Full Text Available On April 26, 2012, Elena Giménez de Ory defend-ed her Ph.D. thesis at University of Jaén, entitled: “Robust methodologies applied to quality control of the positioning services offered by the Red Andaluza de Posicionamiento (RAP network”. Elena Giménez de Ory defended her dissertation in a publicly open presentation held in the Higher Polytechnic School at the University of Jaén, and was able to comment on every question raised by her thesis committee and the audience. The thesis was supervised by her advisor, Prof. Antonio J. Gil Cruz, and the rest of his thesis committee, Prof. Manuel Sánchez de la Orden, Dr. Antonio Miguel Ruiz Armenteros and Dr. Gracia Rodríguez Caderot. The thesis has been read and approved by his thesis committee, receiving the highest rating. All of them were present at the presentation.

  3. Evaluation of the environmental epidemiologic data and methodology for the air quality standard in Beijing

    Science.gov (United States)

    Li, Xu; Jiang, Yanfeng; Yin, Ling; Liu, Bo; Du, Pengfei; Hassan, Mujtaba; Wang, Shigong; Li, Tanshi

    2017-09-01

    To evaluate the relationship between exposure to air pollutants and respiratory emergency room visits, a generalized additive model (GAM) was used to analyze the exposure-effect relationship between air pollutants and respiratory emergency room visits. The results showed that NO2, SO2, and PM10 have positive relationships with respiratory disease. Concentration increases of 10 μg/m3 in NO2, SO2, and PM10 corresponded to 3.90% (95%CI 3.56-4.25), 0.81% (95%CI -0.09-1.72), and 0.64% (95%CI 0.55-0.74) increases in respiratory emergency room visits. In addition, there is a strong synergic effect of PM10 and NO2 on respiratory diseases. The threshold values of the national standard grade II limits used in Beijing should be adjusted. An appropriate standard could effectively promote a significant decline in respiratory room visits and would eventually be beneficial to air quality management in residential areas.
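The percent increases per 10 μg/m3 quoted above come from exponentiating a log-linear (GAM/Poisson-type) coefficient over the concentration increment. A sketch, where the per-μg/m3 coefficient is back-derived from the reported 3.90% NO2 estimate purely for illustration:

```python
import math

def pct_increase_per_increment(beta, increment=10.0):
    """Percent change in expected visit counts for a pollutant increment,
    given a log-linear model coefficient beta (per unit concentration)."""
    return (math.exp(beta * increment) - 1.0) * 100.0

# Hypothetical coefficient recovering the reported 3.90% per 10 ug/m3.
beta_no2 = math.log(1.0390) / 10.0
print(round(pct_increase_per_increment(beta_no2), 2))
```

For small coefficients this is close to the linear approximation 100 × β × 10, which is why modest percent effects scale almost proportionally with concentration.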

  4. A Proposed Methodology to Assess the Quality of Public Use Management in Protected Areas

    Science.gov (United States)

    Muñoz-Santos, Maria; Benayas, Javier

    2012-07-01

    In recent years, the goal of nature preservation has faced, almost worldwide, an increase in the number of visitors who are interested in experiencing the resources, landscapes and stories of protected areas. Spain is a good example of this process. The rapidly increasing numbers of visitors have prompted administrations and managers to develop a broad network of facilities and programs in order to provide these visitors with information, knowledge and recreation. But are we doing it in the best way? This research focuses on developing and applying a new instrument for evaluating the quality of visitor management in parks. Different areas are analyzed with this instrument (78 semi-quantitative indicators): planning and management capacity (planning, funding, human resources), monitoring, reception, information, interpretation, environmental education, training, participation and volunteer programs. Thus, we attempt to gain a general impression of the development of the existing management model, detecting strengths and weaknesses. Although Spain's National Parks constituted the specific context within which to develop the evaluation instrument, its design is intended to provide a valid, robust and flexible method for application to any system, network or set of protected areas in other countries. This paper presents the instrument developed and some results obtained following its application to Spanish National Parks, along with a discussion of its limits and validity.

  5. Methodology for Evaluating the Quality of Ecosystem Maps: A Case Study in the Andes

    Directory of Open Access Journals (Sweden)

    Dolors Armenteras

    2016-08-01

    Full Text Available Uncertainty in thematic maps has been tested mainly in maps with discrete or fuzzy classifications based on spectral data. However, many ecosystem maps in tropical countries consist of discrete polygons containing information on various ecosystem properties such as vegetation cover, soil, climate, geomorphology and biodiversity. The combination of these properties into one class leads to error. We propose a probability-based sampling design with two domains, multiple stages, and stratification, with selection of primary sampling units (PSUs) proportional to the richness of strata present. Validation is undertaken through field visits and fine-resolution remote sensing data. A pilot site in the center of the Colombian Andes was chosen to validate an official government ecosystem map. Twenty primary sampling units (PSUs) of 10 × 15 km were selected, and the number of final sampling units (FSUs) was 76 for the terrestrial domain and 46 for the aquatic domain. At a 95% confidence level, accuracy varied between 51.8% and 64.3% in the terrestrial domain and between 75% and 92% in the aquatic domain. Governments need to account for uncertainty, since they rely on the quality of these maps to make decisions and guide policies.
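    Selecting PSUs with probability proportional to strata richness is a standard probability-proportional-to-size (PPS) draw. A minimal sketch under assumed, illustrative inputs (the unit IDs, richness counts, and with-replacement sampling are hypothetical simplifications of the multi-stage design):

    ```python
    import random

    def select_psus(psu_ids, strata_richness, n, seed=42):
        # Draw n primary sampling units with probability proportional to
        # the number of ecosystem strata present in each unit
        # (PPS sampling, here with replacement for simplicity).
        rng = random.Random(seed)
        return rng.choices(psu_ids, weights=strata_richness, k=n)

    # Toy landscape: unit "C" contains the most strata, so it is drawn
    # most often across the 20 selections.
    sample = select_psus(["A", "B", "C"], [1, 2, 7], n=20)
    print(len(sample))  # -> 20
    ```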

  6. Comparison of tools for assessing the methodological quality of primary and secondary studies in health technology assessment reports in Germany

    Directory of Open Access Journals (Sweden)

    Walter, Ulla

    2010-01-01

    Full Text Available Health care policy background: Findings from scientific studies form the basis for evidence-based health policy decisions. Scientific background: Quality assessments to evaluate the credibility of study results are an essential part of health technology assessment reports and systematic reviews. Quality assessment tools (QAT) for assessing the study quality examine to what extent study results are systematically distorted by confounding or bias (internal validity). The tools can be divided into checklists, scales and component ratings. Research questions: What QAT are available to assess the quality of interventional studies or studies in the field of health economics, how do they differ from each other and what conclusions can be drawn from these results for quality assessments? Methods: A systematic search of relevant databases from 1988 onwards is done, supplemented by screening of the references, of the HTA reports of the German Agency for Health Technology Assessment (DAHTA) and an internet search. The selection of relevant literature, the data extraction and the quality assessment are carried out by two independent reviewers. The substantive elements of the QAT are extracted using a modified criteria list consisting of items and domains specific to randomized trials, observational studies, diagnostic studies, systematic reviews and health economic studies. Based on the number of covered items and domains, more and less comprehensive QAT are distinguished. In order to exchange experiences regarding problems in the practical application of tools, a workshop is hosted. Results: A total of eight systematic methodological reviews are identified, as well as 147 QAT: 15 for systematic reviews, 80 for randomized trials, 30 for observational studies, 17 for diagnostic studies and 22 for health economic studies. The tools vary considerably with regard to the content, the performance and quality of operationalisation. Some tools include not only items of internal validity but also items of quality of reporting.

  7. Comparison of tools for assessing the methodological quality of primary and secondary studies in health technology assessment reports in Germany.

    Science.gov (United States)

    Dreier, Maren; Borutta, Birgit; Stahmeyer, Jona; Krauth, Christian; Walter, Ulla

    2010-06-14

    HEALTH CARE POLICY BACKGROUND: Findings from scientific studies form the basis for evidence-based health policy decisions. Quality assessments to evaluate the credibility of study results are an essential part of health technology assessment reports and systematic reviews. Quality assessment tools (QAT) for assessing the study quality examine to what extent study results are systematically distorted by confounding or bias (internal validity). The tools can be divided into checklists, scales and component ratings. What QAT are available to assess the quality of interventional studies or studies in the field of health economics, how do they differ from each other and what conclusions can be drawn from these results for quality assessments? A systematic search of relevant databases from 1988 onwards is done, supplemented by screening of the references, of the HTA reports of the German Agency for Health Technology Assessment (DAHTA) and an internet search. The selection of relevant literature, the data extraction and the quality assessment are carried out by two independent reviewers. The substantive elements of the QAT are extracted using a modified criteria list consisting of items and domains specific to randomized trials, observational studies, diagnostic studies, systematic reviews and health economic studies. Based on the number of covered items and domains, more and less comprehensive QAT are distinguished. In order to exchange experiences regarding problems in the practical application of tools, a workshop is hosted. A total of eight systematic methodological reviews are identified, as well as 147 QAT: 15 for systematic reviews, 80 for randomized trials, 30 for observational studies, 17 for diagnostic studies and 22 for health economic studies. The tools vary considerably with regard to the content, the performance and quality of operationalisation. Some tools include not only items of internal validity but also items of quality of reporting.

  8. Rigor or mortis: best practices for preclinical research in neuroscience.

    Science.gov (United States)

    Steward, Oswald; Balice-Gordon, Rita

    2014-11-05

    Numerous recent reports document a lack of reproducibility of preclinical studies, raising concerns about potential lack of rigor. Examples of lack of rigor have been extensively documented and proposals for practices to improve rigor are appearing. Here, we discuss some of the details and implications of previously proposed best practices and consider some new ones, focusing on preclinical studies relevant to human neurological and psychiatric disorders. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. [Experimental study of restiffening of the rigor mortis].

    Science.gov (United States)

    Wang, X; Li, M; Liao, Z G; Yi, X F; Peng, X M

    2001-11-01

    To observe changes in sarcomere length in rats during restiffening of rigor mortis, we measured the sarcomere length of the quadriceps in 40 rats under different conditions by scanning electron microscopy. The sarcomere length in undisturbed rigor mortis is obviously shorter than that after restiffening. Sarcomere length is negatively correlated with the intensity of rigor mortis. Measuring sarcomere length can determine the intensity of rigor mortis and provide evidence for estimating the time since death.

  10. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  11. [Methodological quality evaluation of randomized controlled trials for traditional Chinese medicines for treatment of sub-health].

    Science.gov (United States)

    Zhao, Jun; Liao, Xing; Zhao, Hui; Li, Zhi-Geng; Wang, Nan-Yue; Wang, Li-Min

    2016-11-01

    To evaluate the methodological quality of the randomized controlled trials (RCTs) of traditional Chinese medicines for treatment of sub-health, in order to provide a scientific basis for the improvement of clinical trials and systematic reviews, such databases as CNKI, CBM, VIP, Wanfang, EMbase, Medline, Clinical Trials, Web of Science and the Cochrane Library were searched for RCTs of traditional Chinese medicines for treatment of sub-health from database inception to February 29, 2016. The Cochrane Handbook 5.1 was used to screen literature and extract data, and the CONSORT statement and the CONSORT for traditional Chinese medicine statement were adopted as the basis for quality evaluation. Among the 72 RCTs included in this study, 67 (93.05%) trials described inter-group baseline comparability, 39 (54.17%) described unified diagnostic criteria, 28 (38.89%) described unified standards of efficacy, 4 (5.55%) mentioned multi-center study design, 19 (26.38%) disclosed the random allocation method, 6 (8.33%) used random allocation concealment, 15 (20.83%) adopted blinding, 3 (4.17%) reported the sample size estimation in detail, 5 (6.94%) had a sample size of more than two hundred, and 19 (26.38%) reported the numbers of withdrawals, dropouts and cases lost to follow-up, but only 2 trials adopted intention-to-treat (ITT) analysis. Ten (13.89%) trials reported follow-up results, none reported trial registration or a trial protocol, 48 (66.7%) reported all of the expected outcome indicators, 26 (36.11%) reported adverse reactions and adverse events, and 4 (5.56%) reported patient compliance. The overall quality of these randomized controlled trials of traditional Chinese medicines for treatment of sub-health is low, with methodological defects in varying degrees. Therefore, it is still necessary to emphasize the correct application of methodological principles.

  12. Long persistence of rigor mortis at constant low temperature.

    Science.gov (United States)

    Varetto, Lorenzo; Curto, Ombretta

    2005-01-06

    We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting 10 days in all the cadavers we kept under observation, and in one case rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete into partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers that were kept under observation "à outrance" we found that the absolute resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. Therefore, this datum must be considered when a corpse is found in those environmental conditions, so that when estimating the time of death we are not misled by the long persistence of rigor mortis.

  13. Rigorous solution to Bargmann-Wigner equation for integer spin

    CERN Document Server

    Huang Shi Zhong; Wu Ning; Zheng Zhi Peng

    2002-01-01

    A rigorous method is developed to solve the Bargmann-Wigner equation for arbitrary integer spin in coordinate representation in a step-by-step way. The Bargmann-Wigner equation is first transformed to a form easier to solve, the new equations are then solved rigorously in coordinate representation, and the wave functions in a closed form are thus derived.

  14. Using grounded theory as a method for rigorously reviewing literature

    NARCIS (Netherlands)

    Wolfswinkel, J.; Furtmueller-Ettinger, Elfriede; Wilderom, Celeste P.M.

    2013-01-01

    This paper offers guidance on conducting a rigorous literature review. We present this in the form of a five-stage process in which we use Grounded Theory as a method. We first probe the guidelines explicated by Webster and Watson, and then we show the added value of Grounded Theory for rigorously reviewing literature.

  15. The methodological quality of guidelines for hospital-acquired pneumonia and ventilator-associated pneumonia: A systematic review.

    Science.gov (United States)

    Ambaras Khan, R; Aziz, Z

    2018-05-02

    Clinical practice guidelines serve as a framework for physicians to make decisions and to support best practice for optimizing patient care. However, if the guidelines do not address all the important components of optimal care sufficiently, the quality and validity of the guidelines can be reduced. The objectives of this study were to systematically review current guidelines for hospital-acquired pneumonia (HAP) and ventilator-associated pneumonia (VAP), evaluate their methodological quality and highlight the similarities and differences in their recommendations for empirical antibiotic and antibiotic de-escalation strategies. This review is reported in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) statement. Electronic databases including MEDLINE, CINAHL, PubMed and EMBASE were searched up to September 2017 for relevant guidelines. Other databases such as NICE, Scottish Intercollegiate Guidelines Network (SIGN) and the websites of professional societies were also searched for relevant guidelines. The quality and reporting of included guidelines were assessed using the Appraisal of Guidelines for Research and Evaluation II (AGREE-II) instrument. Six guidelines were eligible for inclusion in our review. Among the 6 domains of AGREE-II, "clarity of presentation" scored the highest (80.6%), whereas "applicability" scored the lowest (11.8%). All the guidelines supported the antibiotic de-escalation strategy, whereas the majority of the guidelines (5 of 6) recommended that empirical antibiotic therapy should be implemented in accordance with local microbiological data. All the guidelines suggested that for early-onset HAP/VAP, therapy should start with a narrow-spectrum empirical antibiotic such as penicillins or cephalosporins, whereas for late-onset HAP/VAP, the guidelines recommended the use of broader-spectrum empirical antibiotics such as extended-spectrum penicillins, carbapenems and glycopeptides. 
Expert guidelines

  16. Sugar-Sweetened Beverages and Obesity Risk in Children and Adolescents: A Systematic Analysis on How Methodological Quality May Influence Conclusions.

    Science.gov (United States)

    Bucher Della Torre, Sophie; Keller, Amélie; Laure Depeyre, Jocelyne; Kruseman, Maaike

    2016-04-01

    In the context of a worldwide high prevalence of childhood obesity, the role of sugar-sweetened beverage (SSB) consumption as a cause of excess weight gain remains controversial. Conflicting results may be due to methodological issues in original studies and in reviews. The aim of this review was to systematically analyze the methodology of studies investigating the influence of SSB consumption on risk of overweight and obesity among children and adolescents, and the studies' ability to answer this research question. A systematic review of cohort and experimental studies published until December 2013 in peer-reviewed journals was performed on Medline, CINAHL, Web of Knowledge, and ClinicalTrials.gov. Studies investigating the influence of SSB consumption on risk of overweight and obesity among children and adolescents were included, and methodological quality to answer this question was assessed independently by two investigators using the Academy of Nutrition and Dietetics Quality Criteria Checklist. Among the 32 identified studies, nine had positive quality ratings and 23 studies had at least one major methodological issue. Main methodological issues included SSB definition and inadequate measurement of exposure. Studies with positive quality ratings found an association between SSB consumption and risk of overweight or obesity (n=5) (ie, when SSB consumption increased so did obesity) or mixed results (n=4). Studies with a neutral quality rating found a positive association (n=7), mixed results (n=9), or no association (n=7). The present review shows that the majority of studies with strong methodology indicated a positive association between SSB consumption and risk of overweight or obesity, especially among overweight children. In addition, study findings highlight the need for the careful and precise measurement of the consumption of SSBs and of important confounders. Copyright © 2016 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  17. Electronic symptom reporting between patient and provider for improved health care service quality: a systematic review of randomized controlled trials. part 2: methodological quality and effects.

    Science.gov (United States)

    Johansen, Monika Alise; Berntsen, Gro K Rosvold; Schuster, Tibor; Henriksen, Eva; Horsch, Alexander

    2012-10-03

    We conducted in two parts a systematic review of randomized controlled trials (RCTs) on electronic symptom reporting between patients and providers to improve health care service quality. Part 1 reviewed the typology of patient groups, health service innovations, and research targets. Four innovation categories were identified: consultation support, monitoring with clinician support, self-management with clinician support, and therapy. To assess the methodological quality of the RCTs, and summarize effects and benefits from the methodologically best studies. We searched Medline, EMBASE, PsycINFO, Cochrane Central Register of Controlled Trials, and IEEE Xplore for original studies presented in English-language articles between 1990 and November 2011. Risk of bias and feasibility were judged according to the Cochrane recommendation, and theoretical evidence and preclinical testing were evaluated according to the Framework for Design and Evaluation of Complex Interventions to Improve Health. Three authors assessed the risk of bias and two authors extracted the effect data independently. Disagreements regarding bias assessment, extraction, and interpretation of results were resolved by consensus discussions. Of 642 records identified, we included 32 articles representing 29 studies. No articles fulfilled all quality requirements. All interventions were feasible to implement in a real-life setting, and theoretical evidence was provided for almost all studies. However, preclinical testing was reported in only a third of the articles. We judged three-quarters of the articles to have low risk for random sequence allocation and approximately half of the articles to have low risk for the following biases: allocation concealment, incomplete outcome data, and selective reporting. Slightly more than one fifth of the articles were judged as low risk for blinding of outcome assessment. Only 1 article had low risk of bias for blinding of participants and personnel. 
We excluded 12

  18. Optimization of edible coating formulations for improving postharvest quality and shelf life of pear fruit using response surface methodology.

    Science.gov (United States)

    Nandane, A S; Dave, Rudri K; Rao, T V Ramana

    2017-01-01

    The effect of composite edible films containing soy protein isolate (SPI) in combination with additives such as hydroxypropyl methylcellulose (HPMC) and olive oil on 'Babughosha' pear (Pyrus communis L.) stored at ambient temperature (28 ± 5 °C and 60 ± 10% RH) was evaluated using response surface methodology (RSM). A total of 30 edible coating formulations comprising SPI (2-6%, w/v), olive oil (0.7-1.1%, v/v), HPMC (0.1-0.5%, w/v) and potassium sorbate (0-0.4%, w/v) were evaluated to optimize the most suitable combination. Quality parameters such as weight loss (%), TSS, pH and titratable acidity of the stored pears were selected as response variables for optimization. The optimization procedure was carried out using RSM. It was observed that the response variables were mainly affected by the concentrations of SPI and olive oil in the formulation. An edible coating comprising SPI 5%, HPMC 0.40%, olive oil 1% and potassium sorbate 0.22% was found to be the most suitable combination for pear fruit, with predicted response values of 3.50% weight loss, pH 3.41, TSS 11.13 and 0.513% titratable acidity.
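    RSM optimization of this kind fits a second-order polynomial to the measured responses and locates its stationary point. A minimal single-factor sketch under assumed, illustrative numbers (the SPI levels and the quadratic are hypothetical; the actual study fits all four factors jointly):

    ```python
    import numpy as np

    # Hypothetical single-factor slice of a response surface: weight-loss %
    # versus SPI concentration, generated from a known quadratic so the
    # least-squares fit recovers it exactly.
    spi = np.array([2.0, 3.0, 4.0, 5.0, 6.0])   # % SPI in the coating
    loss = 0.2 * (spi - 5.0) ** 2 + 3.5          # minimum 3.5% loss at SPI = 5%

    b2, b1, b0 = np.polyfit(spi, loss, 2)        # fit y = b2*x^2 + b1*x + b0
    optimum_spi = -b1 / (2.0 * b2)               # vertex of the fitted parabola
    print(round(float(optimum_spi), 2))          # -> 5.0
    ```

    The same vertex calculation generalizes to the multi-factor case, where the stationary point of the fitted second-order model gives the predicted optimum formulation.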

  19. Roselle (Hibiscus sabdariffa L.) and soybean oil effects on quality characteristics of pork patties studied by response surface methodology.

    Science.gov (United States)

    Jung, Eunkyung; Joo, Nami

    2013-07-01

    Response surface methodology was used to investigate the effects and interactions of processing variables such as roselle extract (0.1-1.3%) and soybean oil (5-20%) on the physicochemical, textural and sensory properties of cooked pork patties. It was found that reduction in thickness, pH, L* and b* values decreased, whereas water-holding capacity, reduction in diameter and a* values increased, as the amount of roselle increased. Soybean oil addition increased water-holding capacity, reduction in thickness, and b* values of the patties. The hardness depended on the amounts of roselle and soybean oil added, with negative linear effects for both. The maximum overall quality score (5.42) was observed when 12.5 g of soybean oil and 0.7 g of roselle extract were added. The results of this optimization study would be useful for the meat industry, which tends to increase the product yield for patties using the optimum levels of ingredients by RSM. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. An investigation of laser cutting quality of 22MnB5 ultra high strength steel using response surface methodology

    Science.gov (United States)

    Tahir, Abdul Fattah Mohd; Aqida, Syarifah Nur

    2017-07-01

    In hot press forming, changes in the mechanical properties of boron steel blanks have been a setback in trimming the final shape of components. This paper presents an investigation of the kerf width and heat-affected zone (HAZ) in the cutting of ultra-high strength 22MnB5 steel. Sample cutting was conducted using a 4 kW carbon dioxide (CO2) laser machine at a wavelength of 10.6 μm with a laser spot size of 0.2 mm. A response surface methodology (RSM) using a three-level Box-Behnken design of experiment was developed with three factors: peak power, cutting speed and duty cycle. The parameters were optimised for minimum kerf width and HAZ formation. Optical evaluation using a MITUTOYO TM 505 was conducted to measure the kerf width and HAZ region. From the findings, laser duty cycle was the most influential factor in determining the cutting quality of ultra-high strength steel, followed by cutting speed and laser power. Meanwhile, low power intensity with a continuous wave produced the narrowest kerf width and the smallest HAZ region.
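    A three-level, three-factor Box-Behnken design has a fixed structure in coded units: the midpoints of the cube edges for each pair of factors, plus replicated center points. A minimal sketch of generating that run matrix (the run count of 15 assumes the common choice of three center points, which the abstract does not specify):

    ```python
    from itertools import combinations, product

    def box_behnken_3(center_points=3):
        # Three-factor Box-Behnken design in coded units (-1, 0, +1):
        # for each pair of factors, all four (+/-1, +/-1) combinations with
        # the third factor held at 0, plus replicated center runs.
        runs = []
        for i, j in combinations(range(3), 2):
            for a, b in product((-1, 1), repeat=2):
                run = [0, 0, 0]
                run[i], run[j] = a, b
                runs.append(run)
        runs.extend([[0, 0, 0]] * center_points)
        return runs

    # Columns map to the study's coded factors, e.g. peak power,
    # cutting speed, duty cycle.
    design = box_behnken_3()
    print(len(design))  # -> 15 runs (12 edge midpoints + 3 center points)
    ```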

  1. Methodological quality of diagnostic accuracy studies on non-invasive coronary CT angiography: influence of QUADAS (Quality Assessment of Diagnostic Accuracy Studies included in systematic reviews) items on sensitivity and specificity

    Energy Technology Data Exchange (ETDEWEB)

    Schueler, Sabine; Walther, Stefan; Schuetz, Georg M. [Humboldt-Universitaet zu Berlin, Freie Universitaet Berlin, Charite Medical School, Department of Radiology, Berlin (Germany); Schlattmann, Peter [University Hospital of Friedrich Schiller University Jena, Department of Medical Statistics, Informatics, and Documentation, Jena (Germany); Dewey, Marc [Humboldt-Universitaet zu Berlin, Freie Universitaet Berlin, Charite Medical School, Department of Radiology, Berlin (Germany); Charite, Institut fuer Radiologie, Berlin (Germany)

    2013-06-15

    To evaluate the methodological quality of diagnostic accuracy studies on coronary computed tomography (CT) angiography using the QUADAS (Quality Assessment of Diagnostic Accuracy Studies included in systematic reviews) tool. Each QUADAS item was individually defined to adapt it to the special requirements of studies on coronary CT angiography. Two independent investigators analysed 118 studies using 12 QUADAS items. Meta-regression and pooled analyses were performed to identify possible effects of methodological quality items on estimates of diagnostic accuracy. The overall methodological quality of coronary CT studies was merely moderate. They fulfilled a median of 7.5 out of 12 items. Only 9 of the 118 studies fulfilled more than 75 % of possible QUADAS items. One QUADAS item ("Uninterpretable Results") showed a significant influence (P = 0.02) on estimates of diagnostic accuracy, with "no fulfilment" increasing specificity from 86 to 90 %. Furthermore, pooled analysis revealed that each QUADAS item that is not fulfilled has the potential to change estimates of diagnostic accuracy. The methodological quality of studies investigating the diagnostic accuracy of non-invasive coronary CT is only moderate and was found to affect the sensitivity and specificity. An improvement is highly desirable because good methodology is crucial for adequately assessing imaging technologies. (orig.)

  2. Methodological quality of diagnostic accuracy studies on non-invasive coronary CT angiography: influence of QUADAS (Quality Assessment of Diagnostic Accuracy Studies included in systematic reviews) items on sensitivity and specificity

    International Nuclear Information System (INIS)

    Schueler, Sabine; Walther, Stefan; Schuetz, Georg M.; Schlattmann, Peter; Dewey, Marc

    2013-01-01

    To evaluate the methodological quality of diagnostic accuracy studies on coronary computed tomography (CT) angiography using the QUADAS (Quality Assessment of Diagnostic Accuracy Studies included in systematic reviews) tool. Each QUADAS item was individually defined to adapt it to the special requirements of studies on coronary CT angiography. Two independent investigators analysed 118 studies using 12 QUADAS items. Meta-regression and pooled analyses were performed to identify possible effects of methodological quality items on estimates of diagnostic accuracy. The overall methodological quality of coronary CT studies was merely moderate. They fulfilled a median of 7.5 out of 12 items. Only 9 of the 118 studies fulfilled more than 75 % of possible QUADAS items. One QUADAS item ("Uninterpretable Results") showed a significant influence (P = 0.02) on estimates of diagnostic accuracy, with "no fulfilment" increasing specificity from 86 to 90 %. Furthermore, pooled analysis revealed that each QUADAS item that is not fulfilled has the potential to change estimates of diagnostic accuracy. The methodological quality of studies investigating the diagnostic accuracy of non-invasive coronary CT is only moderate and was found to affect the sensitivity and specificity. An improvement is highly desirable because good methodology is crucial for adequately assessing imaging technologies. (orig.)

  3. Experimental evaluation of rigor mortis. VI. Effect of various causes of death on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R

    1983-07-01

    The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to verify the eventual presence of factors that could play a role in the modification of its development.

  4. Experimental evaluation of rigor mortis. VII. Effect of ante- and post-mortem electrocution on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C

    1988-01-01

    The influence of electrocution on the evolution of rigor mortis was studied on rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, a complete rigor develops already 1 h post-mortem (p.m.) compared to 5 h p.m. for the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In case of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower as compared with the ante-mortem electrocution cases. The results are completed by two practical observations on human electrocution cases.

  5. Investigating the association between medication adherence and health-related quality of life in COPD : Methodological challenges when using a proxy measure of adherence

    NARCIS (Netherlands)

    Boland, Melinde R. S.; van Boven, Job F. M.; Kruis, Annemarije L.; Chavannes, Niels H.; van der Molen, Thys; Goossens, Lucas M. A.; Rutten-van Molken, Maureen P. M. H.

    Background: The association between non-adherence to medication and health-related quality-of-life (HRQoL) in Chronic Obstructive Pulmonary Disease (COPD) remains poorly understood. Different ways to deal with methodological challenges to estimate this association have probably contributed to

  6. Implementing Quality Criteria in Designing and Conducting a Sequential Quan → Qual Mixed Methods Study of Student Engagement with Learning Applied Research Methods Online

    Science.gov (United States)

    Ivankova, Nataliya V.

    2014-01-01

    In spite of recent methodological developments related to quality assurance in mixed methods research, practical examples of how to implement quality criteria in designing and conducting sequential QUAN → QUAL mixed methods studies to ensure the process is systematic and rigorous remain scarce. This article discusses a three-step…

  7. The quality of reporting methods and results of cost-effectiveness analyses in Spain: a methodological systematic review.

    Science.gov (United States)

    Catalá-López, Ferrán; Ridao, Manuel; Alonso-Arroyo, Adolfo; García-Altés, Anna; Cameron, Chris; González-Bermejo, Diana; Aleixandre-Benavent, Rafael; Bernal-Delgado, Enrique; Peiró, Salvador; Tabarés-Seisdedos, Rafael; Hutton, Brian

    2016-01-07

    Cost-effectiveness analysis has been recognized as an important tool to determine the efficiency of healthcare interventions and services. There is a need for evaluating the reporting of methods and results of cost-effectiveness analyses and establishing their validity. We describe and examine reporting characteristics of methods and results of cost-effectiveness analyses conducted in Spain over more than two decades. A methodological systematic review was conducted with the information obtained through an updated literature review in PubMed and complementary databases (e.g. Scopus, ISI Web of Science, the National Health Service Economic Evaluation Database (NHS EED) and Health Technology Assessment (HTA) databases from the Centre for Reviews and Dissemination (CRD), Índice Médico Español (IME), Índice Bibliográfico Español en Ciencias de la Salud (IBECS)). We identified cost-effectiveness analyses conducted in Spain that used quality-adjusted life years (QALYs) as outcome measures (period 1989-December 2014). Two reviewers independently extracted the data from each paper. The data were analysed descriptively. In total, 223 studies were included. Very few studies (10; 4.5 %) reported working from a protocol. Most studies (200; 89.7 %) were simulation models and included a median of 1000 patients. Only 105 (47.1 %) studies presented an adequate description of the characteristics of the target population. Most study interventions were categorized as therapeutic (189; 84.8 %) and nearly half (111; 49.8 %) considered an active alternative as the comparator. Effectiveness data were derived from a single study in 87 (39.0 %) reports, and only a few (40; 17.9 %) used evidence synthesis-based estimates. Few studies (42; 18.8 %) reported a full description of methods for QALY calculation. The majority of the studies (147; 65.9 %) reported that the study intervention produced "more costs and more QALYs" than the comparator. Most studies (200; 89.7 %) reported favourable

  8. An algorithm to assess methodological quality of nutrition and mortality cross-sectional surveys: development and application to surveys conducted in Darfur, Sudan.

    Science.gov (United States)

    Prudhon, Claudine; de Radiguès, Xavier; Dale, Nancy; Checchi, Francesco

    2011-11-09

    Nutrition and mortality surveys are the main tools whereby evidence on the health status of populations affected by disasters and armed conflict is quantified and monitored over time. Several reviews have consistently revealed a lack of rigor in many surveys. We describe an algorithm for analyzing nutritional and mortality survey reports to identify a comprehensive range of errors that may result in sampling, response, or measurement biases and score quality. We apply the algorithm to surveys conducted in Darfur, Sudan. We developed an algorithm based on internationally agreed upon methods and best practices. Penalties are attributed for a list of errors, and an overall score is built from the summation of penalties accrued by the survey as a whole. To test the algorithm reproducibility, it was independently applied by three raters on 30 randomly selected survey reports. The algorithm was further applied to more than 100 surveys conducted in Darfur, Sudan. The Intra Class Correlation coefficient was 0.79 for mortality surveys and 0.78 for nutrition surveys. The overall median quality score and range of about 100 surveys conducted in Darfur were 0.60 (0.12-0.93) and 0.675 (0.23-0.86) for mortality and nutrition surveys, respectively. They varied between the organizations conducting the surveys, with no major trend over time. Our study suggests that it is possible to systematically assess quality of surveys and reveals considerable problems with the quality of nutritional and particularly mortality surveys conducted in the Darfur crisis.
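
    The penalty-and-summation scoring described above can be sketched in a few lines. The error names and penalty weights below are invented placeholders, not the published algorithm's list:

```python
# Illustrative penalty table -- the real error list and weights are
# defined in the paper, not here.
PENALTIES = {
    "sampling_frame_not_described": 0.10,
    "sample_size_not_justified": 0.10,
    "non_response_not_reported": 0.05,
    "measurement_method_unclear": 0.15,
}

def quality_score(errors_found):
    """0-1 quality score: 1.0 minus the summed penalties, floored at 0."""
    penalty = sum(PENALTIES[e] for e in errors_found)
    return max(0.0, 1.0 - penalty)

score = quality_score(["sampling_frame_not_described",
                       "non_response_not_reported"])
print(round(score, 2))  # → 0.85
```

A report with no detected errors scores 1.0; each additional error pulls the overall score toward 0, which matches the 0-1 range of the median scores reported for the Darfur surveys.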

  9. A Framework for Rigorously Identifying Research Gaps in Qualitative Literature Reviews

    DEFF Research Database (Denmark)

    Müller-Bloch, Christoph; Kranz, Johann

    2015-01-01

    Identifying research gaps is a fundamental goal of literature reviewing. While it is widely acknowledged that literature reviews should identify research gaps, there are no methodological guidelines for how to identify research gaps in qualitative literature reviews that ensure rigor and replicability. Our study addresses this gap and proposes a framework that should help scholars in this endeavor without stifling creativity. To develop the framework we thoroughly analyze the state-of-the-art procedure of identifying research gaps in 40 recent literature reviews using a grounded theory approach. Based on the data, we subsequently derive a framework for identifying research gaps in qualitative literature reviews and demonstrate its application with an example. Our results provide a modus operandi for identifying research gaps, thus enabling scholars to conduct literature reviews more rigorously.

  10. Customer satisfaction surveys: Methodological recommendations for financial service providers

    Directory of Open Access Journals (Sweden)

    Đorđić Marko

    2010-01-01

    This methodological article investigates practical challenges that emerge when conducting customer satisfaction surveys (CSS) for financial service providers such as banks, insurance or leasing companies. It presents methodological recommendations with reference to: (a) survey design, (b) sampling, (c) survey method, (d) questionnaire design, and (e) data acquisition. The article explains how the use of a two-stage survey design, the SRS method, large samples, and rigorous fieldwork preparation can enhance the overall quality of CSS in financial services. The proposed methodological recommendations apply primarily to primary quantitative marketing research in retail financial services; however, most of them can also be successfully applied in corporate financial services.

  11. [Evaluation of the methodological quality of the Rémic (microbiology guidelines - bacteriology and mycology) of the Société française de microbiologie].

    Science.gov (United States)

    Fonfrède, Michèle; Couaillac, Jean Paul; Augereau, Christine; De Moüy, Danny; Lepargneur, Jean Pierre; Szymanowicz, Anton; Watine, Joseph

    2011-01-01

    We have evaluated the methodological quality of the Rémic (microbiology guidelines - bacteriology and mycology) of the Société française de microbiologie (2007 edition), using the AGREE criteria, which are consensual at an international level, in particular at the World Health Organisation (WHO) and the European Union. The methodological quality of the Rémic appears to be sub-optimal. These shortcomings in quality are mainly observed in AGREE domain n° 5 (applicability), in AGREE item n° 5 (patients' opinions were not considered), and in AGREE item n° 23 (conflicts of interest were not declared). Users of the Rémic should be aware of these methodological shortcomings and exercise caution before putting its recommendations into practice. In conclusion, we advise the editors of the Rémic to include at least a methodological chapter in their next edition.

  12. Recent Development in Rigorous Computational Methods in Dynamical Systems

    OpenAIRE

    Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł

    2009-01-01

    We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...

  13. Model development for mechanical properties and weld quality class of friction stir welding using multi-objective Taguchi method and response surface methodology

    International Nuclear Information System (INIS)

    Mohamed, Mohamed Ackiel; Manurung, Yupiter HP; Berhan, Mohamed Nor

    2015-01-01

    This study presents the effect of the governing parameters in friction stir welding (FSW) on the mechanical properties and weld quality of a 6 mm thick 6061 T651 aluminum alloy butt joint. The main FSW parameters, rotational and traverse speed, were optimized with respect to multiple mechanical properties and quality features, focusing on tensile strength, hardness and the weld quality class, using the multi-objective Taguchi method (MTM). The multi signal-to-noise ratio (MSNR) was employed to determine the optimum welding parameters for MTM, while the significance of each parameter was determined via the well-established analysis of variance (ANOVA). Furthermore, a first-order model for predicting the mechanical properties and weld quality class was derived by applying response surface methodology (RSM). Based on the experimental confirmation test, the proposed method can effectively estimate the mechanical properties and weld quality class, and can be used to enhance welding performance in FSW and other applications.

  14. Model development for mechanical properties and weld quality class of friction stir welding using multi-objective Taguchi method and response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, Mohamed Ackiel [University Kuala Lumpur Malaysia France Institute, Bandar Baru Bangi (Malaysia); Manurung, Yupiter HP; Berhan, Mohamed Nor [Universiti Teknologi MARA, Shah Alam (Malaysia)

    2015-06-15

    This study presents the effect of the governing parameters in friction stir welding (FSW) on the mechanical properties and weld quality of a 6 mm thick 6061 T651 aluminum alloy butt joint. The main FSW parameters, rotational and traverse speed, were optimized with respect to multiple mechanical properties and quality features, focusing on tensile strength, hardness and the weld quality class, using the multi-objective Taguchi method (MTM). The multi signal-to-noise ratio (MSNR) was employed to determine the optimum welding parameters for MTM, while the significance of each parameter was determined via the well-established analysis of variance (ANOVA). Furthermore, a first-order model for predicting the mechanical properties and weld quality class was derived by applying response surface methodology (RSM). Based on the experimental confirmation test, the proposed method can effectively estimate the mechanical properties and weld quality class, and can be used to enhance welding performance in FSW and other applications.
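
    The larger-the-better Taguchi S/N ratio and the weighted normalization behind a multi S/N ratio can be sketched as follows. The run data, response names and weights below are illustrative assumptions, not the study's values, and the min-max normalization is one common MSNR formulation rather than necessarily the authors' exact one:

```python
import math

def sn_larger_is_better(replicates):
    """Taguchi larger-the-better S/N ratio: -10*log10(mean(1/y^2))."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in replicates) / len(replicates))

def multi_sn(runs, weights):
    """Combine per-response S/N ratios into one score per run via
    min-max normalization across runs followed by weighting."""
    responses = list(weights)
    sn = [{r: sn_larger_is_better(run[r]) for r in responses} for run in runs]
    lo = {r: min(row[r] for row in sn) for r in responses}
    hi = {r: max(row[r] for row in sn) for r in responses}
    return [sum(weights[r] * (row[r] - lo[r]) / (hi[r] - lo[r])
                for r in responses) for row in sn]

# hypothetical parameter combinations: tensile strength (MPa) and hardness (HV)
runs = [
    {"tensile": [185, 190], "hardness": [62, 64]},
    {"tensile": [210, 214], "hardness": [70, 71]},
    {"tensile": [172, 169], "hardness": [58, 57]},
]
scores = multi_sn(runs, {"tensile": 0.6, "hardness": 0.4})
best = max(range(len(runs)), key=scores.__getitem__)  # run 1 dominates both responses
```

Because run 1 has the highest S/N for both responses, its normalized score is 1.0 and it is selected as the optimum parameter setting.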

  15. Measuring quality of care: considering conceptual approaches to quality indicator development and evaluation.

    Science.gov (United States)

    Stelfox, Henry T; Straus, Sharon E

    2013-12-01

    In this article, we describe one approach for developing and evaluating quality indicators. We focus on describing different conceptual approaches to quality indicator development, review one approach for developing quality indicators, outline how to evaluate quality indicators once developed, and discuss quality indicator maintenance. The key steps for developing quality indicators include specifying a clear goal for the indicators; using methodologies to incorporate evidence, expertise, and patient perspectives; and considering contextual factors and logistics of implementation. The Strategic Framework Board and the National Quality Measure Clearinghouse have developed criteria for evaluating quality indicators that complement traditional psychometric evaluations. Optimal strategies for quality indicator maintenance and dissemination have not been determined, but experiences with clinical guideline maintenance may be informative. For quality indicators to effectively guide quality improvement efforts, they must be developed, evaluated, maintained, and implemented using rigorous evidence-informed practices. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. [Qualitative research methodology in health care].

    Science.gov (United States)

    Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara

    2017-03-01

    Health care research requires different methodological approaches, such as qualitative and quantitative analyses, to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are that the object of study is constituted by perceptions, emotions and beliefs; purposive (non-random) sampling; a circular process of knowledge construction; and methodological rigor throughout the research process, from the quality of the design to the consistency of the results. The objective of this work is to contribute to the methodological knowledge about qualitative research in health services, based on the implementation of the study, “The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals”. The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspective of users and the health team. This study allowed the design of a transition services model from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.

  17. Tenderness of pre- and post rigor lamb longissimus muscle.

    Science.gov (United States)

    Geesink, Geert; Sujang, Sadi; Koohmaraie, Mohammad

    2011-08-01

    Lamb longissimus muscle (n=6) sections were cooked at different times post mortem (prerigor, at rigor, 1 day p.m., and 7 days p.m.) using two cooking methods. Using a boiling water bath, samples were either cooked to a core temperature of 70 °C or boiled for 3 h. The latter method was meant to reflect the traditional cooking method employed in countries where preparation of prerigor meat is practiced. The time postmortem at which the meat was prepared had a large effect on the tenderness (shear force) of the meat. Cooking prerigor and at-rigor meat to 70 °C resulted in higher shear force values than their post rigor counterparts at 1 and 7 days p.m. (9.4 and 9.6 vs. 7.2 and 3.7 kg, respectively). The differences in tenderness between the treatment groups could be largely explained by a difference in contraction status of the meat after cooking and the effect of ageing on tenderness. Cooking pre- and at-rigor meat resulted in severe muscle contraction, as evidenced by the differences in sarcomere length of the cooked samples. Mean sarcomere lengths in the pre- and at-rigor samples ranged from 1.05 to 1.20 μm. The mean sarcomere length in the post rigor samples was 1.44 μm. Cooking for 3 h at 100 °C did improve the tenderness of pre- and at-rigor prepared meat as compared to cooking to 70 °C, but not to the extent that ageing did. It is concluded that additional intervention methods are needed to improve the tenderness of prerigor cooked meat. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. An integrated quality function deployment and capital budgeting methodology for occupational safety and health as a systems thinking approach: the case of the construction industry.

    Science.gov (United States)

    Bas, Esra

    2014-07-01

    In this paper, an integrated methodology for Quality Function Deployment (QFD) and a 0-1 knapsack model is proposed for occupational safety and health as a systems thinking approach. The House of Quality (HoQ) in QFD methodology is a systematic tool to consider the inter-relationships between two factors. In this paper, three HoQs are used to consider the interrelationships between tasks and hazards, hazards and events, and events and preventive/protective measures. The final priority weights of events are defined by considering their project-specific preliminary weights, probability of occurrence, and effects on the victim and the company. The priority weights of the preventive/protective measures obtained in the last HoQ are fed into a 0-1 knapsack model for the investment decision. Then, the selected preventive/protective measures can be adapted to the task design. The proposed step-by-step methodology can be applied to any stage of a project to design the workplace for occupational safety and health, and continuous improvement for safety is endorsed by the closed loop characteristic of the integrated methodology. Copyright © 2013 Elsevier Ltd. All rights reserved.
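
    The final step, feeding the priority weights of preventive/protective measures into a 0-1 knapsack model for the investment decision, can be sketched as below. The measure names, costs and weights are hypothetical; only the knapsack formulation comes from the abstract:

```python
def select_measures(measures, budget):
    """0-1 knapsack: choose measures maximizing total priority weight
    without exceeding the (integer) budget."""
    # dp[b] = (best achievable weight, measures chosen) at budget b
    dp = [(0.0, [])] * (budget + 1)
    for name, cost, weight in measures:
        for b in range(budget, cost - 1, -1):  # reverse: each measure used once
            cand_w = dp[b - cost][0] + weight
            if cand_w > dp[b][0]:
                dp[b] = (cand_w, dp[b - cost][1] + [name])
    return dp[budget]

# hypothetical measures: (name, cost in budget units, priority weight from the HoQ)
measures = [("guardrails", 4, 0.30), ("training", 3, 0.25),
            ("helmets", 2, 0.22), ("signage", 1, 0.04)]
total, chosen = select_measures(measures, budget=6)
```

With a budget of 6 units this picks guardrails plus helmets (total weight 0.52), illustrating how the HoQ priority weights drive the investment decision.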

  19. Systematic reviews and meta-analyses on psoriasis: role of funding sources, conflict of interest and bibliometric indices as predictors of methodological quality.

    Science.gov (United States)

    Gómez-García, F; Ruano, J; Aguilar-Luque, M; Gay-Mimbrera, J; Maestre-Lopez, B; Sanz-Cabanillas, J L; Carmona-Fernández, P J; González-Padilla, M; Vélez García-Nieto, A; Isla-Tejera, B

    2017-06-01

    The quality of systematic reviews and meta-analyses on psoriasis, a chronic inflammatory skin disease that severely impairs quality of life and is associated with high costs, remains unknown. To assess the methodological quality of systematic reviews published on psoriasis. After a comprehensive search in MEDLINE, Embase and the Cochrane Database (PROSPERO: CDR42016041611), the quality of studies was assessed by two raters using the Assessment of Multiple Systematic Reviews (AMSTAR) tool. Article metadata and journal-related bibliometric indices were also obtained. Systematic reviews were classified as low (0-4), moderate (5-8) or high (9-11) quality. A prediction model for methodological quality was fitted using principal component and multivariate ordinal logistic regression analyses. We classified 220 studies as high (17·2%), moderate (55·0%) or low (27·8%) quality. Lower compliance rates were found for AMSTAR question (Q)5 (list of studies provided, 11·4%), Q10 (publication bias assessed, 27·7%), Q4 (status of publication included, 39·5%) and Q1 (a priori design provided, 40·9%). Factors such as meta-analysis inclusion [odds ratio (OR) 6·22; 95% confidence interval (CI) 2·78-14·86], funding by academic institutions (OR 2·90, 95% CI 1·11-7·89), Article Influence score (OR 2·14, 95% CI 1·05-6·67), 5-year impact factor (OR 1·34, 95% CI 1·02-1·40) and article page count (OR 1·08, 95% CI 1·02-1·15) significantly predicted higher quality. A high number of authors with a conflict of interest (OR 0·90, 95% CI 0·82-0·99) was significantly associated with lower quality. The methodological quality of systematic reviews published about psoriasis remains suboptimal. The type of funding sources and author conflicts may compromise study quality, increasing the risk of bias. © 2017 British Association of Dermatologists.

  20. MR imaging of the articular cartilage of the knee with arthroscopy as gold standard: assessment of methodological quality of clinical studies

    International Nuclear Information System (INIS)

    Duchateau, Florence; Berg, Bruno C. vande

    2002-01-01

    The purpose of this study was to assess the methodological quality of articles addressing the value of MR imaging of the knee cartilage with arthroscopy as a standard. Relevant papers were selected after Medline review (MEDLINE database search including the terms "cartilage", "knee", "MR" and "arthroscopy"). Two observers independently reviewed 29 selected articles to determine how each study had met 15 individual standards that had been previously developed to assess the methodological quality of clinical investigations. The following criteria were met in a variable percentage of articles: adequate definition of purpose (100%), statistical analysis (90%), avoidance of verification bias (86%), patient population description (83%), reference standard (79%), review bias (79%), study design (66%), inclusion criteria (41%), method of analysis (41.5%), avoidance of diagnostic-review bias (24%), exclusion criteria (21%), indeterminate examination results (17%), analysis criteria (14%), interobserver reliability (14%) and intraobserver reliability (7%). The assessment of the methodological quality of clinical investigations addressing the value of MR imaging in the evaluation of the articular cartilage of the knee with arthroscopy as the standard of reference demonstrated that several standards were rarely met in the literature. Efforts should be made to rely on clearly defined lesion criteria and to determine the reliability of the observations. (orig.)

  1. 78 FR 63972 - Notice of Proposed Methodology for the 2014 Delaware River and Bay Water Quality Assessment Report

    Science.gov (United States)

    2013-10-25

    ... Water Quality Assessment Report AGENCY: Delaware River Basin Commission. ACTION: Notice. SUMMARY: Notice....us , with ``Water Quality Assessment 2014'' as the subject line; via fax to 609-883-9522; via U.S. Mail to DRBC, Attn: Water Quality Assessment 2014, P.O. Box 7360, West Trenton, NJ 08628-0360; via...

  2. 76 FR 50188 - Notice of Proposed Methodology for the Delaware River and Bay Integrated List Water Quality...

    Science.gov (United States)

    2011-08-12

    ... Integrated List Water Quality Assessment AGENCY: Delaware River Basin Commission. ACTION: Notice. SUMMARY... Integrated List Water Quality Assessment is available for review and comment. DATES: Comments must be... should have the phrase ``Water Quality Assessment 2012'' in the subject line and should include the name...

  3. Estimation of the breaking of rigor mortis by myotonometry.

    Science.gov (United States)

    Vain, A; Kauppila, R; Vuori, E

    1996-05-31

    Myotonometry was used to detect breaking of rigor mortis. The myotonometer is a new instrument which measures the decaying oscillations of a muscle after a brief mechanical impact. The method gives two numerical parameters for rigor mortis, namely the period and decrement of the oscillations, both of which depend on the time elapsed after death. When rigor mortis was broken by muscle lengthening, both the oscillation period and decrement decreased, whereas shortening the muscle caused the opposite changes. Fourteen hours after breaking, the stiffness characteristics (oscillation periods) of the right and left m. biceps brachii had converged. However, the values for the decrement of the muscle, reflecting the dissipation of mechanical energy, maintained their differences.
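
    The two parameters named above, oscillation period and logarithmic decrement, can be estimated from a sampled decaying oscillation roughly as follows. The abstract does not describe the instrument's actual signal processing, so this is only a sketch on a synthetic signal:

```python
import math

def period_and_decrement(signal, dt):
    """Estimate oscillation period and logarithmic decrement
    (ln of the ratio of successive peak amplitudes) from samples."""
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] >= signal[i + 1]]
    if len(peaks) < 2:
        raise ValueError("need at least two oscillation peaks")
    period = (peaks[-1] - peaks[0]) * dt / (len(peaks) - 1)
    decs = [math.log(signal[a] / signal[b]) for a, b in zip(peaks, peaks[1:])]
    return period, sum(decs) / len(decs)

# synthetic decaying oscillation: 20 Hz carrier with envelope exp(-6 t)
dt = 0.0005
sig = [math.exp(-6.0 * i * dt) * math.cos(2 * math.pi * 20 * i * dt)
       for i in range(400)]
T, dec = period_and_decrement(sig, dt)  # expect T ≈ 0.05 s, decrement ≈ 0.3
```

A stiffer (more rigid) muscle oscillates with a shorter period, and a more dissipative one shows a larger decrement, which is why the pair of numbers tracks rigor state.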

  4. An algorithm to assess methodological quality of nutrition and mortality cross-sectional surveys: development and application to surveys conducted in Darfur, Sudan

    Directory of Open Access Journals (Sweden)

    Prudhon Claudine

    2011-11-01

    Background: Nutrition and mortality surveys are the main tools whereby evidence on the health status of populations affected by disasters and armed conflict is quantified and monitored over time. Several reviews have consistently revealed a lack of rigor in many surveys. We describe an algorithm for analyzing nutritional and mortality survey reports to identify a comprehensive range of errors that may result in sampling, response, or measurement biases and score quality. We apply the algorithm to surveys conducted in Darfur, Sudan. Methods: We developed an algorithm based on internationally agreed upon methods and best practices. Penalties are attributed for a list of errors, and an overall score is built from the summation of penalties accrued by the survey as a whole. To test the algorithm reproducibility, it was independently applied by three raters on 30 randomly selected survey reports. The algorithm was further applied to more than 100 surveys conducted in Darfur, Sudan. Results: The Intra Class Correlation coefficient was 0.79 for mortality surveys and 0.78 for nutrition surveys. The overall median quality score and range of about 100 surveys conducted in Darfur were 0.60 (0.12-0.93) and 0.675 (0.23-0.86) for mortality and nutrition surveys, respectively. They varied between the organizations conducting the surveys, with no major trend over time. Conclusion: Our study suggests that it is possible to systematically assess the quality of surveys and reveals considerable problems with the quality of nutritional and particularly mortality surveys conducted in the Darfur crisis.

  5. Rating the methodological quality of single-subject designs and n-of-1 trials: introducing the Single-Case Experimental Design (SCED) Scale.

    Science.gov (United States)

    Tate, Robyn L; McDonald, Skye; Perdices, Michael; Togher, Leanne; Schultz, Regina; Savage, Sharon

    2008-08-01

    Rating scales that assess methodological quality of clinical trials provide a means to critically appraise the literature. Scales are currently available to rate randomised and non-randomised controlled trials, but there are none that assess single-subject designs. The Single-Case Experimental Design (SCED) Scale was developed for this purpose and evaluated for reliability. Six clinical researchers who were trained and experienced in rating methodological quality of clinical trials developed the scale and participated in reliability studies. The SCED Scale is an 11-item rating scale for single-subject designs, of which 10 items are used to assess methodological quality and use of statistical analysis. The scale was developed and refined over a 3-year period. Content validity was addressed by identifying items to reduce the main sources of bias in single-case methodology as stipulated by authorities in the field, which were empirically tested against 85 published reports. Inter-rater reliability was assessed using a random sample of 20/312 single-subject reports archived in the Psychological Database of Brain Impairment Treatment Efficacy (PsycBITE). Inter-rater reliability for the total score was excellent, both for individual raters (overall ICC = 0.84; 95% confidence interval 0.73-0.92) and for consensus ratings between pairs of raters (overall ICC = 0.88; 95% confidence interval 0.78-0.95). Item reliability was fair to excellent for consensus ratings between pairs of raters (range k = 0.48 to 1.00). The results were replicated with two independent novice raters who were trained in the use of the scale (ICC = 0.88, 95% confidence interval 0.73-0.95). The SCED Scale thus provides a brief and valid evaluation of methodological quality of single-subject designs, with the total score demonstrating excellent inter-rater reliability using both individual and consensus ratings. Items from the scale can also be used as a checklist in the design, reporting and critical
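
    An inter-rater ICC of the kind reported above can be computed from an n-subjects × k-raters score matrix. The sketch below uses the Shrout-Fleiss two-way random-effects, absolute-agreement, single-rater form ICC(2,1), one common choice; the abstract does not specify which variant was used, and the ratings are invented:

```python
def icc_2_1(scores):
    """Shrout-Fleiss ICC(2,1): two-way random effects, absolute agreement,
    single rater. scores: n subjects x k raters."""
    n, k = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(row[j] for row in scores) / n for j in range(k)]
    ssr = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ssc = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    sst = sum((x - grand) ** 2 for row in scores for x in row)
    mse = (sst - ssr - ssc) / ((n - 1) * (k - 1))        # residual
    msr, msc = ssr / (n - 1), ssc / (k - 1)
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# hypothetical example: two raters scoring five single-case reports
ratings = [[9, 8], [7, 7], [4, 5], [10, 9], [6, 6]]
print(round(icc_2_1(ratings), 2))  # → 0.93
```

Perfect agreement between raters yields an ICC of 1.0, and values around 0.84-0.88, as reported for the SCED total score, indicate excellent reliability by conventional benchmarks.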

  6. Physiological studies of muscle rigor mortis in the fowl

    International Nuclear Information System (INIS)

    Nakahira, S.; Kaneko, K.; Tanaka, K.

    1990-01-01

    A simple system was developed for continuous measurement of muscle contraction during rigor mortis. Longitudinal muscle strips dissected from the Peroneus Longus were suspended in a plastic tube containing liquid paraffin. Mechanical activity was transmitted to a strain-gauge transducer connected to a potentiometric pen-recorder. At the onset of measurement, 1.2 g was loaded on the muscle strip. This model was used to study the muscle response to various treatments during rigor mortis. All measurements were carried out under anaerobic conditions at 17 °C, except where otherwise stated. 1. The present system was found to be quite useful for continuous measurement of the course of muscle rigor. 2. Muscle contraction under anaerobic conditions at 17 °C reached a peak about 2 hours after the onset of measurement and thereafter relaxed at a slow rate. In contrast, aerobic conditions under high humidity resulted in a strong rigor, about three times stronger than that under anaerobic conditions. 3. Ultrasonic treatment (37,000-47,000 Hz) at 25 °C for 10 minutes resulted in moderate muscle rigor. 4. Treatment of the muscle strip with 2 mM EGTA at 30 °C for 30 minutes led to relaxation of the muscle. 5. Muscle from birds killed during anesthesia with pentobarbital sodium showed a slow rate of rigor, whereas birds killed one day after hypophysectomy showed quick muscle rigor, as seen in intact controls. 6. A slight muscle rigor was observed when the muscle strip was kept in a refrigerator at 0 °C for 18.5 hours and thereafter at 17 °C. (author)

  7. Implications of the Integration of Computing Methodologies into Conventional Marketing Research upon the Quality of Students' Understanding of the Concept

    Science.gov (United States)

    Ayman, Umut; Serim, Mehmet Cenk

    2004-01-01

    It has been an ongoing concern among academicians teaching social sciences to develop better methodologies to ease students' understanding. Since verbal emphasis is at the core of the concepts within such disciplines, it has been observed that the adequate or desired level of conceptual understanding of the students to transform the theories…

  8. Paper 3: Content and Rigor of Algebra Credit Recovery Courses

    Science.gov (United States)

    Walters, Kirk; Stachel, Suzanne

    2014-01-01

    This paper describes the content, organization and rigor of the f2f and online summer algebra courses that were delivered in summers 2011 and 2012. Examining the content of both types of courses is important because research suggests that algebra courses with certain features may be better than others in promoting success for struggling students.…

  9. A rigorous treatment of uncertainty quantification for Silicon damage metrics

    International Nuclear Information System (INIS)

    Griffin, P.

    2016-01-01

    This report summarizes the contributions made by Sandia National Laboratories in support of the International Atomic Energy Agency (IAEA) Nuclear Data Section (NDS) Technical Meeting (TM) on Nuclear Reaction Data and Uncertainties for Radiation Damage. This work focused on a rigorous treatment of the uncertainties affecting the characterization of the displacement damage seen in silicon semiconductors. (author)

  10. Effects of post mortem temperature on rigor tension, shortening and ...

    African Journals Online (AJOL)

    Fully developed rigor mortis in muscle is characterised by maximum loss of extensibility. The course of post mortem changes in ostrich muscle was studied by following isometric tension, shortening and change in pH during the first 24 h post mortem within muscle strips from the muscularis gastrocnemius, pars interna at ...

  11. Characterization of rigor mortis of longissimus dorsi and triceps ...

    African Journals Online (AJOL)

    24 h) of the longissimus dorsi (LD) and triceps brachii (TB) muscles as well as the shear force (meat tenderness) and colour were evaluated, aiming at characterizing the rigor mortis in the meat during industrial processing. Data statistic treatment demonstrated that carcass temperature and pH decreased gradually during ...

  12. Rigor, vigor, and the study of health disparities.

    Science.gov (United States)

    Adler, Nancy; Bush, Nicole R; Pantell, Matthew S

    2012-10-16

    Health disparities research spans multiple fields and methods and documents strong links between social disadvantage and poor health. Associations between socioeconomic status (SES) and health are often taken as evidence for the causal impact of SES on health, but alternative explanations, including the impact of health on SES, are plausible. Studies showing the influence of parents' SES on their children's health provide evidence for a causal pathway from SES to health, but have limitations. Health disparities researchers face tradeoffs between "rigor" and "vigor" in designing studies that demonstrate how social disadvantage becomes biologically embedded and results in poorer health. Rigorous designs aim to maximize precision in the measurement of SES and health outcomes through methods that provide the greatest control over temporal ordering and causal direction. To achieve precision, many studies use a single SES predictor and single disease. However, doing so oversimplifies the multifaceted, entwined nature of social disadvantage and may overestimate the impact of that one variable and underestimate the true impact of social disadvantage on health. In addition, SES effects on overall health and functioning are likely to be greater than effects on any one disease. Vigorous designs aim to capture this complexity and maximize ecological validity through more complete assessment of social disadvantage and health status, but may provide less-compelling evidence of causality. Newer approaches to both measurement and analysis may enable enhanced vigor as well as rigor. Incorporating both rigor and vigor into studies will provide a fuller understanding of the causes of health disparities.

  13. A rigorous proof for the Landauer-Büttiker formula

    DEFF Research Database (Denmark)

    Cornean, Horia Decebal; Jensen, Arne; Moldoveanu, V.

Recently, Avron et al. shed new light on the question of quantum transport in mesoscopic samples coupled to particle reservoirs by semi-infinite leads. They rigorously treat the case when the sample undergoes an adiabatic evolution, thus generating a current through the leads, and prove the so-call...

  14. Are methodological quality and completeness of reporting associated with citation-based measures of publication impact? A secondary analysis of a systematic review of dementia biomarker studies.

    Science.gov (United States)

    Mackinnon, Shona; Drozdowska, Bogna A; Hamilton, Michael; Noel-Storr, Anna H; McShane, Rupert; Quinn, Terry

    2018-03-22

To determine whether methodological and reporting quality are associated with surrogate measures of publication impact in the field of dementia biomarker studies. We assessed dementia biomarker studies included in a previous systematic review in terms of methodological and reporting quality, using the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) and Standards for Reporting of Diagnostic Accuracy (STARD), respectively. We extracted additional study- and journal-related data from each publication to account for factors shown to be associated with impact in previous research. We explored associations between potential determinants and measures of publication impact in univariable and stepwise multivariable linear regression analyses. We aimed to collect data on four measures of publication impact: two traditional measures, the average number of citations per year and the 5-year impact factor of the publishing journal, and two alternative measures, the Altmetric Attention Score and counts of electronic downloads. The systematic review included 142 studies. Due to limited data, Altmetric Attention Scores and electronic downloads were excluded from the analysis, leaving traditional metrics as the only analysed outcome measures. We found no relationship between QUADAS and traditional metrics. Citation rates were independently associated with the 5-year journal impact factor (β=0.42), and STARD scores were independently associated with citation rates (β=0.45). Citation rates and 5-year journal impact factor appear to measure different dimensions of impact. Citation rates were weakly associated with completeness of reporting, while neither traditional metric was related to methodological rigour. Our results suggest that high publication usage and journal outlet are not a guarantee of quality, and readers should critically appraise all papers regardless of presumed impact. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted.
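The univariable regression step described in this abstract amounts to ordinary least squares with a single predictor. A minimal pure-Python sketch of that fit (the data below are hypothetical, not the study's):

```python
def linreg(x, y):
    """Ordinary least-squares fit of y = a + b*x (univariable)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical pairs: 5-year journal impact factor vs. citations per year
impact = [1.0, 2.0, 3.0, 4.0, 5.0]
cites = [2.1, 3.9, 6.2, 7.8, 10.0]
intercept, slope = linreg(impact, cites)
```

A positive fitted slope would correspond to the kind of association the authors report between impact factor and citation rate.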

  15. Rigorous simulation: a tool to enhance decision making

    Energy Technology Data Exchange (ETDEWEB)

    Neiva, Raquel; Larson, Mel; Baks, Arjan [KBC Advanced Technologies plc, Surrey (United Kingdom)

    2012-07-01

The world's refining industries continue to be challenged by population growth (increased demand), regional market changes and the pressure of regulatory requirements to operate a 'green' refinery. Environmental regulations are reducing the value and use of heavy fuel oils, leading refiners to convert more of the heavier products, or even heavier crude, into lighter products while meeting increasingly stringent transportation fuel specifications. As a result, actions are required to establish a sustainable advantage for future success. Rigorous simulation provides a key advantage, improving the timing and efficient use of capital investment and maximizing profitability. Sustainably maximizing profit through rigorous modeling is achieved through enhanced performance monitoring and improved Linear Programme (LP) model accuracy. This paper contains examples of these two items. The combination of both increases overall rates of return. As refiners consider optimizing existing assets and expanding projects, the process agreed to achieve these goals is key to successful profit improvement. Rigorous kinetic simulation with detailed fractionation allows optimization of existing asset utilization while focusing capital investment on the new unit(s), thereby optimizing the overall strategic plan and return on investment. Monitoring of individual process units works as a mechanism for validating and optimizing plant performance. Unit monitoring is important to rectify poor performance and increase profitability. The key to a good LP relies upon the accuracy of the data used to generate the LP sub-model data. The value of rigorous unit monitoring is that the results are heat- and mass-balanced consistently and are unique to a refiner's unit/refinery. With the improved match of refinery operation, the rigorous simulation models will allow capturing more accurately the nonlinearity of those process units and therefore provide correct

  16. Use of Multiple Methodologies for Developing a Customer-Oriented Model of Total Quality Management in Higher Education

    Science.gov (United States)

    Sahney, Sangeeta

    2016-01-01

    Purpose: Educational institutes must embrace the principles of total quality management (TQM) if they seek to remain competitive, and survive and succeed in the long run. An educational institution must embrace the principles of quality management and incorporate them into all of their activities. Starting with a theoretical background, the paper…

  17. A review of quality assessment of the methodology used in guidelines and systematic reviews on oral mucositis.

    NARCIS (Netherlands)

    Potting, C.; Mistiaen, P.; Poot, E.; Blijlevens, N.; Donnelly, P.; Achterberg, T. van

    2009-01-01

Aims and objectives: The objective of this study was to identify and assess the quality of evidence-based guidelines and systematic reviews used in the case of oral mucositis, and to apply general quality criteria for the prevention and treatment of oral mucositis in patients receiving

  18. Useful, Used, and Peer Approved: The Importance of Rigor and Accessibility in Postsecondary Research and Evaluation. WISCAPE Viewpoints

    Science.gov (United States)

    Vaade, Elizabeth; McCready, Bo

    2012-01-01

    Traditionally, researchers, policymakers, and practitioners have perceived a tension between rigor and accessibility in quantitative research and evaluation in postsecondary education. However, this study indicates that both producers and consumers of these studies value high-quality work and clear findings that can reach multiple audiences. The…

  19. A Methodology for Measuring Voice Quality Using PESQ and Interactive Voice Response in the GSM Channel Designed by OpenBTS

    Directory of Open Access Journals (Sweden)

    Pavol Partila

    2013-01-01

Full Text Available This article discusses a methodology for rating the quality of mobile calls. Most telecommunications service worldwide is delivered over mobile telephony networks. One of the problems affecting this service and its quality is landscape barriers, which block signal propagation. The price and complex construction of a classic BTS do not allow dense deployment. In such cases, one solution is to use OpenBTS technology. The design of OpenBTS is more affordable, so it can be deployed in many more places, including more difficult points. The purpose of this measurement is a model for effective station deployment, given the shape and distribution of local barriers that reduce signal power and thus the quality of speech. The GSM access point for our mobile terminals is an OpenBTS USRP N210 station. The PESQ method for evaluating speech quality is compared with a subjective evaluation provided by an Asterisk PBX with IVR call-back. The measurement method took into account call quality as a function of terminal position. The measured results and their processing make it possible to use this technology in more complicated locations with degraded signal levels and to increase the quality of voice services in telecommunications.

  20. F-15 inlet/engine test techniques and distortion methodologies studies. Volume 2: Time variant data quality analysis plots

    Science.gov (United States)

    Stevens, C. H.; Spong, E. D.; Hammock, M. S.

    1978-01-01

    Time variant data quality analysis plots were used to determine if peak distortion data taken from a subscale inlet model can be used to predict peak distortion levels for a full scale flight test vehicle.

  1. Quality control methodology and implementation of X-radiation standards beams, mammography level, following the standard IEC 61267

    International Nuclear Information System (INIS)

    Correa, Eduardo de Lima

    2010-01-01

In this work a quality control program was developed and applied for the X radiation system (160 kV, constant potential, tungsten target) of the Calibration Laboratory of IPEN (LCI) in the energy range relative to mammography beams (from 25 kV to 35 kV). The standard X radiation beams, mammography level, using molybdenum and aluminum as additional filtration, were established after the application of this quality control program, following national and international recommendations. The reference ionization chamber has traceability to PTB and was regularly submitted to quality control tests for evaluation and analysis of its performance. The radiation qualities of beams emerging from the X-radiation assembly (RQR-M), beams attenuated by an aluminum added filter simulating a phantom (RQA-M), narrow-beam conditions (RQN-M) and broad-beam conditions (RQB-M) were established following the recommendations of the international standard IEC 61267 (2005) and the IAEA code of practice TRS 457 (2007). For the implementation of the RQN-M and RQB-M radiation qualities, two mammography phantoms were developed. The half-value layers found agree with those presented by the German primary laboratory PTB, and varied from 0.35 to 1.21 mm Al. The air kerma rates were obtained for all 15 implemented qualities. (author)
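Half-value layers like those quoted above are conventionally obtained from transmission measurements at increasing absorber thickness. A sketch of first-HVL estimation by log-linear interpolation (illustrative only; the measurement points below are hypothetical, not the laboratory's data):

```python
import math

def first_hvl(thickness_mm, transmission):
    """First half-value layer (HVL): the absorber thickness at which
    transmission falls to 50% of the unattenuated value. Interpolates
    linearly in ln(transmission), which is exact for a purely
    exponential attenuation curve."""
    for i in range(len(thickness_mm) - 1):
        t0, t1 = thickness_mm[i], thickness_mm[i + 1]
        tr0, tr1 = transmission[i], transmission[i + 1]
        if tr0 >= 0.5 >= tr1:  # the 50% point lies in this interval
            f = (math.log(0.5) - math.log(tr0)) / (math.log(tr1) - math.log(tr0))
            return t0 + f * (t1 - t0)
    raise ValueError("transmission never crosses 0.5")

# Hypothetical measurement: added Al thickness (mm) vs. relative transmission
hvl = first_hvl([0.0, 0.2, 0.6], [1.0, 0.5 ** 0.5, 0.5 ** 1.5])  # -> 0.4 mm Al
```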

  2. Rigor force responses of permeabilized fibres from fast and slow skeletal muscles of aged rats.

    Science.gov (United States)

    Plant, D R; Lynch, G S

    2001-09-01

    1. Ageing is generally associated with a decline in skeletal muscle mass and strength and a slowing of muscle contraction, factors that impact upon the quality of life for the elderly. The mechanisms underlying this age-related muscle weakness have not been fully resolved. The purpose of the present study was to determine whether the decrease in muscle force as a consequence of age could be attributed partly to a decrease in the number of cross-bridges participating during contraction. 2. Given that the rigor force is proportional to the approximate total number of interacting sites between the actin and myosin filaments, we tested the null hypothesis that the rigor force of permeabilized muscle fibres from young and old rats would not be different. 3. Permeabilized fibres from the extensor digitorum longus (fast-twitch; EDL) and soleus (predominantly slow-twitch) muscles of young (6 months of age) and old (27 months of age) male F344 rats were activated in Ca2+-buffered solutions to determine force-pCa characteristics (where pCa = -log(10)[Ca2+]) and then in solutions lacking ATP and Ca2+ to determine rigor force levels. 4. The rigor forces for EDL and soleus muscle fibres were not different between young and old rats, indicating that the approximate total number of cross-bridges that can be formed between filaments did not decline with age. We conclude that the age-related decrease in force output is more likely attributed to a decrease in the force per cross-bridge and/or decreases in the efficiency of excitation-contraction coupling.

  3. Einstein's Theory A Rigorous Introduction for the Mathematically Untrained

    CERN Document Server

    Grøn, Øyvind

    2011-01-01

This book provides an introduction to the theory of relativity and the mathematics used in its processes. Three elements of the book make it stand apart from previously published books on the theory of relativity. First, the book starts at a lower mathematical level than standard books with tensor calculus of sufficient maturity to make it possible to give detailed calculations of relativistic predictions of practical experiments. Self-contained introductions are given, for example, to vector calculus, differential calculus and integration. Second, in-between calculations have been included, making it possible for the non-technical reader to follow step-by-step calculations. Third, the conceptual development is gradual and rigorous in order to provide the inexperienced reader with a philosophically satisfying understanding of the theory.  Einstein's Theory: A Rigorous Introduction for the Mathematically Untrained aims to provide the reader with a sound conceptual understanding of both the special and genera...

  4. Rigor mortis in an unusual position: Forensic considerations.

    Science.gov (United States)

    D'Souza, Deepak H; Harish, S; Rajesh, M; Kiran, J

    2011-07-01

We report a case in which the dead body was found with rigor mortis in an unusual position. The dead body was lying on its back with limbs raised, defying gravity. The direction of the salivary stains on the face also defied gravity. We opined that the scene of occurrence of the crime was unlikely to be the final place where the dead body was found. The clues pointed to a homicidal offence and an attempt to destroy the evidence. The forensic use of 'rigor mortis in an unusual position' lies in furthering the investigation and in scientifically confirming two facts: that the scene of death (occurrence) is different from the scene of disposal of the dead body, and that a time gap exists between the two.

  5. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1986-01-01

Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first- and second-order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first-order formulation satisfies the conditions of the Hille-Yosida theorem. A foundation is laid thereby within which the domains associated with the first- and second-order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0

  6. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1985-05-01

    Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first and second order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first order formulation satisfies the conditions of the Hille-Yosida theorem. A foundation is laid thereby within which the domains associated with the first and second order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0

  7. Rigorous results on measuring the quark charge below color threshold

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1979-01-01

    Rigorous theorems are presented showing that contributions from a color nonsinglet component of the current to matrix elements of a second order electromagnetic transition are suppressed by factors inversely proportional to the energy of the color threshold. Parton models which obtain matrix elements proportional to the color average of the square of the quark charge are shown to neglect terms of the same order of magnitude as terms kept. (author)

  8. Striation Patterns of Ox Muscle in Rigor Mortis

    Science.gov (United States)

    Locker, Ronald H.

    1959-01-01

    Ox muscle in rigor mortis offers a selection of myofibrils fixed at varying degrees of contraction from sarcomere lengths of 3.7 to 0.7 µ. A study of this material by phase contrast and electron microscopy has revealed four distinct successive patterns of contraction, including besides the familiar relaxed and contracture patterns, two intermediate types (2.4 to 1.9 µ, 1.8 to 1.5 µ) not previously well described. PMID:14417790

  9. Rigorous Analysis of a Randomised Number Field Sieve

    OpenAIRE

    Lee, Jonathan; Venkatesan, Ramarathnam

    2018-01-01

    Factorisation of integers $n$ is of number theoretic and cryptographic significance. The Number Field Sieve (NFS) introduced circa 1990, is still the state of the art algorithm, but no rigorous proof that it halts or generates relationships is known. We propose and analyse an explicitly randomised variant. For each $n$, we show that these randomised variants of the NFS and Coppersmith's multiple polynomial sieve find congruences of squares in expected times matching the best-known heuristic e...

  10. Reciprocity relations in transmission electron microscopy: A rigorous derivation.

    Science.gov (United States)

    Krause, Florian F; Rosenauer, Andreas

    2017-01-01

A concise derivation of the principle of reciprocity applied to realistic transmission electron microscopy setups is presented, making use of the multislice formalism. The equivalence of images acquired in conventional and scanning mode is thereby rigorously shown. The conditions for the applicability of the derived reciprocity relations are discussed. Furthermore, the positions of apertures in relation to the corresponding lenses are considered, a subject which has scarcely been addressed in previous publications. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry.

    Science.gov (United States)

    Morse, Janice M

    2015-09-01

    Criteria for determining the trustworthiness of qualitative research were introduced by Guba and Lincoln in the 1980s when they replaced terminology for achieving rigor, reliability, validity, and generalizability with dependability, credibility, and transferability. Strategies for achieving trustworthiness were also introduced. This landmark contribution to qualitative research remains in use today, with only minor modifications in format. Despite the significance of this contribution over the past four decades, the strategies recommended to achieve trustworthiness have not been critically examined. Recommendations for where, why, and how to use these strategies have not been developed, and how well they achieve their intended goal has not been examined. We do not know, for example, what impact these strategies have on the completed research. In this article, I critique these strategies. I recommend that qualitative researchers return to the terminology of social sciences, using rigor, reliability, validity, and generalizability. I then make recommendations for the appropriate use of the strategies recommended to achieve rigor: prolonged engagement, persistent observation, and thick, rich description; inter-rater reliability, negative case analysis; peer review or debriefing; clarifying researcher bias; member checking; external audits; and triangulation. © The Author(s) 2015.

  12. Quality of Work Life: Theoretical and Methodological Problems, and Presentation of a New Model and Measuring Instrument

    Science.gov (United States)

    Martel, Jean-Pierre; Dupuis, Gilles

    2006-01-01

    Purpose: Ever since the concept of Quality of Work Life (QWL) was first used over 30 years ago, a range of definitions and theoretical constructs have succeeded each other with the aim of mitigating the many problems facing the concept. A historical overview of the concept of QWL is presented here. Given the lack of consensus concerning the…

  13. 40 CFR Appendix A to Part 132 - Great Lakes Water Quality Initiative Methodologies for Development of Aquatic Life Criteria and...

    Science.gov (United States)

    2010-07-01

    .... If the acute toxicity of the material to aquatic animals has been shown to be related to a water... material to aquatic animals has been shown to be related to a water quality characteristic such as hardness... a material in the water column to which an aquatic community can be exposed briefly without...

  14. Methodology for calculating perception of the user experience of the quality of monitored integrated telecommunications operator services

    NARCIS (Netherlands)

    2011-01-01

    The invention relates to a method for calculating perception of the user experience of the quality of monitored integrated telecommunications operator services. For this purpose, data from the monitoring of user services is used, along with questionnaires previously completed by a representative

  15. Integrating cost information with health management support system: an enhanced methodology to assess health care quality drivers.

    Science.gov (United States)

    Kohli, R; Tan, J K; Piontek, F A; Ziege, D E; Groot, H

    1999-08-01

    Changes in health care delivery, reimbursement schemes, and organizational structure have required health organizations to manage the costs of providing patient care while maintaining high levels of clinical and patient satisfaction outcomes. Today, cost information, clinical outcomes, and patient satisfaction results must become more fully integrated if strategic competitiveness and benefits are to be realized in health management decision making, especially in multi-entity organizational settings. Unfortunately, traditional administrative and financial systems are not well equipped to cater to such information needs. This article presents a framework for the acquisition, generation, analysis, and reporting of cost information with clinical outcomes and patient satisfaction in the context of evolving health management and decision-support system technology. More specifically, the article focuses on an enhanced costing methodology for determining and producing improved, integrated cost-outcomes information. Implementation issues and areas for future research in cost-information management and decision-support domains are also discussed.

  16. Methodology for setup and data processing of mobile air quality measurements to assess the spatial variability of concentrations in urban environments

    International Nuclear Information System (INIS)

    Van Poppel, Martine; Peters, Jan; Bleux, Nico

    2013-01-01

A case study is presented to illustrate a methodology for mobile monitoring in urban environments. A dataset of UFP, PM2.5 and BC concentrations was collected. We showed that repeated mobile measurements can give insight into the spatial variability of pollutants across different micro-environments in a city. Streets of contrasting traffic intensity showed increased concentrations by a factor of 2-3 for UFP and BC, and to a lesser extent for PM2.5. The first quartile (P25) of the mobile measurements in an urban background zone appears to be a good estimate of the urban background concentration. The local component of the pollutant concentrations was determined by background correction. The use of background correction reduced the number of runs needed to obtain representative results. The results presented are a first attempt to establish a methodology for the setup and data processing of mobile air quality measurements to assess the spatial variability of concentrations in urban environments. -- Highlights: ► Mobile measurements are used to assess the variability of air pollutants in urban environments. ► PM2.5, BC and UFP concentrations are presented for zones with different traffic characteristics. ► A methodology for background correction based on the mobile measurements is presented. ► The background concentration is estimated as the 25th percentile of the urban background data. ► The minimum number of runs for a representative estimate is reduced after background correction. -- This paper shows that the spatial variability of air pollutants in an urban environment can be assessed by a mobile monitoring methodology including background correction
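The background-correction step this record describes (urban background estimated as the 25th percentile of background-zone measurements, then subtracted from each on-route concentration) can be sketched in a few lines; the function name and data are hypothetical, not from the study:

```python
import statistics

def local_component(run, background_zone):
    """Background-correct one mobile run: the urban background is taken
    as the 25th percentile (P25) of measurements from the urban
    background zone and subtracted from each on-route concentration,
    floored at zero."""
    p25 = statistics.quantiles(background_zone, n=4)[0]  # first quartile
    return [max(c - p25, 0.0) for c in run]

# Hypothetical concentrations (e.g. BC in µg/m³)
corrected = local_component([5.0, 3.0, 1.0], [1, 2, 3, 4, 5, 6, 7])
```

Subtracting a robust low quantile rather than the minimum avoids letting a single instrument dip define the background.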

  17. Quantifying Averted Disability-Adjusted Life Years as a Performance Indicator for Water Quality Interventions: A Review of Current Methodologies and Challenges

    Directory of Open Access Journals (Sweden)

    Darcy M. Anderson

    2018-06-01

    Full Text Available Sustainable access to safe drinking water protects against infectious disease and promotes overall health. Despite considerable progress toward increasing water access, safe water quality and reliable service delivery remain a challenge. Traditional financing strategies pay implementers based on inputs and activities, with minimal incentives for water quality monitoring and sustained service operation. Pay-for-performance offers an alternative financing strategy that delivers all or a portion of payment based on performance indicators of desired outputs or outcomes. A pay-for-performance approach in the water sector could quantify and incentivize health impact. Averted disability-adjusted life years (ADALYs have been used as a performance indicator to measure the burden of disease averted due to environmental health interventions. Water-related disease burden can be measured for application as an ADALYs performance indicator following either comparative risk assessment or quantitative microbial risk assessment. Comparative risk assessment models disease burden using water source type as a proxy indicator of microbial water quality, while quantitative microbial risk assessment models disease burden using concentrations of indicator pathogens. This paper compares these risk assessment methodologies, and summarizes the limitations of applying these approaches toward quantifying ADALYs as a performance indicator for water quality interventions.
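The ADALYs indicator described above rests on simple burden-of-disease arithmetic: DALYs combine years of life lost (YLL) and years lived with disability (YLD), and the averted burden is the difference between the baseline and post-intervention estimates. A minimal sketch (simplified: no age weighting or time discounting; all numbers hypothetical):

```python
def dalys(cases, deaths, years_lost_per_death, duration_years, disability_weight):
    """Simplified DALY estimate: years of life lost (YLL) plus years
    lived with disability (YLD)."""
    yll = deaths * years_lost_per_death
    yld = cases * duration_years * disability_weight
    return yll + yld

# Hypothetical baseline vs. post-intervention waterborne-disease burden
baseline = dalys(cases=1000, deaths=2, years_lost_per_death=30.0,
                 duration_years=0.01, disability_weight=0.2)
with_intervention = dalys(cases=400, deaths=1, years_lost_per_death=30.0,
                          duration_years=0.01, disability_weight=0.2)
averted = baseline - with_intervention  # the ADALYs performance indicator
```

In a pay-for-performance scheme, payment would then be tied to `averted` rather than to inputs or activities.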

  18. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    Science.gov (United States)

    Gray, Kurt

    2017-09-01

Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: the more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org).

  19. [A methodological approach to assessing the quality of medical health information on its way from science to the mass media].

    Science.gov (United States)

    Serong, Julia; Anhäuser, Marcus; Wormer, Holger

    2015-01-01

    A current research project deals with the question of how the quality of medical health information changes on its way from the academic journal via press releases to the news media. In an exploratory study a sample of 30 news items has been selected stage-by-stage from an adjusted total sample of 1,695 journalistic news items on medical research in 2013. Using a multidimensional set of criteria the news items as well as the corresponding academic articles, abstracts and press releases are examined by science journalists and medical experts. Together with a content analysis of the expert assessments, it will be verified to what extent established quality standards for medical journalism can be applied to medical health communication and public relations or even to studies and abstracts as well. Copyright © 2015. Published by Elsevier GmbH.

  20. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSALOF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Full Text Available Content analysis (CA) is a popular approach among researchers from different areas, but still incipient in project management (PM). The volume of usage, however, apparently does not translate into quality of application: the method receives constant criticism for the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigates the use of CA in PM research. To accomplish this goal, a systematic literature review is combined with a CA of 23 articles from the EBSCO database over the last 20 years (1996-2016). The findings show that the proposed framework can help researchers apply CA better, and suggest that the use of the method in PM research, in terms of both quantity and quality, should be expanded. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM over the last 20 years.

  1. Development of Process Control Methodology for Tracking the Quality and Safety of Pain, Agitation, and Sedation Management in Critical Care Units.

    Science.gov (United States)

    Walsh, Timothy S; Kydonaki, Kalliopi; Lee, Robert J; Everingham, Kirsty; Antonelli, Jean; Harkness, Ronald T; Cole, Stephen; Quasim, Tara; Ruddy, James; McDougall, Marcia; Davidson, Alan; Rutherford, John; Richards, Jonathan; Weir, Christopher J

    2016-03-01

    To develop sedation, pain, and agitation quality measures using process control methodology and evaluate their properties in clinical practice. A Sedation Quality Assessment Tool was developed and validated to capture data for 12-hour periods of nursing care. Domains included pain/discomfort and sedation-agitation behaviors; sedative, analgesic, and neuromuscular blocking drug administration; ventilation status; and conditions potentially justifying deep sedation. Predefined sedation-related adverse events were recorded daily. Using an iterative process, algorithms were developed to describe the proportion of care periods with poor limb relaxation, poor ventilator synchronization, unnecessary deep sedation, agitation, and an overall optimum sedation metric. Proportion charts described processes over time (2 monthly intervals) for each ICU. The numbers of patients treated between sedation-related adverse events were described with G charts. Automated algorithms generated charts for 12 months of sequential data. Mean values for each process were calculated, and variation within and between ICUs explored qualitatively. Eight Scottish ICUs over a 12-month period. Mechanically ventilated patients. None. The Sedation Quality Assessment Tool agitation-sedation domains correlated with the Richmond Sedation Agitation Scale score (Spearman ρ = 0.75) and were reliable in clinician-clinician (weighted kappa; κ = 0.66) and clinician-researcher (κ = 0.82) comparisons. The limb movement domain had fair correlation with Behavioral Pain Scale (ρ = 0.24) and was reliable in clinician-clinician (κ = 0.58) and clinician-researcher (κ = 0.45) comparisons. Ventilator synchronization correlated with Behavioral Pain Scale (ρ = 0.54), and reliability in clinician-clinician (κ = 0.29) and clinician-researcher (κ = 0.42) comparisons was fair-moderate. Eight hundred twenty-five patients were enrolled (range, 59-235 across ICUs), providing 12,385 care periods for evaluation (range 655

  2. Rigorous, robust and systematic: Qualitative research and its contribution to burn care. An integrative review.

    Science.gov (United States)

    Kornhaber, Rachel Anne; de Jong, A E E; McLean, L

    2015-12-01

    Qualitative methods are progressively being implemented by researchers for exploration within healthcare. However, there has been a longstanding and wide-ranging debate concerning the relative merits of qualitative research within the health care literature. This integrative review aimed to examine the contribution of qualitative research to burns care and subsequent rehabilitation. Studies were identified using an electronic search strategy across the databases PubMed, Cumulative Index of Nursing and Allied Health Literature (CINAHL), Excerpta Medica database (EMBASE) and Scopus for peer-reviewed primary research in English published between 2009 and April 2014, using Whittemore and Knafl's integrative review method as a guide for analysis. Of the 298 papers identified, 26 research papers met the inclusion criteria. Across all studies there was an average of 22 participants per study (range 6-53); the studies were conducted across 12 nations and focussed on burns prevention, paediatric burns, appropriate acquisition and delivery of burns care, pain and psychosocial implications of burns trauma. Careful and rigorous application of qualitative methodologies promotes and enriches the development of burns knowledge. In particular, the key elements of the qualitative methodological process and its publication are critical in disseminating credible and methodologically sound qualitative research. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.

  3. METHODOLOGICAL BACKGROUND OF EXPERT ESTIMATION OF INITIAL DATA COMPLETENESS AND QUALITY ACCORDING TO THE CERTIFIED INFORMATION SECURITY SYSTEM

    Directory of Open Access Journals (Sweden)

    V. K. Fisenko

    2015-01-01

    Full Text Available The problem of information security systems certification is analyzed and the tasks of initial data analysis are set out. The objectives, indices and decision-making criteria, as well as the challenges to be addressed, are formulated. It is shown that, in order to improve quality and reduce the time and cost of preparation for certification, it is reasonable to use a software system that automates the analysis of the initial data presented by the owner of the information system.

  4. New rigorous asymptotic theorems for inverse scattering amplitudes

    International Nuclear Information System (INIS)

    Lomsadze, Sh.Yu.; Lomsadze, Yu.M.

    1984-01-01

    The rigorous asymptotic theorems, both of integral and local types, obtained earlier and establishing logarithmic and in some cases even power correlations between the real and imaginary parts of scattering amplitudes Fsub(+-) are extended to the inverse amplitudes 1/Fsub(+-). One also succeeds in establishing power correlations of a new type between the real and imaginary parts, both for the amplitudes themselves and for the inverse ones. All the assertions obtained can conveniently be tested in high-energy experiments when the amplitudes show asymptotic behaviour.

  5. An LWR Design Decision Methodology

    International Nuclear Information System (INIS)

    Leahy, T.J.; Rees, D.C.; Young, J.

    1982-01-01

    While all parties involved in nuclear plant regulation endeavor to make decisions which optimize the considerations of plant safety and financial impacts, these decisions are generally made without the benefit of a systematic and rigorous approach to the questions confronting the decision makers. A Design Decision Methodology has been developed which provides such a systematic approach. By employing this methodology, which makes use of currently accepted probabilistic risk assessment techniques and cost estimation, informed decisions may be made against a background of comparisons between the relative levels of safety and costs associated with various design alternatives

  6. Can green roof act as a sink for contaminants? A methodological study to evaluate runoff quality from green roofs.

    Science.gov (United States)

    Vijayaraghavan, K; Joshi, Umid Man

    2014-11-01

    The present study examines whether green roofs act as a sink or source of contaminants based on various physico-chemical parameters (pH, conductivity and total dissolved solids) and metals (Na, K, Ca, Mg, Al, Fe, Cr, Cu, Ni, Zn, Cd and Pb). The performance of green roof substrate prepared using perlite, vermiculite, sand, crushed brick, and coco-peat, was compared with local garden soil based on improvement of runoff quality. Portulaca grandiflora was used as green roof vegetation. Four different green roof configurations, with vegetated and non-vegetated systems, were examined for several artificial rain events (un-spiked and metal-spiked). In general, the vegetated green roof assemblies generated better-quality runoff with less conductivity and total metal ion concentration compared to un-vegetated assemblies. Of the different green roof configurations examined, P. grandiflora planted on green roof substrate acted as sink for various metals and showed the potential to generate better runoff. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Creating a Methodology for Coordinating High-resolution Air Quality Improvement Map and Greenhouse Gas Mitigation Strategies in Pittsburgh City

    Science.gov (United States)

    Shi, J.; Donahue, N. M.; Klima, K.; Blackhurst, M.

    2016-12-01

    In order to trade off the global impacts of greenhouse gases against the highly local impacts of conventional air pollution, researchers require a method to compare global and regional impacts. Unfortunately, we are not aware of a method that allows these to be compared "apples-to-apples". In this research we propose a three-step model to compare possible city-wide actions to reduce greenhouse gases and conventional air pollutants. We focus on Pittsburgh, PA, a city with consistently poor air quality that is interested in reducing both greenhouse gases and conventional air pollutants. First, we use the 2013 Pittsburgh Greenhouse Gas Inventory to update the Blackhurst et al. model and estimate the greenhouse gas abatement potentials and implementation costs of proposed greenhouse gas reduction efforts. Second, we use field tests for PM2.5, NOx, SOx, organic carbon (OC) and elemental carbon (EC) to inform a land-use regression model for local air pollution at a 100m x 100m spatial resolution, which, combined with a social cost of air pollution model (EASIUR), allows us to calculate economic social damages. Third, we combine these two models into a three-dimensional greenhouse gas cost abatement curve to understand the implementation costs and social benefits, in terms of air quality improvement and greenhouse gas abatement, of each potential intervention. We anticipate such results could provide policy-makers with insights for green city development.

  8. Quality-productivity decision making when turning of Inconel 718 aerospace alloy: A response surface methodology approach

    Directory of Open Access Journals (Sweden)

    Hamid Tebassi

    2017-06-01

    Full Text Available Inconel 718 is among the most difficult-to-machine materials because of its abrasiveness and high strength, even at high temperature. This alloy is mainly used in the aircraft and aerospace industries. Therefore, it is very important to reveal and evaluate cutting tool behavior during machining of this kind of alloy. The experimental study presented in this research work was carried out to elucidate surface roughness and productivity mathematical models for the turning of Inconel 718 superalloy (35 HRC) with a SiC-whisker ceramic tool at various cutting parameters (depth of cut, feed rate, cutting speed and nose radius). A small central composite design (SCCD) comprising 16 basic runs replicated three times (48 runs) was adopted and graphically evaluated using a fraction of design space (FDS) graph, complemented by a statistical analysis of variance (ANOVA). Mathematical models for surface roughness and productivity were developed and normality was improved using the Box-Cox transformation. Results show that the surface roughness criterion Ra was mainly influenced by cutting speed, nose radius and feed rate, and that the depth of cut had the major effect on productivity. Finally, ranges of optimized cutting conditions were proposed for serial industrial production. The industrial benefit was illustrated in terms of high surface quality accompanied by high productivity: the use of the optimal cutting conditions yielded an improvement of 46.9% in surface quality Ra and of 160.54% in productivity MRR.
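    The second-order response-surface fit underlying this kind of study can be sketched generically. The following is a minimal illustration, not the authors' model: the factor ranges, coefficients and data are synthetic assumptions, and only two factors (cutting speed and feed rate) are included for brevity.

```python
import numpy as np

# Hypothetical illustration: fit a full quadratic response surface
#   Ra = b0 + b1*v + b2*f + b11*v^2 + b22*f^2 + b12*v*f
# to synthetic (cutting speed v, feed rate f) -> roughness data by least squares.
rng = np.random.default_rng(0)
v = rng.uniform(40, 80, 30)      # cutting speed (m/min), assumed range
f = rng.uniform(0.08, 0.2, 30)   # feed rate (mm/rev), assumed range
ra = (0.5 + 0.01 * v + 8.0 * f + 1e-4 * v**2 + 5.0 * f**2 - 0.02 * v * f
      + rng.normal(0, 0.05, 30))  # synthetic response with noise

# Design matrix for the full quadratic model
X = np.column_stack([np.ones_like(v), v, f, v**2, f**2, v * f])
coef, *_ = np.linalg.lstsq(X, ra, rcond=None)

# Coefficient of determination of the fitted surface
pred = X @ coef
r2 = 1 - np.sum((ra - pred) ** 2) / np.sum((ra - ra.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```

    An ANOVA on such a fit would then rank the factor effects, as the abstract reports for cutting speed, nose radius and feed rate.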

  9. A new time-series methodology for estimating relationships between elderly frailty, remaining life expectancy, and ambient air quality.

    Science.gov (United States)

    Murray, Christian J; Lipfert, Frederick W

    2012-01-01

    Many publications estimate short-term air pollution-mortality risks, but few estimate the associated changes in life expectancy. We present a new methodology for analyzing time series of health effects, in which prior frailty is assumed to precede short-term elderly nontraumatic mortality. The model is based on a subpopulation of frail individuals whose entries and exits (deaths) are functions of daily and lagged environmental conditions: ambient temperature/season, airborne particles, and ozone. This frail susceptible population is unknown; its fluctuations cannot be observed but are estimated using maximum-likelihood methods with the Kalman filter. We used an existing 14-y set of daily data to illustrate the model and then tested the assumption of prior frailty with a new generalized model that estimates the portion of the daily death count allocated to nonfrail individuals. In this demonstration dataset, new entries into the high-risk pool are associated with lower ambient temperatures and higher concentrations of particulate matter and ozone. Accounting for these effects on antecedent frailty reduces this at-risk population, yielding frail life expectancies of 5-7 days. Associations between environmental factors and entries to the at-risk pool are about twice as strong as for mortality. Nonfrail elderly deaths are seen to make only small contributions. This new model predicts a small short-lived frail population-at-risk that is stable over a wide range of environmental conditions. The predicted effects of pollution on new entries and deaths are robust and consistent with conventional morbidity/mortality time-series studies. We recommend model verification using other suitable datasets.
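    The abstract's key device is estimating an unobservable state (the frail population) from noisy daily counts via the Kalman filter. A generic one-dimensional local-level Kalman filter of the same flavor can be sketched as follows; the dynamics, noise variances and data here are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Hedged sketch: local-level state-space model filtered with a scalar
# Kalman filter, standing in for an unobserved population that must be
# inferred from noisy daily observations.
rng = np.random.default_rng(1)
T = 200
true_state = np.cumsum(rng.normal(0, 1.0, T))  # latent random walk
obs = true_state + rng.normal(0, 2.0, T)       # noisy daily observations

q, r = 1.0, 4.0   # process and observation noise variances (assumed known)
x, p = 0.0, 10.0  # initial state estimate and its variance
estimates = []
for y in obs:
    p += q                # predict: variance grows by process noise
    k = p / (p + r)       # Kalman gain
    x += k * (y - x)      # update with the innovation
    p *= (1 - k)          # posterior variance shrinks after the update
    estimates.append(x)
estimates = np.asarray(estimates)

# The filtered series should track the latent state better than raw data
err_filt = np.mean((estimates - true_state) ** 2)
err_raw = np.mean((obs - true_state) ** 2)
print(err_filt < err_raw)
```

    In the paper's setting the noise variances would themselves be fitted by maximum likelihood rather than assumed, with environmental covariates driving the state equation.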

  10. A systematic and transparent approach for assessing the methodological quality of intervention effectiveness research: the Study Design and Implementation Assessment Device (Study DIAD).

    Science.gov (United States)

    Valentine, Jeffrey C; Cooper, Harris

    2008-06-01

    Assessments of studies meant to evaluate the effectiveness of interventions, programs, and policies can serve an important role in the interpretation of research results. However, evidence suggests that available quality assessment tools have poor measurement characteristics and can lead to opposing conclusions when applied to the same body of studies. These tools tend to (a) be insufficiently operational, (b) rely on arbitrary post-hoc decision rules, and (c) result in a single number to represent a multidimensional construct. In response to these limitations, a multilevel and hierarchical instrument was developed in consultation with a wide range of methodological and statistical experts. The instrument focuses on the operational details of studies and results in a profile of scores instead of a single score to represent study quality. A pilot test suggested that satisfactory between-judge agreement can be obtained using well-trained raters working in naturalistic conditions. Limitations of the instrument are discussed, but these are inherent in making decisions about study quality given incomplete reporting and in the absence of strong, contextually based information about the effects of design flaws on study outcomes. (PsycINFO Database Record (c) 2008 APA, all rights reserved).

  11. SU-E-T-43: A Methodology for Quality Control of IMPT Treatment Plan Based On VMAT Plan

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, S [UT MD Anderson Cancer Center (United States); Tianjin Medical University Cancer Institute and Hospital (China); Yang, Y [UT MD Anderson Cancer Center (United States); Tianjin First Center Hospital (China); Liao, L; Wang, X; Li, H; Zhu, X; Zhang, X [UT MD Anderson Cancer Center (United States)

    2015-06-15

    Purpose: IMPT plan design is highly dependent on the planner's experience. VMAT plan design is relatively mature and can even be automated. The quality of an IMPT plan designed by an inexperienced planner could be inferior to that of a VMAT plan designed by an experienced planner or by automatic planning software. Here we introduce a method for designing IMPT plans based on VMAT plans, to ensure that the IMPT plan is superior to the IMRT/VMAT plan in the majority of clinical scenarios. Methods: To design a new IMPT plan, a VMAT plan is first generated, either by an experienced planner or by an in-house developed automatic planning system. An in-house developed tool is used to generate the dose-volume constraints for the IMPT plan as a plan template for the Eclipse TPS. The beam angles for the IMPT plan are selected based on the preferred angles in the VMAT plan. The IMPT plan is then designed by importing the plan objectives generated from the VMAT plan. The majority of thoracic IMPT plans in our center are designed using this approach. In this work, a thoracic IMPT plan under the RTOG 1308 protocol is selected to demonstrate the effectiveness and efficiency of this approach. The dosimetric indices of the IMPT plan are compared with those of the VMAT plan. Results: The PTV D95, lung V20, MLD, mean heart dose, esophagus D1 and cord D1 are 70Gy, 31%, 17.8Gy, 25.5Gy, 73Gy and 45Gy for the IMPT plan, and 65.3Gy, 34%, 21.6Gy, 35Gy, 74Gy and 48Gy for the VMAT plan. For the majority of cases, the high-dose region of the normal tissue in proximity to the PTV is comparable between the IMPT and VMAT plans. The low-dose region of the IMPT plan is significantly better than that of the VMAT plan. Conclusion: Using the knowledge gained in VMAT plan design can help efficiently and effectively design high-quality IMPT plans. The quality of the IMPT plan can be controlled to ensure its superiority compared to VMAT/IMRT plans.

  12. SU-E-T-43: A Methodology for Quality Control of IMPT Treatment Plan Based On VMAT Plan

    International Nuclear Information System (INIS)

    Jiang, S; Yang, Y; Liao, L; Wang, X; Li, H; Zhu, X; Zhang, X

    2015-01-01

    Purpose: IMPT plan design is highly dependent on the planner's experience. VMAT plan design is relatively mature and can even be automated. The quality of an IMPT plan designed by an inexperienced planner could be inferior to that of a VMAT plan designed by an experienced planner or by automatic planning software. Here we introduce a method for designing IMPT plans based on VMAT plans, to ensure that the IMPT plan is superior to the IMRT/VMAT plan in the majority of clinical scenarios. Methods: To design a new IMPT plan, a VMAT plan is first generated, either by an experienced planner or by an in-house developed automatic planning system. An in-house developed tool is used to generate the dose-volume constraints for the IMPT plan as a plan template for the Eclipse TPS. The beam angles for the IMPT plan are selected based on the preferred angles in the VMAT plan. The IMPT plan is then designed by importing the plan objectives generated from the VMAT plan. The majority of thoracic IMPT plans in our center are designed using this approach. In this work, a thoracic IMPT plan under the RTOG 1308 protocol is selected to demonstrate the effectiveness and efficiency of this approach. The dosimetric indices of the IMPT plan are compared with those of the VMAT plan. Results: The PTV D95, lung V20, MLD, mean heart dose, esophagus D1 and cord D1 are 70Gy, 31%, 17.8Gy, 25.5Gy, 73Gy and 45Gy for the IMPT plan, and 65.3Gy, 34%, 21.6Gy, 35Gy, 74Gy and 48Gy for the VMAT plan. For the majority of cases, the high-dose region of the normal tissue in proximity to the PTV is comparable between the IMPT and VMAT plans. The low-dose region of the IMPT plan is significantly better than that of the VMAT plan. Conclusion: Using the knowledge gained in VMAT plan design can help efficiently and effectively design high-quality IMPT plans. The quality of the IMPT plan can be controlled to ensure its superiority compared to VMAT/IMRT plans.

  13. Consequences of using different soil texture determination methodologies for soil physical quality and unsaturated zone time lag estimates.

    Science.gov (United States)

    Fenton, O; Vero, S; Ibrahim, T G; Murphy, P N C; Sherriff, S C; Ó hUallacháin, D

    2015-11-01

    Elucidation of when the loss of pollutants, below the rooting zone in agricultural landscapes, affects water quality is important when assessing the efficacy of mitigation measures. Investigation of this inherent time lag (t(T)) is divided into unsaturated (t(u)) and saturated (t(s)) components. The duration of these components relative to each other differs depending on soil characteristics and the landscape position. The present field study focuses on tu estimation in a scenario where the saturated zone is likely to constitute a higher proportion of t(T). In such instances, or where only initial breakthrough (IBT) or centre of mass (COM) is of interest, utilisation of site and depth specific "simple" textural class or actual sand-silt-clay percentages to generate soil water characteristic curves with associated soil hydraulic parameters is acceptable. With the same data it is also possible to estimate a soil physical quality (S) parameter for each soil layer which can be used to infer many other physical, chemical and biological quality indicators. In this study, hand texturing in the field was used to determine textural classes of a soil profile. Laboratory methods, including hydrometer, pipette and laser diffraction methods were used to determine actual sand-silt-clay percentages of sections of the same soil profile. Results showed that in terms of S, hand texturing resulted in a lower index value (inferring a degraded soil) than that of pipette, hydrometer and laser equivalents. There was no difference between S index values determined using the pipette, hydrometer and laser diffraction methods. The difference between the three laboratory methods on both the IBT and COM stages of t(u) were negligible, and in this instance were unlikely to affect either groundwater monitoring decisions, or to be of consequence from a policy perspective. 
When t(u) estimates are made over the full depth of the vadose zone, which may extend to several metres, errors resulting from
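    The soil physical quality parameter S used in this record is commonly computed as the slope of the van Genuchten water-retention curve at its inflection point (Dexter's S index). A minimal sketch under that assumption, with illustrative parameter values that are not taken from the study:

```python
# Hedged sketch of Dexter's soil physical quality index S, computed from
# van Genuchten water-retention parameters (theta_sat, theta_res, n).
# Parameter values below are illustrative, not the study's data.

def dexter_s(theta_sat: float, theta_res: float, n: float) -> float:
    """Slope of the retention curve at its inflection point (Dexter, 2004)."""
    m = 1.0 - 1.0 / n
    return -n * (theta_sat - theta_res) * (1.0 + 1.0 / m) ** (-(1.0 + m))

# Example: a loam-like soil (assumed van Genuchten parameters)
s = dexter_s(theta_sat=0.43, theta_res=0.078, n=1.56)
print(f"S = {s:.4f}")
# |S| >= 0.035 is commonly read as good physical quality, so a lower index
# from one texture method than another (as hand texturing gave here) can
# flip the inferred quality class of the same soil.
print("good physical quality" if abs(s) >= 0.035 else "degraded")
```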

  14. Consequences of using different soil texture determination methodologies for soil physical quality and unsaturated zone time lag estimates

    Science.gov (United States)

    Fenton, O.; Vero, S.; Ibrahim, T. G.; Murphy, P. N. C.; Sherriff, S. C.; Ó hUallacháin, D.

    2015-11-01

    Elucidation of when the loss of pollutants, below the rooting zone in agricultural landscapes, affects water quality is important when assessing the efficacy of mitigation measures. Investigation of this inherent time lag (tT) is divided into unsaturated (tu) and saturated (ts) components. The duration of these components relative to each other differs depending on soil characteristics and the landscape position. The present field study focuses on tu estimation in a scenario where the saturated zone is likely to constitute a higher proportion of tT. In such instances, or where only initial breakthrough (IBT) or centre of mass (COM) is of interest, utilisation of site and depth specific "simple" textural class or actual sand-silt-clay percentages to generate soil water characteristic curves with associated soil hydraulic parameters is acceptable. With the same data it is also possible to estimate a soil physical quality (S) parameter for each soil layer which can be used to infer many other physical, chemical and biological quality indicators. In this study, hand texturing in the field was used to determine textural classes of a soil profile. Laboratory methods, including hydrometer, pipette and laser diffraction methods were used to determine actual sand-silt-clay percentages of sections of the same soil profile. Results showed that in terms of S, hand texturing resulted in a lower index value (inferring a degraded soil) than that of pipette, hydrometer and laser equivalents. There was no difference between S index values determined using the pipette, hydrometer and laser diffraction methods. The difference between the three laboratory methods on both the IBT and COM stages of tu were negligible, and in this instance were unlikely to affect either groundwater monitoring decisions, or to be of consequence from a policy perspective. 
When tu estimates are made over the full depth of the vadose zone, which may extend to several metres, errors resulting from the use of

  15. Study of methodologies for quality control of 99Mo used in 99Mo/99mTc generators

    International Nuclear Information System (INIS)

    Said, Daphne de Souza

    2016-01-01

    99mTc is the most used radionuclide in nuclear medicine. In Brazil, the 99Mo/99mTc generators are exclusively produced by the Radiopharmacy Center at IPEN-CNEN/SP, by importing 99Mo from different suppliers. 99Mo (t1/2 = 66 h) is a fission product of 235U and can contain radionuclidic impurities that are harmful to human health. For safe use of the generators, it is necessary to evaluate the 99Mo by quality control tests in order to assess whether it complies with the specifications. The European Pharmacopoeia (EP) presents a monograph for evaluation of the quality of the [99Mo] solution as sodium molybdate, which is used as raw material for 99Mo/99mTc generator production, including specification parameters (identification, radiochemical purity and radionuclidic purity), analysis methods and limits. However, difficulties have been observed in the execution and implementation of these methods by the generator producers, with little literature on this subject, probably due to the complexity of the proposed methods. In this work, several quality control parameters of 99Mo described in the EP monograph were evaluated. Separation methods for 99Mo from its radionuclidic impurities by solid phase extraction (SPE) and TLC were studied. After SPE separation, the quantification of metals by ICP-OES to evaluate the percentage of retention of Mo and the percentage of recovery of Ru, Te and Sr using different types of cartridges was proposed, replacing the use of radiotracers. It was observed that the specific type of SPE cartridge recommended by the EP for separation of 99Mo presented low recoveries for Ru compared to other available anion exchange SPE cartridges. 99Mo samples from different worldwide suppliers were analyzed. It was observed that quantification of 103Ru in 99Mo samples with a decay time higher than 4 weeks is possible. An alternative method for separation of 131I from 99Mo showed promising results by TLC. The quantification of beta and

  16. Sonoelasticity to monitor mechanical changes during rigor and ageing.

    Science.gov (United States)

    Ayadi, A; Culioli, J; Abouelkaram, S

    2007-06-01

    We propose the use of sonoelasticity as a non-destructive method to monitor changes in the resistance of muscle fibres, unaffected by connective tissue. Vibrations were applied at low frequency to induce oscillations in soft tissues and an ultrasound transducer was used to detect the motions. The experiments were carried out on the M. biceps femoris muscles of three beef cattle. In addition to the sonoelasticity measurements, the changes in meat during rigor and ageing were followed by measurements of both the mechanical resistance of myofibres and pH. The variations of mechanical resistance and pH were compared to those of the sonoelastic variables (velocity and attenuation) at two frequencies. The relationships between pH and velocity or attenuation and between the velocity or attenuation and the stress at 20% deformation were highly correlated. We concluded that sonoelasticity is a non-destructive method that can be used to monitor mechanical changes in muscle fibers during rigor-mortis and ageing.

  17. Rigorous quantum limits on monitoring free masses and harmonic oscillators

    Science.gov (United States)

    Roy, S. M.

    2018-03-01

    There are heuristic arguments proposing that the accuracy of monitoring the position of a free mass m is limited by the standard quantum limit (SQL): σ²(X(t)) ≥ σ²(X(0)) + (t²/m²)σ²(P(0)) ≥ ℏt/m, where σ²(X(t)) and σ²(P(t)) denote variances of the Heisenberg representation position and momentum operators. Yuen [Phys. Rev. Lett. 51, 719 (1983), 10.1103/PhysRevLett.51.719] discovered that there are contractive states for which this result is incorrect. Here I prove universally valid rigorous quantum limits (RQL), viz. rigorous upper and lower bounds on σ²(X(t)) in terms of σ²(X(0)) and σ²(P(0)), given by Eq. (12) for a free mass and by Eq. (36) for an oscillator. I also obtain the maximally contractive and maximally expanding states which saturate the RQL, and use the contractive states to set up an Ozawa-type measurement theory with accuracies respecting the RQL but beating the standard quantum limit. The contractive states for oscillators improve on the Schrödinger coherent states of constant variance and may be useful for gravitational wave detection and optical communication.
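    The SQL inequality quoted above can be unpacked with the free-mass Heisenberg evolution; the following derivation sketch is textbook quantum mechanics, not reproduced from the paper.

```latex
% Heisenberg evolution of a free mass:
X(t) = X(0) + \frac{t}{m}\,P(0)
\;\Longrightarrow\;
\sigma^2\big(X(t)\big) = \sigma^2\big(X(0)\big)
  + \frac{t^2}{m^2}\,\sigma^2\big(P(0)\big)
  + \frac{t}{m}\,\big\langle \Delta X(0)\,\Delta P(0) + \Delta P(0)\,\Delta X(0) \big\rangle .
% The heuristic SQL assumes the symmetrized cross term is non-negative; then
% a^2 + b^2 \ge 2ab together with \sigma(X(0))\,\sigma(P(0)) \ge \hbar/2 give
\sigma^2\big(X(0)\big) + \frac{t^2}{m^2}\,\sigma^2\big(P(0)\big)
  \ge \frac{2t}{m}\,\sigma\big(X(0)\big)\,\sigma\big(P(0)\big)
  \ge \frac{\hbar t}{m}.
% Yuen's contractive states make the cross term negative, which is why the
% heuristic bound can be beaten.
```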

  18. A rigorous test for a new conceptual model for collisions

    International Nuclear Information System (INIS)

    Peixoto, E.M.A.; Mu-Tao, L.

    1979-01-01

    A rigorous theoretical foundation for the previously proposed model is formulated and applied to electron scattering by H2 in the gas phase. A rigorous treatment of the interaction potential between the incident electron and the hydrogen molecule is carried out to calculate differential cross sections for 1 keV electrons, using Glauber's approximation and Wang's molecular wave function for the ground electronic state of H2. Moreover, it is shown for the first time that, when adequately done, the omission of two-center terms does not adversely influence the results of molecular calculations. It is shown that the new model is far superior to the Independent Atom Model (or Independent Particle Model). The accuracy and simplicity of the new model suggest that it may be fruitfully applied to the description of other collision phenomena (e.g., in molecular beam experiments and nuclear physics). A new technique is presented for calculations involving two-center integrals within the framework of the Glauber approximation for scattering. (Author) [pt

  19. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Science.gov (United States)

    Gallego, Sergi; Neipp, Cristian; Estepa, Luis A.; Ortuño, Manuel; Márquez, Andrés; Francés, Jorge; Pascual, Inmaculada; Beléndez, Augusto

    2012-01-01

    There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information throughout its volume. In this work we analyze the applicability of Kogelnik's Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik's theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik's theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik's and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We demonstrate the agreement between the predictions of CW and RCW, and the validity of Kogelnik's theory only for gratings with spatial frequencies higher than 500 lines/mm, for the usual values of the refractive index modulations obtained in photopolymers.

  20. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Directory of Open Access Journals (Sweden)

    Augusto Beléndez

    2012-08-01

    Full Text Available There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik’s Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik’s theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik’s theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik’s and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement in the predictions of CW and RCW and the validity of Kogelnik’s theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.

  1. Research methodological issues in evaluating herbal interventions

    Directory of Open Access Journals (Sweden)

    Dipika Bansal

    2010-02-01

    Full Text Available Dipika Bansal, Debasish Hota, Amitava Chakrabarti, Postgraduate Institute of Medical Education and Research, Chandigarh, IndiaAbstract: Randomized controlled trials provide the best evidence and are seen as the gold standard for allopathic research. Herbal therapies are not an integral part of conventional care, although they are still used by patients in their health care management. These medicines need to be subjected to rigorous research to establish their effectiveness and safety. Clearly defined treatments are required and should be recorded in a manner that enables other suitably trained researchers to reproduce them reliably. Quality control of herbal products is also a prerequisite of credible clinical trials. Methodological strategies for investigating herbal interventions are discussed, including the issues of appropriate patient selection, randomization and blinding, placebo effects and choice of comparator, standardization, and the selection of appropriate study endpoints to prove efficacy. This paper will review research options and propose some suggestions for future research design.Keywords: CAM research, herbal therapies, methodology, clinical trial

  2. “The 3/3 Strategy”: A Successful Multifaceted Hospital Wide Hand Hygiene Intervention Based on WHO and Continuous Quality Improvement Methodology

    Science.gov (United States)

    Mestre, Gabriel; Berbel, Cristina; Tortajada, Purificación; Alarcia, Margarita; Coca, Roser; Gallemi, Gema; Garcia, Irene; Fernández, Mari Mar; Aguilar, Mari Carmen; Martínez, José Antonio; Rodríguez-Baño, Jesús

    2012-01-01

    Background Only multifaceted hospital wide interventions have been successful in achieving sustained improvements in hand hygiene (HH) compliance. Methodology/Principal Findings Pre-post intervention study of HH performance at baseline (October 2007–December 2009) and during intervention, which included two phases. Phase 1 (2010) included the multimodal WHO approach. Phase 2 (2011) added Continuous Quality Improvement (CQI) tools and was based on: a) increase of alcohol hand rub (AHR) solution placement (from 0.57 dispensers/bed to 1.56); b) increase in frequency of audits (three days every three weeks: “3/3 strategy”); c) implementation of a standardized register form of HH corrective actions; d) Statistical Process Control (SPC) as time series analysis methodology through appropriate control charts. During the intervention period we performed 819 scheduled direct observation audits which provided data from 11,714 HH opportunities. The most remarkable findings were: a) significant improvements in HH compliance with respect to baseline (25% mean increase); b) sustained high level (82%) of HH compliance during intervention; c) significant increase in AHR consumption over time; d) significant decrease in the rate of healthcare-acquired MRSA; e) small but significant improvements in HH compliance when comparing phase 2 to phase 1 [79.5% (95% CI: 78.2–80.7) vs 84.6% (95% CI: 83.8–85.4)]; and f) detection by SPC of special causes in HH performance (“positive”: compliance peaks coinciding with a “hand hygiene day”; “negative”: 73.7% as lowest HH compliance, coinciding with a statutory lay-off proceeding). Conclusions/Significance CQI tools may be a key addition to WHO strategy to maintain a good HH performance over time. In addition, SPC has shown to be a powerful methodology to detect special causes in HH performance (positive and negative) and to help establishing adequate feedback to healthcare workers. PMID:23110061
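A common SPC tool for compliance proportions like these is the p-chart, whose centre line and 3-sigma control limits flag special causes such as the positive and negative ones described above. A minimal sketch with hypothetical audit data (the function name and numbers are illustrative, not from the study):

```python
import math

def p_chart_limits(compliant, observed):
    """Centre line and per-audit 3-sigma control limits of a p-chart
    for compliance proportions (binomial approximation)."""
    p_bar = sum(compliant) / sum(observed)  # overall compliance rate
    limits = []
    for n in observed:
        sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * sigma),
                       min(1.0, p_bar + 3 * sigma)))
    return p_bar, limits

# Hypothetical data: compliant actions / observed opportunities per audit
compliant = [80, 85, 78, 90]
observed = [100, 100, 100, 100]
p_bar, limits = p_chart_limits(compliant, observed)
```

An audit whose observed proportion falls outside its (lower, upper) band would then be investigated as a special cause rather than common-cause variation.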

  3. Experimental evaluation of rigor mortis IX. The influence of the breaking (mechanical solution) on the development of rigor mortis.

    Science.gov (United States)

    Krompecher, Thomas; Gilles, André; Brandt-Casadevall, Conception; Mangin, Patrice

    2008-04-07

    Objective measurements were carried out to study the possible re-establishment of rigor mortis on rats after "breaking" (mechanical solution). Our experiments showed that: *Cadaveric rigidity can re-establish after breaking. *A significant rigidity can reappear if the breaking occurs before the process is complete. *Rigidity will be considerably weaker after the breaking. *The time course of the intensity does not change in comparison to the controls: --the re-establishment begins immediately after the breaking; --maximal values are reached at the same time as in the controls; --the course of the resolution is the same as in the controls.

  4. Testing the validity of a translated pharmaceutical therapy-related quality of life instrument, using qualitative 'think aloud' methodology.

    Science.gov (United States)

    Renberg, T; Kettis Lindblad, A; Tully, M P

    2008-06-01

    In pharmacy practice, there is a need for valid and reliable instruments to study patient-reported outcomes. One potential candidate is a pharmaceutical therapy-related quality of life (PTRQoL) instrument. This study explored the face and content validity, including cognitive aspects of question answering, of a PTRQoL instrument translated from English to Swedish. A sample of 16 customers at Swedish community pharmacies was asked to fill in the PTRQoL instrument while constantly reporting how they reasoned. The resulting interviews and concurrent probing were audio-taped, transcribed verbatim and analysed using the constant comparison method. The relation between the measurement and its theoretical underpinning was challenged. Respondents neglected to read the instructions, used response options in an unpredictable way, and varied in their interpretations of the items. The combination of 'think-aloud', retrospective probing and qualitative analysis provided information on the validity of the PTRQoL instrument and was valuable in questionnaire development. The study also identified specific problems that could be relevant for other instruments probing patients' medicines-related attitudes and behaviour.

  5. Update on the Methodological Quality of Research Published in The American Journal of Sports Medicine: Comparing 2011-2013 to 10 and 20 Years Prior.

    Science.gov (United States)

    Brophy, Robert H; Kluck, Dylan; Marx, Robert G

    2016-05-01

    In recent years, the number of articles in The American Journal of Sports Medicine (AJSM) has risen dramatically, with an increasing emphasis on evidence-based medicine in orthopaedics and sports medicine. Despite the increase in the number of articles published in AJSM over the past decade, the methodological quality of articles in 2011-2013 has improved relative to those in 2001-2003 and 1991-1993. Meta-analysis. All articles published in AJSM during 2011-2013 were reviewed and classified by study design. For each article, the use of pertinent methodologies, such as prospective data collection, randomization, control groups, and blinding, was recorded. The frequency of each article type and the use of evidence-based techniques were compared relative to 1991-1993 and 2001-2003 by use of Pearson χ² testing. The number of research articles published in AJSM more than doubled from 402 in 1991-1993 and 423 in 2001-2003 to 953 in 2011-2013. Case reports decreased from 15.2% to 10.6% to 2.1% of articles published over the study period (P < .001). Cadaveric/human studies and meta-analysis/literature review studies increased from 5.7% to 7.1% to 12.4% (P < .001) and from 0.2% to 0.9% to 2.3% (P = .01), respectively. Randomized, prospective clinical trials increased from 2.7% to 5.9% to 7.4% (P = .007). Fewer studies used retrospective compared with prospective data collection (P < .001). More studies tested an explicit hypothesis (P < .001) and used controls (P < .001), randomization (P < .001), and blinding of those assessing outcomes (P < .001). Multi-investigator trials increased (P < .001), as did the proportion of articles citing a funding source (P < .001). Despite a dramatic increase in the number of published articles, the research published in AJSM shifted toward more prospective, randomized, controlled, and blinded designs during 2011-2013 compared with 2001-2003 and 1991-1993, demonstrating a continued improvement in methodological quality. © 2015 The
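The Pearson χ² comparison used here can be sketched directly. The counts below are approximate reconstructions from the percentages quoted in the abstract (case reports: 15.2% of 402, 10.6% of 423, 2.1% of 953); the helper function itself is generic:

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table,
    e.g. case reports vs other designs across three publication periods."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = rows[i] * cols[j] / total
            stat += (obs - expected) ** 2 / expected
    return stat

# Approximate counts reconstructed from the abstract's percentages:
# [case reports, other article types] per period
table = [[61, 341],   # 1991-1993: ~15.2% of 402
         [45, 378],   # 2001-2003: ~10.6% of 423
         [20, 933]]   # 2011-2013: ~2.1% of 953
stat = chi_square_stat(table)
```

A statistic this large against a χ² distribution with 2 degrees of freedom corresponds to the P < .001 reported for the decline in case reports.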

  6. Atlantic salmon skin and fillet color changes effected by perimortem handling stress, rigor mortis, and ice storage.

    Science.gov (United States)

    Erikson, U; Misimi, E

    2008-03-01

    The changes in skin and fillet color of anesthetized and exhausted Atlantic salmon were determined immediately after killing, during rigor mortis, and after ice storage for 7 d. Skin color (CIE L*, a*, b*, and related values) was determined by a Minolta Chroma Meter. Roche SalmoFan Lineal and Roche Color Card values were determined by a computer vision method and a sensory panel. Before color assessment, the stress levels of the 2 fish groups were characterized in terms of white muscle parameters (pH, rigor mortis, and core temperature). The results showed that perimortem handling stress initially significantly affected several color parameters of skin and fillets. Significant transient fillet color changes also occurred in the prerigor phase and during the development of rigor mortis. Our results suggested that fillet color was affected by postmortem glycolysis (pH drop, particularly in anesthetized fillets), then by onset and development of rigor mortis. The color change patterns during storage were different for the 2 groups of fish. The computer vision method was considered suitable for automated (online) quality control and grading of salmonid fillets according to color.

  7. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Reframing Rigor: A Modern Look at Challenge and Support in Higher Education

    Science.gov (United States)

    Campbell, Corbin M.; Dortch, Deniece; Burt, Brian A.

    2018-01-01

    This chapter describes the limitations of the traditional notions of academic rigor in higher education, and brings forth a new form of rigor that has the potential to support student success and equity.

  9. Rigorous force field optimization principles based on statistical distance minimization

    Energy Technology Data Exchange (ETDEWEB)

    Vlcek, Lukas, E-mail: vlcekl1@ornl.gov [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States); Joint Institute for Computational Sciences, University of Tennessee, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6173 (United States); Chialvo, Ariel A. [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States)

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
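One standard realization of statistical distance between discrete probability distributions is the Bhattacharyya angle, s = arccos(Σᵢ √(pᵢ qᵢ)), which vanishes exactly when model and target coincide and grows with their distinguishability. A minimal sketch assuming this form (the paper's precise definition may differ):

```python
import math

def statistical_distance(p, q):
    """Statistical distance (Bhattacharyya angle) between two discrete
    probability distributions: s = arccos( sum_i sqrt(p_i * q_i) )."""
    overlap = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    return math.acos(min(1.0, overlap))  # clamp guards rounding above 1

# Hypothetical model vs target histograms over three states
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
d = statistical_distance(p, q)
```

A force-field optimizer in this spirit would tune model parameters so that d, evaluated on histograms of measurable properties, is minimized against the target's.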

  10. Fast and Rigorous Assignment Algorithm Multiple Preference and Calculation

    Directory of Open Access Journals (Sweden)

    Ümit Çiftçi

    2010-03-01

    Full Text Available The goal of this paper is to develop an algorithm that evaluates students and then places them according to their ranked preferences. The developed algorithm is also used to implement software. The success and accuracy of the software, as well as of the algorithm, are tested by applying it to the ability test at Beykent University. This ability test is repeated several times each academic year in order to fill all available places in the Fine Arts Faculty departments. The algorithm has been shown to be very fast and rigorous in its application over the 2008-2009 and 2009-2010 academic years. Key words: assignment algorithm, student placement, ability test
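The abstract does not spell out the algorithm, but one plausible scheme consistent with the description is score-ordered greedy assignment over ranked choices: process students in descending test-score order and give each the highest-ranked department with a free place. A hypothetical sketch (names and capacities invented, not the paper's exact algorithm):

```python
def assign(students, capacity):
    """Greedy score-order assignment over ranked preferences.
    students: list of (name, score, [ranked department choices])
    capacity: dict mapping department -> number of places."""
    places = dict(capacity)
    result = {}
    for name, score, choices in sorted(students, key=lambda s: -s[1]):
        for dept in choices:
            if places.get(dept, 0) > 0:
                places[dept] -= 1     # take one place in this department
                result[name] = dept
                break
        else:
            result[name] = None       # every listed choice is already full
    return result

students = [("A", 92, ["Painting", "Sculpture"]),
            ("B", 88, ["Painting", "Sculpture"]),
            ("C", 75, ["Painting", "Sculpture"])]
assignment = assign(students, {"Painting": 1, "Sculpture": 1})
```

This scheme (serial dictatorship by score) is strategy-proof for the students and runs in O(n log n + nk) time for n students with k choices each.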

  11. Student’s rigorous mathematical thinking based on cognitive style

    Science.gov (United States)

    Fitriyani, H.; Khasanah, U.

    2017-12-01

    The purpose of this research was to determine the rigorous mathematical thinking (RMT) of mathematics education students in solving math problems in terms of reflective and impulsive cognitive styles. The research used a descriptive qualitative approach. Subjects were 4 students of the reflective and impulsive cognitive styles, each style represented by a male and a female subject. Data collection techniques were a problem-solving test and interviews. Analysis of the research data used the Miles and Huberman model: data reduction, data presentation, and drawing conclusions. The results showed that the impulsive male subject used all three levels of the cognitive functions required for RMT, namely qualitative thinking, quantitative thinking with precision, and relational thinking, while the other three subjects were only able to use cognitive functions at the qualitative thinking level of RMT. Therefore, the impulsive male subject has a better RMT ability than the other three research subjects.

  12. Rigorous Quantum Field Theory A Festschrift for Jacques Bros

    CERN Document Server

    Monvel, Anne Boutet; Iagolnitzer, Daniel; Moschella, Ugo

    2007-01-01

    Jacques Bros has greatly advanced our present understanding of rigorous quantum field theory through numerous fundamental contributions. This book arose from an international symposium held in honour of Jacques Bros on the occasion of his 70th birthday, at the Department of Theoretical Physics of the CEA in Saclay, France. The impact of the work of Jacques Bros is evident in several articles in this book. Quantum fields are regarded as genuine mathematical objects, whose various properties and relevant physical interpretations must be studied in a well-defined mathematical framework. The key topics in this volume include analytic structures of Quantum Field Theory (QFT), renormalization group methods, gauge QFT, stability properties and extension of the axiomatic framework, QFT on models of curved spacetimes, QFT on noncommutative Minkowski spacetime. Contributors: D. Bahns, M. Bertola, R. Brunetti, D. Buchholz, A. Connes, F. Corbetta, S. Doplicher, M. Dubois-Violette, M. Dütsch, H. Epstein, C.J. Fewster, K....

  13. Desarrollo constitucional, legal y jurisprudencia del principio de rigor subsidiario

    Directory of Open Access Journals (Sweden)

    Germán Eduardo Cifuentes Sandoval

    2013-09-01

    Full Text Available In Colombia, state administration of the environment is carried out through the National Environmental System (SINA), made up of state entities that coexist under a mixed scheme of centralization and decentralization. SINA's decentralization expresses itself at the administrative and territorial levels, and the entities operating under this structure are expected to act in a coordinated way in order to reach the objectives set out in the national environmental policy. To achieve coordinated environmental administration among the entities that make up SINA, Colombian environmental legislation includes three basic principles: 1. the principle of regional harmony ("armonía regional"); 2. the principle of normative gradation ("gradación normativa"); 3. the principle of subsidiary rigor ("rigor subsidiario"). These principles belong to article 63 of Law 99 of 1993, and although for the first two it is possible to find equivalents in other norms of the Colombian legal system, this is not the case for subsidiary rigor, because its elements are unique to environmental law and do not seem similar to those of the principle of subsidiarity found in article 288 of the Political Constitution. Subsidiary rigor gives decentralized entities a special power to modify the current environmental regulation in defense of the local ecological patrimony: an administrative power, grounded in the autonomy that comes with decentralization, that allows them to depart from the rule-making of the central level on the condition that the new regulation be more demanding than the one issued centrally.

  14. Rigorous patient-prosthesis matching of Perimount Magna aortic bioprosthesis.

    Science.gov (United States)

    Nakamura, Hiromasa; Yamaguchi, Hiroki; Takagaki, Masami; Kadowaki, Tasuku; Nakao, Tatsuya; Amano, Atsushi

    2015-03-01

    Severe patient-prosthesis mismatch, defined as effective orifice area index ≤0.65 cm²/m², has demonstrated poor long-term survival after aortic valve replacement. Reported rates of severe mismatch involving the Perimount Magna aortic bioprosthesis range from 4% to 20% in patients with a small annulus. Between June 2008 and August 2011, 251 patients (mean age 70.5 ± 10.2 years; mean body surface area 1.55 ± 0.19 m²) underwent aortic valve replacement with a Perimount Magna bioprosthesis, with or without concomitant procedures. We performed our procedure with rigorous patient-prosthesis matching to implant a valve appropriately sized to each patient, and carried out annular enlargement when a 19-mm valve did not fit. The bioprosthetic performance was evaluated by transthoracic echocardiography predischarge and at 1 and 2 years after surgery. Overall hospital mortality was 1.6%. Only 5 (2.0%) patients required annular enlargement. The mean follow-up period was 19.1 ± 10.7 months with a 98.4% completion rate. Predischarge data showed a mean effective orifice area index of 1.21 ± 0.20 cm²/m². Moderate mismatch, defined as effective orifice area index ≤0.85 cm²/m², developed in 4 (1.6%) patients. None developed severe mismatch. Data at 1 and 2 years showed only two cases of moderate mismatch; neither was severe. Rigorous patient-prosthesis matching maximized the performance of the Perimount Magna, and no severe mismatch resulted in this Japanese population of aortic valve replacement patients. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
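The mismatch grades follow from simple arithmetic on the indexed effective orifice area (EOAi = EOA / BSA). A minimal sketch using the thresholds quoted in the abstract (the example EOA value is illustrative, not from the study):

```python
def mismatch_grade(eoa_cm2, bsa_m2):
    """Classify patient-prosthesis mismatch from the indexed effective
    orifice area, EOAi = EOA / BSA, using the abstract's thresholds."""
    eoai = eoa_cm2 / bsa_m2
    if eoai <= 0.65:
        return eoai, "severe"
    if eoai <= 0.85:
        return eoai, "moderate"
    return eoai, "none"

# Illustrative valve EOA against the cohort's mean body surface area
eoai, grade = mismatch_grade(eoa_cm2=1.9, bsa_m2=1.55)
```

Rigorous matching amounts to checking, before implantation, that the chosen valve size keeps this quotient above 0.85 cm²/m² for the patient's measured BSA.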

  15. Assessing the service quality in Software-as-a-Service from the customers’ perspective: a methodological approach and case of use

    Directory of Open Access Journals (Sweden)

    André Luís Policani Freitas

    2017-11-01

    Full Text Available Abstract Despite the advances in the evaluation of the quality of Software-as-a-Service (SaaS), new studies seem necessary, given the existing criticisms of the SERVQUAL scale and the lack of studies concerning the use of SERVPERF for this purpose. This work aims to fill this gap by proposing a methodological approach to assess SaaS service quality by measuring SaaS customers’ satisfaction. Factor analysis is used to summarize the information contained in the original items into a smaller set of new dimensions, and quartile analysis is suggested to determine the most critical items. A case study found that the factors that most influence customer satisfaction are customer service, customer assistance and the reliability of SaaS. Most of the critical items are associated with transparency and accuracy in correcting errors, the company's interest in solving customer problems, the SaaS application's ability to meet business requirements, implemented updates and regularity of service performance.

  16. Control group design: enhancing rigor in research of mind-body therapies for depression.

    Science.gov (United States)

    Kinser, Patricia Anne; Robins, Jo Lynne

    2013-01-01

    Although a growing body of research suggests that mind-body therapies may be appropriate to integrate into the treatment of depression, studies consistently lack methodological sophistication particularly in the area of control groups. In order to better understand the relationship between control group selection and methodological rigor, we provide a brief review of the literature on control group design in yoga and tai chi studies for depression, and we discuss challenges we have faced in the design of control groups for our recent clinical trials of these mind-body complementary therapies for women with depression. To address the multiple challenges of research about mind-body therapies, we suggest that researchers should consider 4 key questions: whether the study design matches the research question; whether the control group addresses performance, expectation, and detection bias; whether the control group is ethical, feasible, and attractive; and whether the control group is designed to adequately control for nonspecific intervention effects. Based on these questions, we provide specific recommendations about control group design with the goal of minimizing bias and maximizing validity in future research.

  17. The Theoretical, Methodological, and Practical Background for Looking at International Students' Learning Styles, Backgrounds, and Quality Perceptions with Regard to ASB's Three English-Language M.Sc. Programs

    DEFF Research Database (Denmark)

    Skaates, Maria Anne

    This paper looks at the methodological background for a questionnaire study of student perceptions with regard to their own learning styles and backgrounds, as well as the quality of education, in Aarhus School of Business' 3 English-language M.Sc. programs (FIB, EU, and BPM). Theories and models about the internationalization of business schools, service quality, and student learning styles are discussed in the paper, as is the statistical methodology for treating the questionnaire responses of the respondents.

  18. Rigorous Multicomponent Reactive Separations Modelling: Complete Consideration of Reaction-Diffusion Phenomena

    International Nuclear Information System (INIS)

    Ahmadi, A.; Meyer, M.; Rouzineau, D.; Prevost, M.; Alix, P.; Laloue, N.

    2010-01-01

    This paper gives the first step of the development of a rigorous multicomponent reactive separation model. Such a model is essential to further the optimization of acid gas removal plants (CO2 capture, gas treating, etc.) in terms of size and energy consumption, since chemical solvents are conventionally used. Firstly, two main modelling approaches are presented: the equilibrium-based and the rate-based approaches. Secondly, an extended rate-based model with a rigorous modelling methodology for diffusion-reaction phenomena is proposed. The film theory and the generalized Maxwell-Stefan equations are used in order to characterize multicomponent interactions. The complete chain of chemical reactions is taken into account. The reactions can be kinetically controlled or at chemical equilibrium, and they are considered for both the liquid film and the liquid bulk. Thirdly, the method of numerical resolution is described. Coupling the generalized Maxwell-Stefan equations with chemical equilibrium equations leads to a highly non-linear Differential-Algebraic Equations system known as DAE index 3. The set of equations is discretized with finite differences, as its integration by the Gear method is complex. The resulting algebraic system is resolved by the Newton-Raphson method. Finally, the present model and the associated methods of numerical resolution are validated for the example of esterification of methanol. This archetype non-electrolytic system permits an interesting analysis of reaction impact on mass transfer, especially near the phase interface. The numerical resolution of the model by the Newton-Raphson method gives good results in terms of calculation time and convergence. The simulations show that the impact of reactions at chemical equilibrium and that of kinetically controlled reactions with high kinetics on mass transfer is relatively similar. Moreover, Fick's law is less adapted for multicomponent mixtures where some abnormalities such as counter
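The Newton-Raphson iteration used on the discretized algebraic system can be illustrated on a toy 2×2 nonlinear system; the sketch below is a generic solver of the same family, not the paper's implementation:

```python
def newton(f, jac, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson for a 2x2 nonlinear system f(x, y) = 0.
    jac returns the Jacobian entries (a, b, c, d) of [[a, b], [c, d]]."""
    x, y = x0
    for _ in range(max_iter):
        f1, f2 = f(x, y)
        if abs(f1) < tol and abs(f2) < tol:
            return x, y
        a, b, c, d = jac(x, y)
        det = a * d - b * c
        # Full Newton step (x, y) -= J^{-1} f via Cramer's rule
        x -= ( d * f1 - b * f2) / det
        y -= (-c * f1 + a * f2) / det
    return x, y

# Toy system: x^2 + y^2 - 4 = 0 and x - y = 0, root at x = y = sqrt(2)
sol = newton(lambda x, y: (x * x + y * y - 4.0, x - y),
             lambda x, y: (2.0 * x, 2.0 * y, 1.0, -1.0),
             (1.0, 2.0))
```

In the actual model the unknown vector gathers compositions and fluxes at every film discretization node, so the Jacobian is large and sparse, but the quadratic convergence behaviour is the same.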

  19. Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.

    Science.gov (United States)

    Lee, Jae Hee; Jung, Koo Young

    2012-07-01

    Instantaneous rigor as muscle stiffening occurring in the moment of death (or cardiac arrest) can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of securing a surgical airway quickly on the occurrence of trismus. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Revisiting the scientific method to improve rigor and reproducibility of immunohistochemistry in reproductive science.

    Science.gov (United States)

    Manuel, Sharrón L; Johnson, Brian W; Frevert, Charles W; Duncan, Francesca E

    2018-04-21

    Immunohistochemistry (IHC) is a robust scientific tool whereby cellular components are visualized within a tissue, and this method has been and continues to be a mainstay for many reproductive biologists. IHC is highly informative if performed and interpreted correctly, but studies have shown that the general use and reporting of appropriate controls in IHC experiments is low. This omission of the scientific method can result in data that lacks rigor and reproducibility. In this editorial, we highlight key concepts in IHC controls and describe an opportunity for our field to partner with the Histochemical Society to adopt their IHC guidelines broadly as researchers, authors, ad hoc reviewers, editorial board members, and editors-in-chief. Such cross-professional society interactions will ensure that we produce the highest quality data as new technologies emerge that still rely upon the foundations of classic histological and immunohistochemical principles.

  1. Qualitative methodology in a psychoanalytic single case study

    DEFF Research Database (Denmark)

    Grünbaum, Liselotte

    features and breaks in psychotherapy investigated. One aim of the study was to contribute to the development of a transparent and systematic methodology for the psychoanalytic case study by application of rigorous qualitative research methodology. To this end, inductive-deductive principles in line...

  2. Memory sparing, fast scattering formalism for rigorous diffraction modeling

    Science.gov (United States)

    Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.

    2017-07-01

    The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.

  3. Rigorous Results for the Distribution of Money on Connected Graphs

    Science.gov (United States)

    Lanchier, Nicolas; Reed, Stephanie

    2018-05-01

    This paper is concerned with general spatially explicit versions of three stochastic models for the dynamics of money that have been introduced and studied numerically by statistical physicists: the uniform reshuffling model, the immediate exchange model and the model with saving propensity. All three models consist of systems of economical agents that consecutively engage in pairwise monetary transactions. Computer simulations performed in the physics literature suggest that, when the number of agents and the average amount of money per agent are large, the limiting distribution of money as time goes to infinity approaches the exponential distribution for the first model, the gamma distribution with shape parameter two for the second model and a distribution similar but not exactly equal to a gamma distribution whose shape parameter depends on the saving propensity for the third model. The main objective of this paper is to give rigorous proofs of these conjectures and also extend these conjectures to generalizations of the first two models and a variant of the third model that include local rather than global interactions, i.e., instead of choosing the two interacting agents uniformly at random from the system, the agents are located on the vertex set of a general connected graph and can only interact with their neighbors.
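
    As a concrete illustration of the local (graph-based) variant studied here, the uniform reshuffling dynamics can be simulated in a few lines; the ring graph, parameter values, and function name below are illustrative choices, not from the paper:

```python
import random

def reshuffle_on_graph(neighbors, money, steps, rng):
    """Uniform reshuffling with local interactions: a uniformly chosen
    agent and a uniformly chosen neighbor pool their money and split
    the pot uniformly at random. Total money is conserved."""
    agents = list(money)
    for _ in range(steps):
        i = rng.choice(agents)
        j = rng.choice(neighbors[i])
        pot = money[i] + money[j]
        u = rng.random()
        money[i], money[j] = u * pot, (1.0 - u) * pot
    return money

# Ring graph with 100 agents and one unit of money each (illustrative sizes).
N = 100
neighbors = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}
money = {i: 1.0 for i in range(N)}
money = reshuffle_on_graph(neighbors, money, steps=50_000, rng=random.Random(0))
```

    According to the results above, the empirical distribution of money should then approach the exponential distribution with mean equal to the average amount of money per agent.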

  4. Rigorous vector wave propagation for arbitrary flat media

    Science.gov (United States)

    Bos, Steven P.; Haffert, Sebastiaan Y.; Keller, Christoph U.

    2017-08-01

    Precise modelling of the (off-axis) point spread function (PSF) to identify geometrical and polarization aberrations is important for many optical systems. In order to characterise the PSF of the system in all Stokes parameters, an end-to-end simulation of the system has to be performed in which Maxwell's equations are rigorously solved. We present the first results of a Python code that we are developing to perform multiscale end-to-end wave propagation simulations that include all relevant physics. Currently we can handle plane-parallel near- and far-field vector diffraction effects of propagating waves in homogeneous isotropic and anisotropic materials, refraction and reflection at flat parallel surfaces, interference effects in thin films and unpolarized light. We show that the code has a numerical precision on the order of 10⁻¹⁶ for non-absorbing isotropic and anisotropic materials. For absorbing materials the precision is on the order of 10⁻⁸. The capabilities of the code are demonstrated by simulating a converging beam reflecting from a flat aluminium mirror at normal incidence.
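
    The aluminium-mirror demonstration rests on the Fresnel equations for a flat interface; at normal incidence the check reduces to a one-liner. The complex refractive index used below is an assumed, textbook-style value for aluminium in the visible, not a number from the paper:

```python
def fresnel_reflectance_normal(n1, n2):
    """Intensity reflectance at normal incidence for a flat interface
    between media with (possibly complex) refractive indices n1, n2."""
    r = (n1 - n2) / (n1 + n2)   # Fresnel amplitude coefficient
    return abs(r) ** 2

# Assumed illustrative value for aluminium in the visible: n ~ 1.2 + 7.26j.
R = fresnel_reflectance_normal(1.0, 1.2 + 7.26j)
```

    This yields R ≈ 0.92, consistent with aluminium's high reflectance at visible wavelengths.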

  5. Dynamics of harmonically-confined systems: Some rigorous results

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Zhigang, E-mail: zwu@physics.queensu.ca; Zaremba, Eugene, E-mail: zaremba@sparky.phy.queensu.ca

    2014-03-15

    In this paper we consider the dynamics of harmonically-confined atomic gases. We present various general results which are independent of particle statistics, interatomic interactions and dimensionality. Of particular interest is the response of the system to external perturbations which can be either static or dynamic in nature. We prove an extended Harmonic Potential Theorem which is useful in determining the damping of the centre of mass motion when the system is prepared initially in a highly nonequilibrium state. We also study the response of the gas to a dynamic external potential whose position is made to oscillate sinusoidally in a given direction. We show in this case that either the energy absorption rate or the centre of mass dynamics can serve as a probe of the optical conductivity of the system. -- Highlights: •We derive various rigorous results on the dynamics of harmonically-confined atomic gases. •We derive an extension of the Harmonic Potential Theorem. •We demonstrate the link between the energy absorption rate in a harmonically-confined system and the optical conductivity.

  6. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    Science.gov (United States)

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.
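
    For readers unfamiliar with the method, a standard perturbed-observation EnKF analysis step looks as follows; this is a generic textbook sketch (names and shapes mine), not the specific filters or forecast model of the article:

```python
import numpy as np

def enkf_analysis(X, y, H, R, rng):
    """Perturbed-observation ensemble Kalman filter analysis step.

    X : (n, N) ensemble of state vectors (one column per member)
    y : (m,)   observation vector
    H : (m, n) observation operator
    R : (m, m) observation error covariance
    """
    n, N = X.shape
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    P = A @ A.T / (N - 1)                          # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    # Perturbed observations: one noisy copy of y per ensemble member.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X + K @ (Y - H @ X)                     # updated ensemble
```

    The catastrophic divergence studied in the article arises over repeated forecast-analysis cycles with a nonlinear forecast model, not within a single analysis step like this one.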

  7. PRO development: rigorous qualitative research as the crucial foundation.

    Science.gov (United States)

    Lasch, Kathryn Eilene; Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen

    2010-10-01

    Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give little for the reader to evaluate the content validity of the measures and the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity.

  8. Rigorous derivation of porous-media phase-field equations

    Science.gov (United States)

    Schmuck, Markus; Kalliadasis, Serafim

    2017-11-01

    The evolution of interfaces in Complex heterogeneous Multiphase Systems (CheMSs) plays a fundamental role in a wide range of scientific fields such as thermodynamic modelling of phase transitions, materials science, or as a computational tool for interfacial flow studies or material design. Here, we focus on phase-field equations in CheMSs such as porous media. To the best of our knowledge, we present the first rigorous derivation of error estimates for fourth-order, upscaled, and nonlinear evolution equations. For CheMSs with heterogeneity ε, we obtain the convergence rate ε^{1/4}, which governs the error between the solution of the new upscaled formulation and the solution of the microscopic phase-field problem. This error behaviour has recently been validated computationally. Due to the wide range of application of phase-field equations, we expect this upscaled formulation to allow for new modelling, analytic, and computational perspectives for interfacial transport and phase transformations in CheMSs. This work was supported by EPSRC, UK, through Grant Nos. EP/H034587/1, EP/L027186/1, EP/L025159/1, EP/L020564/1, EP/K008595/1, and EP/P011713/1 and from ERC via Advanced Grant No. 247031.

  9. Rigorous time slicing approach to Feynman path integrals

    CERN Document Server

    Fujiwara, Daisuke

    2017-01-01

    This book proves that Feynman's original definition of the path integral actually converges to the fundamental solution of the Schrödinger equation at least in the short term if the potential is differentiable sufficiently many times and its derivatives of order equal to or higher than two are bounded. The semi-classical asymptotic formula up to the second term of the fundamental solution is also proved by a method different from that of Birkhoff. A bound of the remainder term is also proved. The Feynman path integral is a method of quantization using the Lagrangian function, whereas Schrödinger's quantization uses the Hamiltonian function. These two methods are believed to be equivalent. But equivalence is not fully proved mathematically, because, compared with Schrödinger's method, there is still much to be done concerning rigorous mathematical treatment of Feynman's method. Feynman himself defined a path integral as the limit of a sequence of integrals over finite-dimensional spaces which is obtained by...

  10. A multi-methodological approach to study the temporal and spatial distribution of air quality related to road transport emissions in Madrid, Spain

    Science.gov (United States)

    Perez, Pedro; Miranda, Regina

    2013-04-01

    emission inventory, together with the mobile sources' parameters and the disaggregated transport activity data. The paper will also identify emission and concentration differences and gradients of a certain magnitude/factor (e.g. comparison between estimated hourly ATP concentrations in Madrid City Center and in the peripheries). Furthermore, because of the higher contribution of road mobile sources to GHG and ATP emissions in Madrid, small gradients between urban highways and residential areas are expected. Second, the paper aims to develop valid methods and approaches to measure air quality and to develop valid road transport emission inventories to assess correlations between external costs, epidemiology and emissions, in order to reveal how traffic pollution affects people's exposure to key contaminants and disease development, and to identify susceptible emission scenarios and health impacts. We have conducted general emission inventory studies providing preliminary evidence of regional road transport air pollution impacts on external cost growth and disease development. Third, we also aim to demonstrate short- and long-term impacts of road transport emissions on external costs using innovative multi-methodological approaches interfaced with environmental chemistry and meteorology, following meteorological and chemical fields with contrasting high/low traffic emissions in several linked components involving: air pollutant assessment using local measurements, height of the boundary layer, meteorological environment interactions on external costs and epidemiology, mapping of Madrid (identifying gradients of emissions), integrative causal modeling using statistical models, and trend and scenario analyses on external costs and impacts on human health. Meteorological and chemical fields will be obtained from local records collected by surface meteorological and air quality stations.
These two sets of fields define the horizontal and vertical profiles of

  11. The COSMIN checklist for assessing the methodological quality of studies on measurement properties of health status measurement instruments: an international Delphi study.

    Science.gov (United States)

    Mokkink, Lidwine B; Terwee, Caroline B; Patrick, Donald L; Alonso, Jordi; Stratford, Paul W; Knol, Dirk L; Bouter, Lex M; de Vet, Henrica C W

    2010-05-01

    Aim of the COSMIN study (COnsensus-based Standards for the selection of health status Measurement INstruments) was to develop a consensus-based checklist to evaluate the methodological quality of studies on measurement properties. We present the COSMIN checklist and the agreement of the panel on the items of the checklist. A four-round Delphi study was performed with international experts (psychologists, epidemiologists, statisticians and clinicians). Of the 91 invited experts, 57 agreed to participate (63%). Panel members were asked to rate their (dis)agreement with each proposal on a five-point scale. Consensus was considered to be reached when at least 67% of the panel members indicated 'agree' or 'strongly agree'. Consensus was reached on the inclusion of the following measurement properties: internal consistency, reliability, measurement error, content validity (including face validity), construct validity (including structural validity, hypotheses testing and cross-cultural validity), criterion validity, responsiveness, and interpretability. The latter was not considered a measurement property. The panel also reached consensus on how these properties should be assessed. The resulting COSMIN checklist could be useful when selecting a measurement instrument, peer-reviewing a manuscript, designing or reporting a study on measurement properties, or for educational purposes.

  12. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ²(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ²(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the
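
    The chi-square comparisons of component proportions reported above can be reproduced for any 2x2 presence/absence table; since the article reports only percentages, any counts plugged into this sketch are hypothetical:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)
    for the 2x2 contingency table [[a, b], [c, d]], e.g. components
    present/absent in one article type (a, b) versus another (c, d)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
```

    Identical proportions in both rows give a statistic of zero; larger imbalances give larger values, to be compared against the chi-square distribution with one degree of freedom.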

  13. Importance of methodology on (99m)technetium dimercapto-succinic acid scintigraphic image quality: imaging pilot study for RIVUR (Randomized Intervention for Children With Vesicoureteral Reflux) multicenter investigation.

    Science.gov (United States)

    Ziessman, Harvey A; Majd, Massoud

    2009-07-01

    We reviewed our experience with (99m)technetium dimercapto-succinic acid scintigraphy obtained during an imaging pilot study for a multicenter investigation (Randomized Intervention for Children With Vesicoureteral Reflux) of the effectiveness of daily antimicrobial prophylaxis for preventing recurrent urinary tract infection and renal scarring. We analyzed imaging methodology and its relation to diagnostic image quality. (99m)Technetium dimercapto-succinic acid imaging guidelines were provided to participating sites. High-resolution planar imaging with parallel hole or pinhole collimation was required. Two core reviewers evaluated all submitted images. Analysis included appropriate views, presence or lack of patient motion, adequate magnification, sufficient counts and diagnostic image quality. Inter-reader agreement was evaluated. We evaluated 70, (99m)technetium dimercapto-succinic acid studies from 14 institutions. Variability was noted in methodology and image quality. Correlation (r value) between dose administered and patient age was 0.780. For parallel hole collimator imaging good correlation was noted between activity administered and counts (r = 0.800). For pinhole imaging the correlation was poor (r = 0.110). A total of 10 studies (17%) were rejected for quality issues of motion, kidney overlap, inadequate magnification, inadequate counts and poor quality images. The submitting institution was informed and provided with recommendations for improving quality, and resubmission of another study was required. Only 4 studies (6%) were judged differently by the 2 reviewers, and the differences were minor. Methodology and image quality for (99m)technetium dimercapto-succinic acid scintigraphy varied more than expected between institutions. The most common reason for poor image quality was inadequate count acquisition with insufficient attention to the tradeoff between administered dose, length of image acquisition, start time of imaging and resulting image

  14. Stochastic Geometry and Quantum Gravity: Some Rigorous Results

    Science.gov (United States)

    Zessin, H.

    The aim of these lectures is a short introduction into some recent developments in stochastic geometry which have one of their origins in simplicial gravity theory (see Regge, Nuovo Cimento 19: 558-571, 1961). The aim is to define and construct rigorously point processes on spaces of Euclidean simplices in such a way that the configurations of these simplices are simplicial complexes. The main interest then is concentrated on their curvature properties. We illustrate certain basic ideas from a mathematical point of view. An excellent presentation of this area can be found in Schneider and Weil (Stochastic and Integral Geometry, Springer, Berlin, 2008; German edition: Stochastische Geometrie, Teubner, 2000). In Ambjørn et al. (Quantum Geometry, Cambridge University Press, Cambridge, 1997) you find a beautiful account from the physical point of view. More recent developments in this direction can be found in Ambjørn et al. ("Quantum gravity as sum over spacetimes", Lect. Notes Phys. 807, Springer, Heidelberg, 2010). After an informal axiomatic introduction into the conceptual foundations of Regge's approach, the first lecture recalls the concepts and notations used. It presents the fundamental zero-infinity law of stochastic geometry and the construction of cluster processes based on it. The second lecture presents the main mathematical object, i.e. Poisson-Delaunay surfaces possessing an intrinsic random metric structure. The third and fourth lectures discuss their ergodic behaviour and present the two-dimensional Regge model of pure simplicial quantum gravity. We conclude with the formulation of basic open problems. Proofs are given in detail only in a few cases. In general the main ideas are developed.

  15. RIGOROUS GEOREFERENCING OF ALSAT-2A PANCHROMATIC AND MULTISPECTRAL IMAGERY

    Directory of Open Access Journals (Sweden)

    I. Boukerch

    2013-04-01

    The exploitation of the full geometric capabilities of High-Resolution Satellite Imagery (HRSI) requires the development of an appropriate sensor orientation model. Several authors have studied this problem; geometric models generally fall into two categories: physical and empirical models. Based on the analysis of the metadata provided with ALSAT-2A, a rigorous pushbroom camera model can be developed. This model has been successfully applied to many very high resolution imagery systems. The relation between image and ground coordinates, given by the time-dependent collinearity equations and involving several coordinate systems, has been tested. The interior orientation parameters must be integrated in the model; they can be estimated from the viewing angles corresponding to the pointing directions of any detector, values that are derived from cubic polynomials provided in the metadata. The developed model integrates all the necessary elements, with 33 unknowns. Approximate values of the 33 unknown parameters may be derived from the information contained in the metadata files provided with the imagery technical specifications, or simply fixed to zero; the condition equation is then linearized and solved using SVD in a least-squares sense in order to correct the initial values, using a suitable number of well-distributed GCPs. Using ALSAT-2A images over the town of Toulouse in the south west of France, three experiments are performed. The first concerns 2D accuracy analysis using several sets of parameters. The second concerns GCP number and distribution. The third concerns georeferencing the multispectral image by applying the model calculated from the panchromatic image.
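
    The final estimation step described above (linearize the condition equation, then solve by SVD in a least-squares sense) can be sketched generically. The Jacobian J and misclosure vector w would come from linearizing the collinearity equations at the GCPs; everything below is an illustrative sketch, not the authors' code:

```python
import numpy as np

def svd_least_squares(J, w, rcond=1e-12):
    """Least-squares parameter corrections dx minimizing ||J dx - w||,
    computed via the SVD pseudo-inverse. Truncating tiny singular
    values guards against near-rank-deficiency, which matters when
    some orientation parameters are weakly determined by the GCPs."""
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    s_inv = np.where(s > rcond * s[0], 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ w))
```

    In the adjustment loop, dx would be added to the current 33-parameter estimate and the system relinearized until convergence.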

  16. A rigorous derivation of gravitational self-force

    International Nuclear Information System (INIS)

    Gralla, Samuel E; Wald, Robert M

    2008-01-01

    There is general agreement that the MiSaTaQuWa equations should describe the motion of a 'small body' in general relativity, taking into account the leading order self-force effects. However, previous derivations of these equations have made a number of ad hoc assumptions and/or contain a number of unsatisfactory features. For example, all previous derivations have invoked, without proper justification, the step of 'Lorenz gauge relaxation', wherein the linearized Einstein equation is written in the form appropriate to the Lorenz gauge, but the Lorenz gauge condition is then not imposed, thereby making the resulting equations for the metric perturbation inequivalent to the linearized Einstein equations. (Such a 'relaxation' of the linearized Einstein equations is essential in order to avoid the conclusion that 'point particles' move on geodesics.) In this paper, we analyze the issue of 'particle motion' in general relativity in a systematic and rigorous way by considering a one-parameter family of metrics, g_ab(λ), corresponding to having a body (or black hole) that is 'scaled down' to zero size and mass in an appropriate manner. We prove that the limiting worldline of such a one-parameter family must be a geodesic of the background metric, g_ab(λ = 0). Gravitational self-force, as well as the force due to the coupling of the body's spin to curvature, then arises as a first-order perturbative correction in λ to this worldline. No assumptions are made in our analysis apart from the smoothness and limit properties of the one-parameter family of metrics, g_ab(λ). Our approach should provide a framework for systematically calculating higher order corrections to gravitational self-force, including higher multipole effects, although we do not attempt to go beyond first-order calculations here. The status of the MiSaTaQuWa equations is explained.

  17. Rigorous covariance propagation of geoid errors to geodetic MDT estimates

    Science.gov (United States)

    Pail, R.; Albertella, A.; Fecher, T.; Savcenko, R.

    2012-04-01

    The mean dynamic topography (MDT) is defined as the difference between the mean sea surface (MSS) derived from satellite altimetry, averaged over several years, and the static geoid. Assuming geostrophic conditions, the ocean surface velocities, an important component of the global ocean circulation, can be derived from the MDT. Due to the availability of GOCE gravity field models, for the first time the MDT can now be derived solely from satellite observations (altimetry and gravity) down to spatial length-scales of 100 km and even below. Global gravity field models, parameterized in terms of spherical harmonic coefficients, are complemented by the full variance-covariance matrix (VCM). Therefore, for the geoid component a realistic statistical error estimate is available, while the error description of the altimetric component is still an open issue and is, if addressed at all, treated empirically. In this study we attempt to perform rigorous error propagation, based on the full gravity VCM, to the derived geostrophic surface velocities, thus also considering all correlations. For the definition of the static geoid we use the third release of the time-wise GOCE model, as well as the satellite-only combination model GOCO03S. In detail, we will investigate the velocity errors resulting from the geoid component as a function of the harmonic degree, and the impact of using/not using covariances on the MDT errors and their correlations. When deriving an MDT, it is spectrally filtered to a certain maximum degree, which is usually driven by the signal content of the geoid model, by applying isotropic or non-isotropic filters. Since this filtering also acts on the geoid component, it must be consistently integrated into the covariance propagation, and its impact quantified. The study will be performed for MDT estimates in specific test areas of particular oceanographic interest.
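
    The core operation is ordinary linear covariance propagation: if the derived quantity is y = J F x, with x the spherical harmonic coefficients, F the spectral filter, and J the (linearized) functional mapping coefficients to geostrophic velocities, then Sigma_y = (J F) Sigma_x (J F)^T. A minimal sketch, with shapes and names purely illustrative:

```python
import numpy as np

def propagate_covariance(J, F, Sigma_x):
    """Rigorous linear covariance propagation for y = J F x:
    Sigma_y = (J F) Sigma_x (J F)^T. Folding the spectral filter F
    into the operator keeps the error propagation consistent with
    the filtering applied to the MDT itself."""
    A = J @ F
    return A @ Sigma_x @ A.T
```

    Ignoring the off-diagonal terms of Sigma_x reduces this to simple variance propagation, which is exactly the approximation the full-VCM approach above avoids.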

  18. Is mindfulness research methodology improving over time? A systematic review.

    Directory of Open Access Journals (Sweden)

    Simon B Goldberg

    Despite an exponential growth in research on mindfulness-based interventions, the body of scientific evidence supporting these treatments has been criticized for being of poor methodological quality. The current systematic review examined the extent to which mindfulness research has demonstrated increased rigor over the past 16 years regarding six methodological features that have been highlighted as areas for improvement. These features included using active control conditions, larger sample sizes, longer follow-up assessment, treatment fidelity assessment, and reporting of instructor training and intent-to-treat (ITT) analyses. We searched PubMed, PsycINFO, Scopus, and Web of Science in addition to a publicly available repository of mindfulness studies. We included randomized clinical trials of mindfulness-based interventions for samples with a clinical disorder or elevated symptoms of a clinical disorder listed on the American Psychological Association's list of disorders with recognized evidence-based treatment. Independent raters screened 9,067 titles and abstracts, with 303 full-text reviews. Of these, 171 were included, representing 142 non-overlapping samples. Across the 142 studies published between 2000 and 2016, there was no evidence for increases in any study quality indicator, although changes were generally in the direction of improved quality. When restricting the sample to those conducted in Europe and North America (continents with the longest history of scientific research in this area), an increase in reporting of ITT analyses was found. When excluding an early, high-quality study, improvements were seen in sample size, treatment fidelity assessment, and reporting of ITT analyses. Taken together, the findings suggest modest adoption of the recommendations for methodological improvement voiced repeatedly in the literature. Possible explanations for this and implications for interpreting this body of research and conducting future studies are

  19. Analysis of the penumbra enlargement in lung versus the Quality Index of photon beams: A methodology to check the dose calculation algorithm

    International Nuclear Information System (INIS)

    Tsiakalos, Miltiadis F.; Theodorou, Kiki; Kappas, Constantin; Zefkili, Sofia; Rosenwold, Jean-Claude

    2004-01-01

    It is well known that considerable underdosage can occur at the edges of a tumor inside the lung because of the degradation of penumbra due to lack of lateral electronic equilibrium. Although present even at lower energies, this phenomenon is more pronounced for higher energies. Apart from Monte Carlo calculation, most of the existing Treatment Planning Systems (TPSs) cannot handle this effect at all, or only with limited accuracy. A methodology has been developed for assessing dose calculation algorithms in the lung region where lateral electronic disequilibrium exists, based on the Quality Index (QI) of the incident beam. A phantom, consisting of layers of polystyrene and lung material, has been irradiated using photon beams of 4, 6, 15, and 20 MV. The cross-plane profiles of each beam for 5x5, 10x10, and 25x10 fields have been measured at the middle of the phantom with the use of films. The penumbra (20%-80%) and fringe (50%-90%) enlargement was measured, and the ratio of the widths for the lung to that of polystyrene was defined as the Correction Factor (CF). Monte Carlo calculations in the two phantoms have also been performed for energies of 6, 15, and 20 MV. Five commercial TPS algorithms were tested for their ability to predict the penumbra and fringe enlargement. A linear relationship has been found between the QI of the beams and the CF of the penumbra and fringe enlargement for all the examined fields. Monte Carlo calculations agree very well (less than 1% difference) with the film measurements. The CF values range between 1.1 for 4 MV (QI 0.620) and 2.28 for 20 MV (QI 0.794). Three of the tested TPS algorithms could not predict any enlargement at all for all energies and all fields, and two of them could predict the penumbra enlargement to some extent. The proposed methodology can help any user or developer check the accuracy of their algorithm for lung cases, based on a simple phantom geometry and the QI of the incident beam.
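
    Using only the two endpoints quoted above (QI 0.620 with CF 1.1 at 4 MV, and QI 0.794 with CF 2.28 at 20 MV), the reported linear QI-CF relationship can be written down explicitly; interpolating at intermediate QI values is my illustration, not the paper's fitted line:

```python
def correction_factor(qi, p1=(0.620, 1.1), p2=(0.794, 2.28)):
    """Linear QI -> CF relation through the two endpoint values
    reported in the abstract (the paper finds this relation to be
    linear across all examined fields and energies)."""
    (q1, c1), (q2, c2) = p1, p2
    slope = (c2 - c1) / (q2 - q1)   # roughly 6.8 CF units per unit QI
    return c1 + slope * (qi - q1)
```

    A TPS algorithm can then be checked by comparing its predicted penumbra enlargement in the simple phantom against the CF expected for the beam's QI.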
This check is

  20. Design and Implementation of the Harvard Fellowship in Patient Safety and Quality.

    Science.gov (United States)

    Gandhi, Tejal K; Abookire, Susan A; Kachalia, Allen; Sands, Kenneth; Mort, Elizabeth; Bommarito, Grace; Gagne, Jane; Sato, Luke; Weingart, Saul N

    2016-01-01

    The Harvard Fellowship in Patient Safety and Quality is a 2-year physician-oriented training program with a strong operational orientation, embedding trainees in the quality departments of participating hospitals. It also integrates didactic and experiential learning and offers the option of obtaining a master's degree in public health. The program focuses on methodologically rigorous improvement and measurement, with an emphasis on the development and implementation of innovative practice. The operational orientation is intended to foster the professional development of future quality and safety leaders. The purpose of this article is to describe the design and development of the fellowship. © The Author(s) 2014.

  1. Experimental evaluation of rigor mortis. III. Comparative study of the evolution of rigor mortis in different sized muscle groups in rats.

    Science.gov (United States)

    Krompecher, T; Fryc, O

    1978-01-01

    The use of new methods and an appropriate apparatus has allowed us to make successive measurements of rigor mortis and a study of its evolution in the rat. By a comparative examination on the front and hind limbs, we have determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time and this state is maintained for 2 hours. (4) Resolution of rigor mortis is accelerated in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.

  2. How Individual Scholars Can Reduce the Rigor-Relevance Gap in Management Research

    OpenAIRE

    Wolf, Joachim; Rosenberg, Timo

    2012-01-01

    This paper discusses a number of avenues management scholars could follow to reduce the existing gap between scientific rigor and practical relevance without relativizing the importance of the first goal dimension. Such changes are necessary because many management studies do not fully exploit the possibilities to increase their practical relevance while maintaining scientific rigor. We argue that this rigor-relevance gap is not only the consequence of the currently prevailing institutional c...

  3. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.

    Science.gov (United States)

    Meltzer, S J; Auer, J

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions (nearly equimolecular to "physiological" solutions of sodium chloride) are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case, as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor; only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.

  4. Quality

    International Nuclear Information System (INIS)

    Burnett, N.; Jeffries, J.; Mach, J.; Robson, M.; Pajot, D.; Harrigan, J.; Lebsack, T.; Mullen, D.; Rat, F.; Theys, P.

    1993-01-01

    What is quality? How do you achieve it? How do you keep it once you have got it? The answer for industry at large is the three-step hierarchy of quality control, quality assurance and Total Quality Management. An overview is given of the history of the quality movement, illustrated with examples from Schlumberger operations, as well as the oil industry's approach to quality. An introduction to Schlumberger's quality-associated ClientLink program is presented. 15 figs., 4 ills., 16 refs

  5. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  6. Imagination and rigor essays on Eduardo R Caianiello's scientific heritage

    CERN Document Server

    Termini, Settimo

    2006-01-01

    The aim of this Volume of scientific essays is twofold. On one side, by remembering the scientific figure of Eduardo R. Caianiello, it aims at focusing on his outstanding contributions - from theoretical physics to cybernetics - which after so many years still represent an occasion for innovative paths to be fruitfully followed. It must be stressed that his interdisciplinary methodology can still be of great help in addressing and solving present-day complex problems. On the other side, it aims at pinpointing, with the help of the scientists contributing to the Volume, some crucial problems in present-day research in the fields of interest of Eduardo Caianiello, which are still among the main lines of investigation of some of the Institutes founded by Eduardo (Istituto di Cibernetica del CNR, IIAS, etc).

  7. Measuring physical neighborhood quality related to health.

    Science.gov (United States)

    Rollings, Kimberly A; Wells, Nancy M; Evans, Gary W

    2015-04-29

    Although sociodemographic factors are one aspect of understanding the effects of neighborhood environments on health, equating neighborhood quality with socioeconomic status ignores the important role of physical neighborhood attributes. Prior work on neighborhood environments and health has relied primarily on level of socioeconomic disadvantage as the indicator of neighborhood quality without attention to physical neighborhood quality. A small but increasing number of studies have assessed neighborhood physical characteristics. Findings generally indicate that there is an association between living in deprived neighborhoods and poor health outcomes, but rigorous evidence linking specific physical neighborhood attributes to particular health outcomes is lacking. This paper discusses the methodological challenges and limitations of measuring physical neighborhood environments relevant to health and concludes with proposed directions for future work.

  8. ATP, IMP, and glycogen in cod muscle at onset and during development of rigor mortis depend on the sampling location

    DEFF Research Database (Denmark)

    Cappeln, Gertrud; Jessen, Flemming

    2002-01-01

    Variation in glycogen, ATP, and IMP contents within individual cod muscles was studied in ice-stored fish during the progress of rigor mortis. Rigor index was determined before muscle samples for chemical analyses were taken at 16 different positions on the fish. During development of rigor, the contents of glycogen and ATP decreased differently in relation to rigor index, depending on sampling location. Although fish were considered to be in strong rigor according to the rigor index method, parts of the muscle were not in rigor, as high ATP concentrations were found in dorsal and tail muscle.

  9. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles.

    Science.gov (United States)

    Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH in chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). Increasing the pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% produced no great differences in the physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study certified the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that the 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle.

  10. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles

    Science.gov (United States)

    Choi, Yun-Sang

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH in chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). Increasing the pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% produced no great differences in the physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study certified the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that the 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle. PMID:26761884

  11. Rigorous bounds on the free energy of electron-phonon models

    NARCIS (Netherlands)

    Raedt, Hans De; Michielsen, Kristel

    1997-01-01

    We present a collection of rigorous upper and lower bounds to the free energy of electron-phonon models with linear electron-phonon interaction. These bounds are used to compare different variational approaches. It is shown rigorously that the ground states corresponding to the sharpest bounds do

  12. The Relationship between Project-Based Learning and Rigor in STEM-Focused High Schools

    Science.gov (United States)

    Edmunds, Julie; Arshavsky, Nina; Glennie, Elizabeth; Charles, Karen; Rice, Olivia

    2016-01-01

    Project-based learning (PjBL) is an approach often favored in STEM classrooms, yet some studies have shown that teachers struggle to implement it with academic rigor. This paper explores the relationship between PjBL and rigor in the classrooms of ten STEM-oriented high schools. Utilizing three different data sources reflecting three different…

  13. Moving beyond Data Transcription: Rigor as Issue in Representation of Digital Literacies

    Science.gov (United States)

    Hagood, Margaret Carmody; Skinner, Emily Neil

    2015-01-01

    Rigor in qualitative research has been based upon criteria of credibility, dependability, confirmability, and transferability. Drawing upon articles published during our editorship of the "Journal of Adolescent & Adult Literacy," we illustrate how the use of digital data in research study reporting may enhance these areas of rigor,…

  14. As good as it gets? A meta-analysis and systematic review of methodological quality of heart rate variability studies in functional somatic disorders

    NARCIS (Netherlands)

    Tak, L.M.; Riese, H.; de Bock, G.H.; Manoharan, A.; Kok, I.C.; Rosmalen, J.G.M.

    2009-01-01

    Autonomic nervous system (ANS) dysfunction is a potential mechanism connecting psychosocial stress to functional somatic disorders (FSD), such as chronic fatigue syndrome, fibromyalgia and irritable bowel syndrome. We present the first meta-analysis and systematic review of methodological study

  15. Onset of rigor mortis is earlier in red muscle than in white muscle.

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H

    2000-01-01

    Rigor mortis is thought to be related to falling ATP levels in muscles postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles kept in liquid paraffin at 37 degrees C or 25 degrees C: two red muscles, the red gastrocnemius (RG) and the soleus (SO), and one white muscle, the white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG both at 37 degrees C and at 25 degrees C, even though RG and WG were portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.

  16. Karl Pearson and eugenics: personal opinions and scientific rigor.

    Science.gov (United States)

    Delzell, Darcie A P; Poliak, Cathy D

    2013-09-01

    The influence of personal opinions and biases on scientific conclusions is a threat to the advancement of knowledge. Expertise and experience does not render one immune to this temptation. In this work, one of the founding fathers of statistics, Karl Pearson, is used as an illustration of how even the most talented among us can produce misleading results when inferences are made without caution or reference to potential bias and other analysis limitations. A study performed by Pearson on British Jewish schoolchildren is examined in light of ethical and professional statistical practice. The methodology used and inferences made by Pearson and his coauthor are sometimes questionable and offer insight into how Pearson's support of eugenics and his own British nationalism could have potentially influenced his often careless and far-fetched inferences. A short background into Pearson's work and beliefs is provided, along with an in-depth examination of the authors' overall experimental design and statistical practices. In addition, portions of the study regarding intelligence and tuberculosis are discussed in more detail, along with historical reactions to their work.

  17. High and low rigor temperature effects on sheep meat tenderness and ageing.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J

    2002-02-01

    Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C (P<0.05). The shear force values for 18 and 35°C rigor at each ageing time were significantly different (P<0.05), and after full ageing the values for 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.

  18. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  19. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  20. Methodological adequacy of articles published in two open-access Brazilian cardiology periodicals.

    Science.gov (United States)

    Macedo, Cristiane Rufino; Silva, Davi Leite da; Puga, Maria Eduarda

    2010-01-01

    The use of rigorous scientific methods has contributed towards developing scientific articles of excellent methodological quality. This has made it possible to promote their citation and increase the impact factor. Brazilian periodicals have had to adapt to certain quality standards demanded by these indexing organizations, such as the content and the number of original articles published in each issue. This study aimed to evaluate the methodological adequacy of two Brazilian periodicals within the field of cardiology that are indexed in several databases and freely accessible through the Scientific Electronic Library Online (SciELO), and which are now indexed by the Web of Science (Institute for Scientific Information, ISI). Descriptive study at the Brazilian Cochrane Center. All the published articles were evaluated according to merit assessment (content) and form assessment (performance). Ninety-six percent of the articles analyzed presented study designs that were adequate for answering the objectives. These two Brazilian periodicals within the field of cardiology published methodologically adequate articles, since they followed the quality standards. Thus, these periodicals can be considered both for consultation and as vehicles for publishing future articles. For further analyses, it is essential to apply other indicators of scientific activity, such as bibliometrics, which evaluates quantitative aspects of the production, dissemination and use of information, and scientometrics, which is also concerned with the development of science policies and often overlaps with bibliometrics.

  1. Sensory quality and onset of rigor mortis for farmed turbot under various post slaughter conditions

    NARCIS (Netherlands)

    Schelvis-Smit, A.A.M.; Veldman, M.; Kruijt, A.W.; Vis, van de J.W.

    2006-01-01

    As is the case for most farmed fish, the production of turbot is targeting the fresh markets established in Europe and Asia. Usually turbot is packed whole, dead or alive and transported directly to the market. On some occasions the fish are bled and gutted prior to delivery.

  2. Three on a Match: Gary A. Olson on Rigor, Reliability, and Quality Control in Digital Scholarship

    Science.gov (United States)

    Jensen, Kyle

    2009-01-01

    This interview examines the relationship between digital scholarship and the politics of higher education. In doing so, it advances a series of recommendations that aim to help digital scholars and digital scholarship achieve an increased level of stature in the academic community.

  3. Rigorous Performance Evaluation of Smartphone GNSS/IMU Sensors for ITS Applications

    Directory of Open Access Journals (Sweden)

    Vassilis Gikas

    2016-08-01

    Full Text Available With the rapid growth in smartphone technologies and improvement in their navigation sensors, an increasing amount of location information is now available, opening the road to the provision of new Intelligent Transportation System (ITS) services. Current smartphone devices embody miniaturized Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU) and other sensors capable of providing user position, velocity and attitude. However, it is hard to characterize their actual positioning and navigation performance capabilities due to the disparate sensor and software technologies adopted among manufacturers and the high influence of environmental conditions, and therefore a unified certification process is missing. This paper presents the analysis results obtained from the assessment of two modern smartphones regarding their positioning accuracy (i.e., precision and trueness) and capabilities (i.e., potential and limitations) based on a practical but rigorous methodological approach. Our investigation relies on the results of several vehicle tracking (i.e., cruising and maneuvering) tests realized through comparing smartphone-obtained trajectories and kinematic parameters to those derived using a high-end GNSS/IMU system and advanced filtering techniques. Performance testing is undertaken for the HTC One S (Android) and iPhone 5s (iOS). Our findings indicate that the deviation of the smartphone locations from ground truth (trueness) deteriorates by a factor of two in obscured environments compared to those derived in open sky conditions. Moreover, it appears that the iPhone 5s produces relatively smaller and less dispersed error values compared to those computed for the HTC One S. Also, the navigation solution of the HTC One S appears to adapt faster to changes in environmental conditions, suggesting a somewhat different data filtering approach for the iPhone 5s. Testing the accuracy of the accelerometer and gyroscope sensors for a number of
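The trueness/precision distinction used in this assessment can be sketched in a few lines. This is a minimal illustration under assumed data: trueness is taken as the mean deviation from the reference trajectory and precision as the dispersion about that mean; the per-epoch error values and function name are hypothetical, not from the study.

```python
# Sketch: separating trueness (systematic offset) from precision (dispersion)
# when comparing smartphone positions against a reference GNSS/IMU trajectory.
# Error values and names below are illustrative assumptions.
import math

def trueness_and_precision(errors):
    """errors: per-epoch horizontal deviations (metres) from ground truth.
    Returns (trueness, precision) as (mean error, sample std dev)."""
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / (n - 1)
    return mean, math.sqrt(var)

open_sky = [1.8, 2.1, 1.9, 2.3, 2.0]   # hypothetical open-sky errors (m)
obscured = [3.9, 4.4, 3.6, 4.8, 4.1]   # hypothetical obscured-area errors (m)

t_open, p_open = trueness_and_precision(open_sky)
t_obs, p_obs = trueness_and_precision(obscured)
print(f"open sky: trueness {t_open:.2f} m, precision {p_open:.2f} m")
print(f"obscured: trueness {t_obs:.2f} m, precision {p_obs:.2f} m")
```

With these made-up numbers the obscured-environment trueness is roughly twice the open-sky value, mirroring the factor-of-two degradation the abstract reports.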

  4. Towards a New Basel Accord with More Rigorous Settlements

    Directory of Open Access Journals (Sweden)

    Petru PRUNEA

    2010-09-01

    Full Text Available The recent financial crisis made the banking sector more vulnerable to shocks. The system was characterised by weaknesses: too much leverage in banking; not enough high-quality capital to absorb losses; and excessive credit growth based on weak underwriting standards and under-pricing of liquidity. This article discusses the new Basel III accord and the outlook for this framework. Basel III will be finalized before November 2010 and will be implemented by the end of 2012. Basel III is going to be implemented in the United States. All G–20 countries should progressively adopt this capital framework. The Basel Committee on Banking Supervision and national authorities should develop and agree a global framework for promoting stronger liquidity in financial institutions. The reform program aims to raise the resilience of the banking sector by promoting more sustainable growth, both in the near term and over the long term. The initiatives of the Basel Committee will develop a set of reforms based on four steps: public consultation, impact assessment, overall calibration and macroeconomic impact assessment over the transition period.

  5. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in the estimation of the dose in organs and tissues due to the incorporation of radioactive materials. Since then, 'MIRD Dose Estimate Reports' (numbers 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained.

  6. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300{sup 1} and EPS 900{sup 2} PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  7. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 1 and EPS 900 2 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs

  8. On Robust Methodologies for Managing Public Health Care Systems

    Directory of Open Access Journals (Sweden)

    Shastri L. Nimmagadda

    2014-01-01

    Full Text Available The authors focus on ontology-based multidimensional data warehousing and mining methodologies, addressing various issues in organizing, reporting and documenting diabetic cases and their associated ailments, including causalities. Map and other diagnostic data views, depicting similarity and comparison of attributes extracted from warehouses, are used for understanding the ailments based on gender, age, geography, food habits and other hereditary event attributes. In addition to rigor in data mining and visualization, an added focus is on the value of interpreting data views, from fully processed diagnosis to subsequent prescription and appropriate medication. The proposed methodology is a robust back-end application for web-based patient-doctor consultations and e-Health care management systems, through which billions of dollars spent on medical services can be saved, in addition to improving the quality of life and average life span of a person. Government health departments and agencies, private and government medical practitioners, including social welfare organizations, are typical users of these systems.

  9. Developing a Mental Health eClinic to Improve Access to and Quality of Mental Health Care for Young People: Using Participatory Design as Research Methodologies.

    Science.gov (United States)

    Ospina-Pinillos, Laura; Davenport, Tracey A; Ricci, Cristina S; Milton, Alyssa C; Scott, Elizabeth M; Hickie, Ian B

    2018-05-28

    Each year, many young Australians aged between 16 and 25 years experience a mental health disorder, yet only a small proportion access services and even fewer receive timely and evidence-based treatments. Today, with ever-increasing access to the Internet and use of technology, the potential to provide all young people with access (24 hours a day, 7 days a week) to the support they require to improve their mental health and well-being is promising. The aim of this study was to use participatory design (PD) as research methodologies with end users (young people aged between 16 and 25 years and youth health professionals) and our research team to develop the Mental Health eClinic (a Web-based mental health clinic) to improve timely access to, and better quality, mental health care for young people across Australia. A research and development (R&D) cycle for the codesign and build of the Mental Health eClinic included several iterative PD phases: PD workshops; translation of knowledge and ideas generated during workshops to produce mockups of webpages either as hand-drawn sketches or as wireframes (simple layout of a webpage before visual design and content is added); rapid prototyping; and one-on-one consultations with end users to assess the usability of the alpha build of the Mental Health eClinic. Four PD workshops were held with 28 end users (young people n=18, youth health professionals n=10) and our research team (n=8). Each PD workshop was followed by a knowledge translation session. At the conclusion of this cycle, the alpha prototype was built, and one round of one-on-one end user consultation sessions was conducted (n=6; all new participants, young people n=4, youth health professionals n=2). The R&D cycle revealed the importance of five key components for the Mental Health eClinic: a home page with a visible triage system for those requiring urgent help; a comprehensive online physical and mental health assessment; a detailed dashboard of results; a

  10. Differential rigor development in red and white muscle revealed by simultaneous measurement of tension and stiffness.

    Science.gov (United States)

    Kobayashi, Masahiko; Takemori, Shigeru; Yamaguchi, Maki

    2004-02-10

    Based on the molecular mechanism of rigor mortis, we have proposed that stiffness (elastic modulus evaluated with tension response against minute length perturbations) can be a suitable index of post-mortem rigidity in skeletal muscle. To trace the developmental process of rigor mortis, we measured stiffness and tension in both red and white rat skeletal muscle kept in liquid paraffin at 37 and 25 degrees C. White muscle (in which type IIB fibres predominate) developed stiffness and tension significantly more slowly than red muscle, except for soleus red muscle at 25 degrees C, which showed disproportionately slow rigor development. In each of the examined muscles, stiffness and tension developed more slowly at 25 degrees C than at 37 degrees C. In each specimen, tension always reached its maximum level earlier than stiffness, and then decreased more rapidly and markedly than stiffness. These phenomena may account for the sequential progress of rigor mortis in human cadavers.
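The stiffness measure described above (an elastic modulus evaluated from the tension response to minute length perturbations) can be sketched as a simple ratio. This is an illustration only: the normalisation by muscle length, the perturbation amplitude and tension values, and the function name are assumptions, not the authors' protocol.

```python
# Sketch: a stiffness index as tension response per unit strain, in the spirit
# of the elastic-modulus measure described in the abstract. All numbers are
# illustrative assumptions.

def stiffness_index(delta_tension: float, delta_length: float, length: float) -> float:
    """Elastic-modulus proxy: tension change divided by imposed strain (dL/L)."""
    strain = delta_length / length
    return delta_tension / strain

# Hypothetical readings: a 0.1% length perturbation (0.02 mm on a 20 mm muscle)
# producing a 0.5 mN tension response.
idx = stiffness_index(delta_tension=0.5, delta_length=0.02, length=20.0)
print(f"stiffness index: {idx:.1f} mN per unit strain")
```

Tracking this index alongside raw tension over time would reproduce the kind of comparison the study makes, with tension peaking and declining earlier than stiffness.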

  11. Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).

    Science.gov (United States)

    Suzutani, T; Ishibashi, H; Takatori, T

    1978-11-01

    The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.

  12. 75 FR 29732 - Career and Technical Education Program-Promoting Rigorous Career and Technical Education Programs...

    Science.gov (United States)

    2010-05-27

    ... rigorous knowledge and skills in English- language arts and mathematics that employers and colleges expect... specialists and to access the student outcome data needed to meet annual evaluation and reporting requirements...

  13. Rigorous derivation from Landau-de Gennes theory to Ericksen-Leslie theory

    OpenAIRE

    Wang, Wei; Zhang, Pingwen; Zhang, Zhifei

    2013-01-01

    Starting from the Beris-Edwards system for liquid crystals, we present a rigorous derivation of the Ericksen-Leslie system with general Ericksen stress and Leslie stress by using the Hilbert expansion method.

  14. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    AU/ACSC/2016 AIR COMMAND AND STAFF COLLEGE AIR UNIVERSITY MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF...establishing unit level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and...cues) ideally should meet or exceed effective rigor (based on analytical process).4 To accomplish this, decision makers should not be left to their

  15. Practical demonstration of urban air quality simulation models. Part I. A report of the NATO/CCMS pilot study on air pollution assessment methodology and modeling

    Energy Technology Data Exchange (ETDEWEB)

    1980-08-01

    Report shows cooperation of NATO member nations in developing standardized methods for emissions data storage and retrieval to support requirements in conduct of national air quality management programs. Shows cooperation in developing standardized techniques for projecting emissions and predicting future ambient air quality.

  16. Quality systems in veterinary diagnostics laboratories.

    Science.gov (United States)

    de Branco, Freitas Maia L M

    2007-01-01

    Quality assurance of services provided by veterinary diagnostics laboratories is a fundamental element promoted by international animal health organizations to establish trust, confidence and transparency needed for the trade of animals and their products at domestic and international levels. It requires, among other things, trained personnel, consistent and rigorous methodology, choice of suitable methods as well as appropriate calibration and traceability procedures. An important part of laboratory quality management is addressed by ISO/IEC 17025, which aims to facilitate cooperation among laboratories and their associated parties by assuring the generation of credible and consistent information derived from analytical results. Currently, according to OIE recommendation, veterinary diagnostics laboratories are only subject to voluntary compliance with standard ISO/IEC 17025; however, it is proposed here that OIE reference laboratories and collaboration centres strongly consider its adoption.

  17. Back-and-Forth Methodology for Objective Voice Quality Assessment: From/to Expert Knowledge to/from Automatic Classification of Dysphonia

    Science.gov (United States)

    Fredouille, Corinne; Pouchoulin, Gilles; Ghio, Alain; Revis, Joana; Bonastre, Jean-François; Giovanni, Antoine

    2009-12-01

    This paper addresses voice disorder assessment. It proposes an original back-and-forth methodology involving an automatic classification system as well as the knowledge of human experts (machine learning experts, phoneticians, and pathologists). The goal of this methodology is to bring a better understanding of acoustic phenomena related to dysphonia. The automatic system was validated on a dysphonic corpus (80 female voices), rated according to the GRBAS perceptual scale by an expert jury. Firstly, focusing on the frequency domain, the classification system demonstrated the relevance of the 0-3000 Hz frequency band for the classification task based on the GRBAS scale. Later, an automatic phonemic analysis underlined the significance of consonants, and more surprisingly of unvoiced consonants, for the same classification task. Submitted to the human experts, these observations led to a manual analysis of unvoiced plosives, which highlighted a lengthening of voice onset time (VOT) with dysphonia severity, validated by a preliminary statistical analysis.

  18. Back-and-Forth Methodology for Objective Voice Quality Assessment: From/to Expert Knowledge to/from Automatic Classification of Dysphonia

    Directory of Open Access Journals (Sweden)

    Corinne Fredouille

    2009-01-01

    Full Text Available This paper addresses voice disorder assessment. It proposes an original back-and-forth methodology involving an automatic classification system as well as the knowledge of human experts (machine learning experts, phoneticians, and pathologists). The goal of this methodology is to bring a better understanding of acoustic phenomena related to dysphonia. The automatic system was validated on a dysphonic corpus (80 female voices), rated according to the GRBAS perceptual scale by an expert jury. Firstly, focusing on the frequency domain, the classification system demonstrated the relevance of the 0–3000 Hz frequency band for the classification task based on the GRBAS scale. Later, an automatic phonemic analysis underlined the significance of consonants, and more surprisingly of unvoiced consonants, for the same classification task. Submitted to the human experts, these observations led to a manual analysis of unvoiced plosives, which highlighted a lengthening of voice onset time (VOT) with dysphonia severity, validated by a preliminary statistical analysis.

  19. Reduction of Complications of Local Anaesthesia in Dental Healthcare Setups by Application of the Six Sigma Methodology: A Statistical Quality Improvement Technique.

    Science.gov (United States)

    Akifuddin, Syed; Khatoon, Farheen

    2015-12-01

    Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures and to reduce the incidence of complications by introduction of Six Sigma methodology. The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma was used to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures, using failure mode and effect analysis. Pareto analysis was used to identify the most recurring complications. A paired z-test using Minitab statistical inference and Fisher's exact test were used to statistically analyse the obtained data. The Six Sigma improvement methodology in healthcare tends to deliver consistently better results to patients as well as hospitals, and results in better patient compliance and satisfaction.
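
    As a rough illustration of the "Analyse" step described above, a Pareto analysis ranks complication categories by frequency and tracks the cumulative percentage to isolate the "vital few" causes. The complication names and counts below are invented for illustration, not data from the study.

```python
# Hypothetical Pareto analysis of local-anaesthesia complications.
def pareto(counts):
    """Rank categories by frequency and attach cumulative percentages."""
    total = sum(counts.values())
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    table, cum = [], 0.0
    for name, n in ranked:
        cum += 100.0 * n / total
        table.append((name, n, round(cum, 1)))
    return table

# Invented counts -- the "vital few" are the top categories that together
# account for roughly 80% of the observed complications.
complications = {"syncope": 34, "trismus": 5, "haematoma": 12,
                 "paraesthesia": 7, "needle breakage": 2}
for name, n, cum in pareto(complications):
    print(f"{name:16s} {n:3d} {cum:6.1f}%")
```

    With counts like these, the first two categories already cover about three quarters of all cases, which is the kind of prioritization the DMAIC "Improve" phase would act on.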

  20. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematic. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal-to-noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.
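
    The dose-estimation logic behind aberration dosimetry can be sketched as inverting a linear-quadratic calibration curve, Y = c + alpha*D + beta*D^2, where Y is the aberration yield per cell, c the spontaneous background mentioned above, and D the absorbed dose. The coefficients below are illustrative placeholders, not values from the text.

```python
import math

# Illustrative sketch: inverting a linear-quadratic calibration curve
# Y = c + alpha*D + beta*D**2 to estimate dose from an observed
# aberration yield. Coefficient values are invented placeholders.
def dose_from_yield(y, c=0.001, alpha=0.02, beta=0.06):
    """Return the positive root D (in Gy) of c + alpha*D + beta*D**2 = y."""
    disc = alpha ** 2 + 4.0 * beta * (y - c)
    if disc < 0:
        raise ValueError("observed yield is below the background level")
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)

# Round trip: a 1 Gy dose gives Y = 0.001 + 0.02 + 0.06 = 0.081
print(round(dose_from_yield(0.081), 6))
```

    The "signal-to-noise" limitation discussed above enters through the Poisson uncertainty on small observed yields: at low doses the confidence interval on Y, and hence on D, spans the background.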

  1. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematic. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal-to-noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  2. Teaching mathematical word problem solving: the quality of evidence for strategy instruction priming the problem structure.

    Science.gov (United States)

    Jitendra, Asha K; Petersen-Brown, Shawna; Lein, Amy E; Zaslofsky, Anne F; Kunkel, Amy K; Jung, Pyung-Gang; Egan, Andrea M

    2015-01-01

    This study examined the quality of the research base related to strategy instruction priming the underlying mathematical problem structure for students with learning disabilities and those at risk for mathematics difficulties. We evaluated the quality of methodological rigor of 18 group research studies using the criteria proposed by Gersten et al. and 10 single case design (SCD) research studies using criteria suggested by Horner et al. and the What Works Clearinghouse. Results indicated that 14 group design studies met the criteria for high-quality or acceptable research, whereas SCD studies did not meet the standards for an evidence-based practice. Based on these findings, strategy instruction priming the mathematics problem structure is considered an evidence-based practice using only group design methodological criteria. Implications for future research and for practice are discussed. © Hammill Institute on Disabilities 2013.

  3. Impact of a quality improvement program on care and outcomes for children with asthma.

    Science.gov (United States)

    Homer, Charles J; Forbes, Peter; Horvitz, Lisa; Peterson, Laura E; Wypij, David; Heinrich, Patricia

    2005-05-01

    To test a quality improvement intervention: a learning collaborative based on the Institute for Healthcare Improvement's Breakthrough Series methodology, specifically intended to improve care and outcomes for patients with childhood asthma. Randomized trial in primary care practices in greater Boston, Mass, and greater Detroit, Mich. Forty-three practices, with 13 878 pediatric patients with asthma, were randomized to intervention and control groups. Intervention: participation in a learning collaborative project based on the Breakthrough Series methodology of continuous quality improvement. Outcome measures: change from baseline in the proportion of children with persistent asthma who received appropriate medication therapy for asthma, and in the proportion of children whose parent received a written management plan for their child's asthma, as determined by telephone interviews with parents of 631 children. After adjusting for state, practice size, child age, sex, and within-practice clustering, no overall effect of the intervention was found. This methodologically rigorous assessment of a widely used quality improvement technique did not demonstrate a significant effect on processes or outcomes of care for children with asthma. Potential deficiencies in program implementation, project duration, sample selection, and data sources preclude making the general inference that this type of improvement program is ineffective. Additional rigorous studies should be undertaken under more optimal settings to assess the efficacy of this method for improving care.

  4. Looking Inward: Philosophical and Methodological Perspectives on Phenomenological Self-Reflection.

    Science.gov (United States)

    Pool, Natalie M

    2018-07-01

    Engaging in early and ongoing self-reflection during interpretive phenomenological research is critical for ensuring trustworthiness or rigor. However, the lack of guidelines and clarity about the role of self-reflection in this methodology creates both theoretical and procedural confusion. The purpose of this article is to describe key philosophical underpinnings, characteristics, and hallmarks of the process of self-reflection in interpretive phenomenological investigation and to provide a list of guidelines that facilitate this process. Excerpts from an interpretive phenomenological study are used to illustrate characteristics of quality self-reflection. The guidelines are intended to be particularly beneficial for novice researchers who may find self-reflective writing to be daunting and unclear. Facilitating use of self-reflection may strengthen both the interpretive phenomenological body of work as well as that of all qualitative research.

  5. Development of a quality management system (QMS) for borehole investigations. (2) Evaluation of applicability of QMS methodology for the hydrochemical dataset

    International Nuclear Information System (INIS)

    Kunimaru, Takanori; Ota, Kunio; Amano, Kenji; Alexander, W. Russell

    2011-01-01

    An appropriate QMS, which is among the first tools required for repository site characterisation, will save on effort by reducing errors and the requirement to resample and reanalyse - but this can only be guaranteed by continuously assessing if the system is truly fit-for-purpose and amending it as necessary based on the practical experience of the end-users on-site. A QA audit of hydrochemical datasets for boreholes HDB-1-11 from the Horonobe URL project by JAEA has been carried out by the application of a formal QA analysis which is based on the methodology previously employed for groundwaters during the recent site characterisation programme in Sweden. This methodology has been successfully applied to the groundwaters of the fractured crystalline rocks of the Fennoscandian Shield and has now been adapted and applied to some of the ground- and porewaters of the Horonobe URL area. This paper will present this system in the context of the Japanese national programme and elucidate improvements made during hands-on application of the borehole investigation QMS. Further improvements foreseen for the future will also be discussed with a view to removing inter-operator variability as far as possible. Only then can confidence be placed in URL project or repository site hydrochemical datasets. (author)

  6. Rigor or Reliability and Validity in Qualitative Research: Perspectives, Strategies, Reconceptualization, and Recommendations.

    Science.gov (United States)

    Cypress, Brigitte S

    Issues are still raised, even now in the 21st century, by the persistent concern with achieving rigor in qualitative research. There is also a continuing debate about the analogous terms reliability and validity in naturalistic inquiries as opposed to quantitative investigations. This article presents the concept of rigor in qualitative research using a phenomenological study as an exemplar to further illustrate the process. Elaborating on epistemological and theoretical conceptualizations by Lincoln and Guba, strategies congruent with the qualitative perspective for ensuring validity to establish the credibility of the study are described. A synthesis of the historical development of validity criteria evident in the literature over the years is explored. Recommendations are made for use of the term rigor instead of trustworthiness and for the reconceptualization and renewed use of the concepts of reliability and validity in qualitative research; that strategies for ensuring rigor must be built into the qualitative research process rather than evaluated only after the inquiry; and that qualitative researchers and students alike must be proactive and take responsibility for ensuring the rigor of a research study. The insights garnered here should move novice researchers and doctoral students to a better conceptual grasp of the complexity of reliability and validity and their ramifications for qualitative inquiry.

  7. Literacy research methodologies

    CERN Document Server

    Duke, Nell K

    2012-01-01

    The definitive reference on literacy research methods, this book serves as a key resource for researchers and as a text in graduate-level courses. Distinguished scholars clearly describe established and emerging methodologies, discuss the types of questions and claims for which each is best suited, identify standards of quality, and present exemplary studies that illustrate the approaches at their best. The book demonstrates how each mode of inquiry can yield unique insights into literacy learning and teaching and how the methods can work together to move the field forward.   New to This Editi

  8. Innovative Mixed-Methods Research: Moving beyond Design Technicalities to Epistemological and Methodological Realizations

    Science.gov (United States)

    Riazi, A. Mehdi

    2016-01-01

    Mixed-methods research (MMR), as an inter-discourse (quantitative and qualitative) methodology, can provide applied linguistics researchers the opportunity to draw on and integrate the strengths of the two research methodological approaches in favour of making more rigorous inferences about research problems. In this article, the argument is made…

  9. The effect of temperature on the mechanical aspects of rigor mortis in a liquid paraffin model.

    Science.gov (United States)

    Ozawa, Masayoshi; Iwadate, Kimiharu; Matsumoto, Sari; Asakura, Kumiko; Ochiai, Eriko; Maebashi, Kyoko

    2013-11-01

    Rigor mortis is an important phenomenon to estimate the postmortem interval in forensic medicine. Rigor mortis is affected by temperature. We measured stiffness of rat muscles using a liquid paraffin model to monitor the mechanical aspects of rigor mortis at five temperatures (37, 25, 10, 5 and 0°C). At 37, 25 and 10°C, the progression of stiffness was slower in cooler conditions. At 5 and 0°C, the muscle stiffness increased immediately after the muscles were soaked in cooled liquid paraffin and then muscles gradually became rigid without going through a relaxed state. This phenomenon suggests that it is important to be careful when estimating the postmortem interval in cold seasons. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  10. A proposed methodology for impact assessment of air quality traffic-related measures: The case of PM2.5 in Beijing.

    Science.gov (United States)

    Fontes, Tânia; Li, Peilin; Barros, Nelson; Zhao, Pengjun

    2018-08-01

    Air quality traffic-related measures have been implemented worldwide to control the pollution levels of urban areas. Although some of those measures claim environmental improvements, few studies have checked their real impact. In fact, quantitative estimates are often focused on reducing emissions, rather than on evaluating the actual measures' effect on air quality. Even when air quality studies are conducted, results are frequently unclear. In order to properly assess the real impact on air quality of traffic-related measures, a statistical method is proposed. The method compares the pollutant concentration levels observed after the implementation of a measure with the concentration values of the previous year. Short- and long-term impact is assessed considering not only their influence on the average pollutant concentration, but also on its maximum level. To control the effect of the main confounding factors, only the days with similar environmental conditions are analysed. The changeability of the key meteorological variables that affect the transport and dispersion of the pollutant studied is used to identify and group the days categorized as similar. Resemblance of the pollutant concentration of the previous day is also taken into account. The impact of the road traffic measures on the air pollutant concentrations is then checked for those similar days using specific statistical functions. To evaluate the proposed method, the impact on PM2.5 concentrations of two air quality traffic-related measures (M1 and M2) implemented in the city of Beijing is considered: M1 was implemented in 2009, restricting the circulation of yellow-labelled vehicles, while M2 was implemented in 2014, restricting the circulation of heavy-duty vehicles. To compare the results of each measure, a time-period when these measures were not applied is used as case-control. Copyright © 2018 Elsevier Ltd. All rights reserved.
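
    The core of the method, comparing pollutant levels only across days with similar environmental conditions, can be sketched as grouping days into coarse meteorological bins and averaging within-bin differences. The data layout, bin widths, and variable names below are assumptions for illustration, not the study's actual procedure.

```python
# Minimal "similar days" sketch: days are grouped by binned meteorology,
# and pollutant levels before and after a measure are compared only
# within matching bins. Data and bin widths are invented.
def bin_day(day, ws_bin=2.0, t_bin=5.0):
    """Map a day's meteorology to a coarse similarity class."""
    return (int(day["wind_speed"] // ws_bin), int(day["temp"] // t_bin))

def mean_change_on_similar_days(before, after):
    """Average pollutant change across bins observed in both periods."""
    groups = {}
    for d in before:
        groups.setdefault(bin_day(d), [[], []])[0].append(d["pm25"])
    for d in after:
        groups.setdefault(bin_day(d), [[], []])[1].append(d["pm25"])
    diffs = [sum(a) / len(a) - sum(b) / len(b)
             for b, a in groups.values() if b and a]
    return sum(diffs) / len(diffs) if diffs else None
```

    A full assessment would replace the naive bin means with the statistical tests the authors describe and add the previous-day pollutant level as a further matching variable, as the abstract notes.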

  11. Regional Issue Identification and Assessment Program (RIIA). A methodology for analyzing the short-term air quality impacts of new power plants: issue paper 5

    Energy Technology Data Exchange (ETDEWEB)

    Lipfert, F.W.

    1979-07-01

    A simplified methodology is presented, based on Gaussian plume relationships, which could be used to assess regulatory constraints and vegetation damage for new power plants. Data input requirements include: (a) power plant size (MW), (b) fuel type, sulfur content, and level of control, and (c) nearby terrain elevation difference, with respect to stack base, for critical receptors. Based on sample calculations the 24-hour PSD increment is seen to be the most restrictive, using the ASME dispersion coefficients. An 800-MW plant (which is close to the optimum size from cost and reliability considerations according to a recent analysis) could be forced to reduce emissions below the levels assumed in this paper if adverse conditions were encountered. For example, terrain features higher than about 300 m at the critical distance from the plant could be restrictive, as could sufficiently persistent winds that would confine 24-hour plume spreading to less than 10°.
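
    The Gaussian plume relationship underlying such screening estimates gives the ground-level concentration downwind of an elevated point source. A minimal sketch follows, with the dispersion coefficients sigma_y and sigma_z supplied directly; in practice they come from stability-class curves such as the ASME coefficients mentioned above.

```python
import math

# Ground-level Gaussian plume concentration with full ground reflection.
# Q: emission rate (g/s), u: wind speed (m/s), H: effective stack height (m),
# y: crosswind offset (m), sigma_y/sigma_z: dispersion coefficients (m).
def ground_level_conc(Q, u, H, y, sigma_y, sigma_z):
    """Concentration (g/m^3) at ground level (z = 0)."""
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = 2.0 * math.exp(-H ** 2 / (2.0 * sigma_z ** 2))  # reflection term
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

    Scanning the centerline (y = 0) over downwind distance, with sigma_y and sigma_z growing with distance, locates the maximum impact point; this is how a critical receptor distance in a screening methodology like the one above would be found.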

  12. Editorial: Research as Practice: On Critical Methodologies. Qualitative Research in Psychology Sage Publications Ltd., GB ISSN: 1478-0887 Elektronisk ISSN: 1478-0895 FI quality: 2008: 1

    DEFF Research Database (Denmark)

    Nissen, Morten

    2007-01-01

    Table of contents: Motzkau and Jefferson: Research as Practice: On Critical Methodologies (editorial); Jefferson and Huniche: (Re)Searching for persons in practice: Field based methods for critical psychological practice research; Khawaja and Mørck: Researcher positioning - Muslim 'otherness' and beyond; Hasse and Trentemøller: The method of culture contrast; Nissen: Objectification and Prototype; Lee: Researching Children's Diets in England: Critical Methods in a Consumer Society; Nolas: Between the ideal and the real: using ethnography as a way of extending our language of change; Motzkau: The Semiotic of Accusation: Thinking about deconstruction, development, the critique of practice and the practice of critique; Zavos and Biglia: Embodying Feminist Research: learning from action research, political practices and collective knowledge.

  13. Improved robotic stereotactic body radiation therapy plan quality and planning efficacy for organ-confined prostate cancer utilizing overlap-volume histogram-driven planning methodology

    International Nuclear Information System (INIS)

    Wu, Binbin; Pang, Dalong; Lei, Siyuan; Gatti, John; Tong, Michael; McNutt, Todd; Kole, Thomas; Dritschilo, Anatoly; Collins, Sean

    2014-01-01

    Background and purpose: This study aims to determine whether the overlap-volume histogram (OVH)-driven planning methodology can be adapted to robotic SBRT (CyberKnife Robotic Radiosurgery System) to further minimize the bladder and rectal doses achieved in plans manually created by clinical planners. Methods and materials: A database containing clinically-delivered, robotic SBRT plans (7.25 Gy/fraction in 36.25 Gy) of 425 patients with localized prostate cancer was used as a cohort to establish an organ's distance-to-dose model. The OVH-driven planning methodology was refined by adding the PTV volume factor to counter the target's dose fall-off effect and incorporated into Multiplan to automate SBRT planning. For validation, automated plans (APs) for 12 new patients were generated, and their achieved dose/volume values were compared to the corresponding manually-created, clinically-delivered plans (CPs). A two-sided Wilcoxon rank-sum test was used for statistical comparison with a significance level of p < 0.05. Results: PTV V(36.25 Gy) was comparable: 95.6% in CPs compared to 95.1% in APs (p = 0.2). On average, the refined approach lowered V(18.12 Gy) to the bladder and rectum by 8.2% (p < 0.05) and 6.4% (p = 0.14). A physician confirmed that the APs were clinically acceptable. Conclusions: The improvements in APs could further reduce toxicities observed in SBRT for organ-confined prostate cancer.

  14. Mathematical framework for fast and rigorous track fit for the ZEUS detector

    Energy Technology Data Exchange (ETDEWEB)

    Spiridonov, Alexander

    2008-12-15

    In this note we present a mathematical framework for a rigorous approach to a common track fit for trackers located in the inner region of the ZEUS detector. The approach makes use of the Kalman filter and offers a rigorous treatment of magnetic field inhomogeneity, multiple scattering and energy loss. We describe mathematical details of the implementation of the Kalman filter technique with a reduced amount of computations for a cylindrical drift chamber, barrel and forward silicon strip detectors and a forward straw drift chamber. Options with homogeneous and inhomogeneous fields are discussed. The fitting of tracks in one ZEUS event takes about 20 ms on a standard PC. (orig.)
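
    As a toy illustration of the filter at the heart of such a fit, the sketch below runs a Kalman filter over position measurements in one projection, with a (position, slope) state propagated in a field-free straight line between detector planes. Plane positions and noise levels are invented, and multiple-scattering process noise is omitted; the actual ZEUS implementation is far more involved.

```python
# Toy Kalman-filter track fit in one projection.
# State x = (position, slope); each plane measures position only.
def kalman_track_fit(zs, hits, sigma_meas=0.1):
    """Filter hits at plane coordinates zs; return final (position, slope)."""
    x = [hits[0], 0.0]                    # initial state guess
    P = [[1.0, 0.0], [0.0, 1.0]]          # initial covariance
    R = sigma_meas ** 2                   # measurement variance
    for i in range(1, len(zs)):
        dz = zs[i] - zs[i - 1]
        # Predict: straight-line transport, pos -> pos + slope * dz
        x = [x[0] + x[1] * dz, x[1]]
        P = [[P[0][0] + dz * (P[0][1] + P[1][0]) + dz * dz * P[1][1],
              P[0][1] + dz * P[1][1]],
             [P[1][0] + dz * P[1][1], P[1][1]]]
        # Update with the position measurement at this plane (H = [1, 0])
        S = P[0][0] + R
        K = [P[0][0] / S, P[1][0] / S]
        r = hits[i] - x[0]
        x = [x[0] + K[0] * r, x[1] + K[1] * r]
        P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
             [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
    return x
```

    Fed noiseless hits lying on a straight line, the filtered state converges toward the true intercept and slope after a few planes; a smoother pass (as in a full track fit) would propagate that information back to the earlier planes.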

  15. Electrocardiogram artifact caused by rigors mimicking narrow complex tachycardia: a case report.

    Science.gov (United States)

    Matthias, Anne Thushara; Indrakumar, Jegarajah

    2014-02-04

    The electrocardiogram (ECG) is useful in the diagnosis of cardiac and non-cardiac conditions. Rigors due to shivering can cause electrocardiogram artifacts mimicking various cardiac rhythm abnormalities. We describe an 80-year-old Sri Lankan man with an abnormal electrocardiogram mimicking narrow complex tachycardia during the immediate post-operative period. Electrocardiogram changes caused by muscle tremor during rigors could mimic a narrow complex tachycardia. Identification of muscle tremor as a cause of electrocardiogram artifact can avoid unnecessary pharmacological and non-pharmacological intervention to prevent arrhythmias.

  16. Increased scientific rigor will improve reliability of research and effectiveness of management

    Science.gov (United States)

    Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.

    2018-01-01

    Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and

  17. Assessing HIV and AIDS treatment safety and health-related quality of life among cohort of Malaysian patients: a discussion on methodological approach.

    Science.gov (United States)

    Syed, Imran Ahmed; Syed Sulaiman, Syed Azhar; Hassali, Mohammad Azmi; Lee, Christopher K C

    2015-10-01

    Health-related quality of life (HRQoL) is increasingly recognized as an important outcome and as a complement to traditional biological end points of diseases such as mortality. Until a complete cure is available for HIV/AIDS, the development and implementation of a reliable and valid cross-cultural quality of life measure is necessary to assess not only the physical and medical needs of people with HIV/AIDS, but also their psychological, social, environmental, and spiritual areas of life. A qualitative exploration of HIV/AIDS patients' understanding, perceptions and expectations will be carried out through in-depth interviews guided by a semi-structured interview guide, while quantitative assessment of patient-reported adverse drug reactions and their impact on health-related quality of life will be carried out using a data collection tool comprising patient demographics, the SF-12, the Naranjo scale, and a clinical data sheet. The findings may serve as baseline QOL data for people living with HIV/AIDS in Malaysia and also as source data to aid construction of a management plan to improve HIV/AIDS patients' QOL. It will also provide basic information about HIV/AIDS patients' perceptions, expectations and beliefs towards HIV/AIDS and its treatment, which may help in designing strategies to enhance patients' awareness, which in turn can help in addressing issues related to compliance and adherence. © 2013 John Wiley & Sons Ltd.

  18. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 in four top-ranked health services journals were examined. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ(2) (1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ(2) (1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies.
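The comparisons above rest on the Pearson chi-square test for a 2×2 table (components present/absent by article type). As an illustrative sketch only: the study reports percentages and test statistics but not the raw cell counts, so the counts below are hypothetical.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction) for a
    2x2 contingency table [[a, b], [c, d]], e.g. methodological components
    present/absent in two article types."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    # Sum of (observed - expected)^2 / expected over the four cells
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))
```

A balanced table gives a statistic of 0; the resulting value would then be compared against the chi-square distribution with 1 degree of freedom to obtain a p-value.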

  19. Comprehensive Auditing in Nuclear Medicine Through the International Atomic Energy Agency Quality Management Audits in Nuclear Medicine (QUANUM) Program. Part 1: the QUANUM Program and Methodology.

    Science.gov (United States)

    Dondi, Maurizio; Torres, Leonel; Marengo, Mario; Massardo, Teresa; Mishani, Eyal; Van Zyl Ellmann, Annare; Solanki, Kishor; Bischof Delaloye, Angelika; Lobato, Enrique Estrada; Miller, Rodolfo Nunez; Paez, Diana; Pascual, Thomas

    2017-11-01

    An effective management system that integrates quality management is essential for a modern nuclear medicine practice. The Nuclear Medicine and Diagnostic Imaging Section of the International Atomic Energy Agency (IAEA) has the mission of supporting nuclear medicine practice in low- and middle-income countries and of helping them introduce it into their health-care systems where it is not yet present. The experience gathered over several years has shown diversified levels of development and varying degrees of quality of practice, owing in part to limited professional networking and limited or no opportunities for exchange of experiences. Those findings triggered the development of a program named Quality Management Audits in Nuclear Medicine (QUANUM), aimed at raising the standards of nuclear medicine practice in low- and middle-income countries to internationally accepted levels through the introduction of a culture of quality management and systematic auditing programs. QUANUM takes into account the diversity of nuclear medicine services around the world and the multidisciplinary contributions to the practice. Those contributions include clinical, technical, radiopharmaceutical, and medical physics procedures. Aspects of radiation safety and patient protection are also integral to the process. Such an approach ensures consistency in providing safe services of superior quality to patients. The level of conformance is assessed using standards based on publications of the IAEA and the International Commission on Radiological Protection, and guidelines from scientific societies such as the Society of Nuclear Medicine and Molecular Imaging (SNMMI) and the European Association of Nuclear Medicine (EANM). Following QUANUM guidelines and by means of a specific assessment tool developed by the IAEA, auditors, both internal and external, will be able to evaluate the level of conformance. Nonconformances will then be prioritized and recommendations will be provided during an exit briefing.

  20. Experimental evaluation of rigor mortis. VIII. Estimation of time since death by repeated measurements of the intensity of rigor mortis on rats.

    Science.gov (United States)

    Krompecher, T

    1994-10-21

    The development of the intensity of rigor mortis was monitored in nine groups of rats. The measurements were initiated after 2, 4, 5, 6, 8, 12, 15, 24, and 48 h post mortem (p.m.) and lasted 5-9 h, which ideally should correspond to the usual procedure after the discovery of a corpse. The experiments were carried out at an ambient temperature of 24 degrees C. Measurements initiated early after death resulted in curves with a rising portion, a plateau, and a descending slope. Delaying the initial measurement translated into shorter rising portions, and curves initiated 8 h p.m. or later comprised only a plateau and/or a downward slope. Three different phases were observed, suggesting simple rules that can help estimate the time since death: (1) if an increase in intensity was found, the initial measurements were conducted not later than 5 h p.m.; (2) if only a decrease in intensity was observed, the initial measurements were conducted not earlier than 7 h p.m.; and (3) at 24 h p.m., the resolution is complete, and no further changes in intensity should occur. Our results clearly demonstrate that repeated measurements of the intensity of rigor mortis allow a more accurate estimation of the time since death of the experimental animals than the single measurement method used earlier. A critical review of the literature on the estimation of time since death on the basis of objective measurements of the intensity of rigor mortis is also presented.
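The three decision rules above translate directly into a small classifier over a series of intensity readings. A minimal sketch (function name and return strings are illustrative, and the hour bounds apply only to the reported rat data at 24 °C):

```python
def classify_rigor_interval(intensities):
    """Apply the three reported rules to successive rigor mortis intensity
    readings and return a rough bound on the time of the first measurement
    (hours post mortem, p.m.)."""
    rising = any(b > a for a, b in zip(intensities, intensities[1:]))
    falling = any(b < a for a, b in zip(intensities, intensities[1:]))
    if rising:
        # Rule 1: any observed increase means measurements began early
        return "first measurement not later than ~5 h p.m."
    if falling:
        # Rule 2: a pure decrease means measurements began later
        return "first measurement not earlier than ~7 h p.m."
    # Rule 3: no change is consistent with completed resolution
    return "rigor resolved; ~24 h p.m. or later"
```

For example, a series that rises then plateaus falls under rule 1, while a strictly decreasing series falls under rule 2.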

  1. Symptomatic spinal metastasis: A systematic literature review of the preoperative prognostic factors for survival, neurological, functional and quality of life in surgically treated patients and methodological recommendations for prognostic studies.

    Directory of Open Access Journals (Sweden)

    Anick Nater

    While several clinical prediction rules (CPRs) of survival exist for patients with symptomatic spinal metastasis (SSM), these have variable prognostic ability and there is no recognized CPR for health-related quality of life (HRQoL). We undertook a critical appraisal of the literature to identify key preoperative prognostic factors of clinical outcomes in patients with SSM who were treated surgically. The results of this study could be used to modify existing or develop new CPRs. Seven electronic databases were searched (1990-2015), without language restriction, to identify studies that performed multivariate analysis of preoperative predictors of survival, neurological, functional and HRQoL outcomes in surgical patients with SSM. Individual studies were assessed for class of evidence. The strength of the overall body of evidence was evaluated using GRADE for each predictor. Among 4,818 unique citations, 17 were included; all were in English, rated Class III and focused on survival, revealing a total of 46 predictors. The strength of the overall body of evidence was very low for 39 and low for 7 predictors. Due to considerable heterogeneity in patient samples and prognostic factors investigated, as well as several methodological issues, our results had a moderately high risk of bias and were difficult to interpret. The quality of evidence for predictors of survival was, at best, low. We failed to identify studies that evaluated preoperative prognostic factors for neurological, functional, or HRQoL outcomes in surgical patients with SSM. We formulated methodological recommendations for prognostic studies to promote acquiring high-quality evidence to better estimate predictor effect sizes to improve patient education, surgical decision-making and development of CPRs.

  2. Applying Lean Six Sigma methodologies to improve efficiency, timeliness of care, and quality of care in an internal medicine residency clinic.

    Science.gov (United States)

    Fischman, Daniel

    2010-01-01

    Patients' connectedness to their providers has been shown to influence the success of preventive health and disease management programs. Lean Six Sigma methodologies were employed to study workflow processes, patient-physician familiarity, and appointment compliance to improve continuity of care in an internal medicine residency clinic. We used a rapid-cycle test to evaluate proposed improvements to the baseline-identified factors impeding efficient clinic visits. Time-study, no-show, and patient-physician familiarity data were collected to evaluate the effect of interventions to improve clinic efficiency and continuity of medical care. Forty-seven patients were seen in each of the intervention and control groups. The wait duration between the end of triage and the resident-patient encounter was statistically shorter for the intervention group. Trends toward shorter wait times for medical assistant triage and total encounter were also seen in the intervention group. On all measures of connectedness, both the physicians and patients in the intervention group showed a statistically significant increased familiarity with each other. This study shows that incremental changes in workflow processes in a residency clinic can have a significant impact on practice efficiency and adherence to scheduled visits for preventive health care and chronic disease management. This project used a structured "Plan-Do-Study-Act" approach.

  3. Reforming science: methodological and cultural reforms.

    Science.gov (United States)

    Casadevall, Arturo; Fang, Ferric C

    2012-03-01

    Contemporary science has brought about technological advances and an unprecedented understanding of the natural world. However, there are signs of dysfunction in the scientific community as well as threats from diverse antiscience and political forces. Incentives in the current system place scientists under tremendous stress, discourage cooperation, encourage poor scientific practices, and deter new talent from entering the field. It is time for a discussion of how the scientific enterprise can be reformed to become more effective and robust. Serious reform will require more consistent methodological rigor and a transformation of the current hypercompetitive scientific culture.

  4. The MRI-Linear Accelerator Consortium: Evidence-Based Clinical Introduction of an Innovation in Radiation Oncology Connecting Researchers, Methodology, Data Collection, Quality Assurance, and Technical Development.

    Science.gov (United States)

    Kerkmeijer, Linda G W; Fuller, Clifton D; Verkooijen, Helena M; Verheij, Marcel; Choudhury, Ananya; Harrington, Kevin J; Schultz, Chris; Sahgal, Arjun; Frank, Steven J; Goldwein, Joel; Brown, Kevin J; Minsky, Bruce D; van Vulpen, Marco

    2016-01-01

    An international research consortium has been formed to facilitate evidence-based introduction of MR-guided radiotherapy (MR-linac) and to address how the MR-linac could be used to achieve an optimized radiation treatment approach to improve patients' survival, local, and regional tumor control and quality of life. The present paper describes the organizational structure of the clinical part of the MR-linac consortium. Furthermore, it elucidates why collaboration on this large project is necessary, and how a central data registry program will be implemented.

  5. Rigorous lower bound on the dynamic critical exponent of some multilevel Swendsen-Wang algorithms

    International Nuclear Information System (INIS)

    Li, X.; Sokal, A.D.

    1991-01-01

    We prove the rigorous lower bound z_exp ≥ α/ν for the dynamic critical exponent of a broad class of multilevel (or ''multigrid'') variants of the Swendsen-Wang algorithm. This proves that such algorithms do suffer from critical slowing down. We conjecture that such algorithms in fact lie in the same dynamic universality class as the standard Swendsen-Wang algorithm.

  6. Rigorous modelling of light's intensity angular-profile in Abbe refractometers with absorbing homogeneous fluids

    International Nuclear Information System (INIS)

    García-Valenzuela, A; Contreras-Tello, H; Márquez-Islas, R; Sánchez-Pérez, C

    2013-01-01

    We derive an optical model for the light intensity distribution around the critical angle in a standard Abbe refractometer when used on absorbing homogenous fluids. The model is developed using rigorous electromagnetic optics. The obtained formula is very simple and can be used suitably in the analysis and design of optical sensors relying on Abbe type refractometry.

  7. Rigorous approximation of stationary measures and convergence to equilibrium for iterated function systems

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Monge, Maurizio; Nisoli, Isaia

    2016-01-01

    We study the problem of the rigorous computation of the stationary measure and of the rate of convergence to equilibrium of an iterated function system described by a stochastic mixture of two or more dynamical systems that are either all uniformly expanding on the interval or all contracting. In the expanding case, the associated transfer operators satisfy a Lasota–Yorke inequality; we show how to compute a rigorous approximation of the stationary measure in the L^1 norm and an estimate for the rate of convergence. The rigorous computation requires a computer-aided proof of the contraction of the transfer operators for the maps, and we show that this property propagates to the transfer operators of the IFS. In the contracting case we perform a rigorous approximation of the stationary measure in the Wasserstein–Kantorovich distance and rate of convergence, using the same functional analytic approach. We show that a finite computation can produce a realistic computation of all contraction rates for the whole parameter space. We conclude with a description of the implementation and numerical experiments. (paper)
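The certified approach in the paper works through validated bounds on transfer operators. As a loose, non-rigorous illustration of the object being approximated, the stationary measure of a contracting IFS can also be sampled statistically by iterating randomly chosen maps (the "chaos game"); everything here (maps, bin count, step counts) is hypothetical, and this sampler carries no error bound of the kind the paper derives.

```python
import random

def ifs_stationary_histogram(maps, probs, n_steps=100_000,
                             n_bins=10, burn_in=1000, seed=0):
    """Monte Carlo sketch of the stationary measure of an IFS on [0, 1]:
    at each step apply a randomly chosen map and histogram the visited
    points after a burn-in period."""
    rng = random.Random(seed)
    x = rng.random()
    counts = [0] * n_bins
    for step in range(n_steps):
        f = rng.choices(maps, weights=probs)[0]
        x = f(x)
        if step >= burn_in:
            counts[min(int(x * n_bins), n_bins - 1)] += 1
    total = sum(counts)
    return [c / total for c in counts]
```

For the two contractions x → x/2 and x → (x+1)/2 with equal weights, the stationary measure is Lebesgue measure on [0, 1], so every bin of the histogram should come out close to 1/n_bins.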

  8. Double phosphorylation of the myosin regulatory light chain during rigor mortis of bovine Longissimus muscle.

    Science.gov (United States)

    Muroya, Susumu; Ohnishi-Kameyama, Mayumi; Oe, Mika; Nakajima, Ikuyo; Shibata, Masahiro; Chikuni, Koichi

    2007-05-16

    To investigate changes in myosin light chains (MyLCs) during postmortem aging of the bovine longissimus muscle, we performed two-dimensional gel electrophoresis followed by identification with matrix-assisted laser desorption ionization time-of-flight mass spectrometry. The results of fluorescent differential gel electrophoresis showed that two spots of the myosin regulatory light chain (MyLC2) at pI values of 4.6 and 4.7 shifted toward those at pI values of 4.5 and 4.6, respectively, by 24 h postmortem when rigor mortis was completed. Meanwhile, the MyLC1 and MyLC3 spots did not change during the 14 days postmortem. Phosphoprotein-specific staining of the gels demonstrated that the MyLC2 proteins at pI values of 4.5 and 4.6 were phosphorylated. Furthermore, possible N-terminal region peptides containing one and two phosphoserine residues were detected in each mass spectrum of the MyLC2 spots at pI values of 4.5 and 4.6, respectively. These results demonstrated that MyLC2 became doubly phosphorylated during rigor formation of the bovine longissimus, suggesting involvement of the MyLC2 phosphorylation in the progress of beef rigor mortis. Bovine; myosin regulatory light chain (RLC, MyLC2); phosphorylation; rigor mortis; skeletal muscle.

  9. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    Energy Technology Data Exchange (ETDEWEB)

    Botelho, Luiz C.L. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Dept. de Matematica Aplicada]. E-mail: botelho.luiz@superig.com.br

    2008-07-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R^∞, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  10. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    International Nuclear Information System (INIS)

    Botelho, Luiz C.L.

    2008-01-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R ∞ , we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  11. Community historians and the dilemma of rigor vs relevance : A comment on Danziger and van Rappard

    NARCIS (Netherlands)

    Dehue, Trudy

    1998-01-01

    Since the transition from finalism to contextualism, the history of science seems to be caught up in a basic dilemma. Many historians fear that with the new contextualist standards of rigorous historiography, historical research can no longer be relevant to working scientists themselves. The present

  12. Beyond the RCT: Integrating Rigor and Relevance to Evaluate the Outcomes of Domestic Violence Programs

    Science.gov (United States)

    Goodman, Lisa A.; Epstein, Deborah; Sullivan, Cris M.

    2018-01-01

    Programs for domestic violence (DV) victims and their families have grown exponentially over the last four decades. The evidence demonstrating the extent of their effectiveness, however, often has been criticized as stemming from studies lacking scientific rigor. A core reason for this critique is the widespread belief that credible evidence can…

  13. A plea for rigorous conceptual analysis as central method in transnational law design

    NARCIS (Netherlands)

    Rijgersberg, R.; van der Kaaij, H.

    2013-01-01

    Although shared problems are generally easily identified in transnational law design, it is considerably more difficult to design frameworks that transcend the peculiarities of local law in a univocal fashion. The following exposition is a plea for giving more prominence to rigorous conceptual

  14. College Readiness in California: A Look at Rigorous High School Course-Taking

    Science.gov (United States)

    Gao, Niu

    2016-01-01

    Recognizing the educational and economic benefits of a college degree, education policymakers at the federal, state, and local levels have made college preparation a priority. There are many ways to measure college readiness, but one key component is rigorous high school coursework. California has not yet adopted a statewide college readiness…

  15. Setting priorities for ambient air quality objectives

    International Nuclear Information System (INIS)

    2004-10-01

    Alberta has ambient air quality objectives in place for several pollutants, toxic substances and other air quality parameters. A process is in place to determine if additional air quality objectives are required or if existing objectives should be changed. In order to identify the highest priority substances that may require an ambient air quality objective to protect ecosystems and public health, a rigorous, transparent and cost effective priority setting methodology is required. This study reviewed, analyzed and assessed successful priority setting techniques used by other jurisdictions. It proposed an approach for setting ambient air quality objective priorities that integrates the concerns of stakeholders with Alberta Environment requirements. A literature and expert review were used to examine existing priority-setting techniques used by other jurisdictions. An analysis process was developed to identify the strengths and weaknesses of various techniques and their ability to take into account the complete pathway between chemical emissions and damage to human health or the environment. The key strengths and weaknesses of each technique were identified. Based on the analysis, the most promising technique was the tool for the reduction and assessment of chemical and other environmental impacts (TRACI). Several considerations for using TRACI to help set priorities for ambient air quality objectives were also presented. 26 refs, 8 tabs., 4 appendices

  16. Evaluation of physical dimension changes as nondestructive measurements for monitoring rigor mortis development in broiler muscles.

    Science.gov (United States)

    Cavitt, L C; Sams, A R

    2003-07-01

    Studies were conducted to develop a non-destructive method for monitoring the rate of rigor mortis development in poultry and to evaluate the effectiveness of electrical stimulation (ES). In the first study, 36 male broilers in each of two trials were processed at 7 wk of age. After being bled, half of the birds received electrical stimulation (400 to 450 V, 400 to 450 mA, for seven pulses of 2 s on and 1 s off), and the other half were designated as controls. At 0.25 and 1.5 h postmortem (PM), carcasses were evaluated for the angles of the shoulder, elbow, and wing tip and the distance between the elbows. Breast fillets were harvested at 1.5 h PM (after chilling) from all carcasses. Fillet samples were excised and frozen for later measurement of pH and R-value, and the remainder of each fillet was held on ice until 24 h postmortem. Shear value and pH means were significantly lower, but R-value means were higher (P < 0.05), indicating acceleration of rigor mortis by ES. The physical dimensions of the shoulder and elbow changed (P < 0.05) with rigor mortis development and with ES. These results indicate that physical measurements of the wings may be useful as a nondestructive indicator of rigor development and for monitoring the effectiveness of ES. In the second study, 60 male broilers in each of two trials were processed at 7 wk of age. At 0.25, 1.5, 3.0, and 6.0 h PM, carcasses were evaluated for the distance between the elbows. At each time point, breast fillets were harvested from each carcass. Fillet samples were excised and frozen for later measurement of pH and sarcomere length, whereas the remainder of each fillet was held on ice until 24 h PM. Shear value and pH means changed (P < 0.05) with rigor mortis development. Elbow distance decreased (P < 0.05) with rigor development and was correlated (P < 0.05) with rigor mortis development in broiler carcasses.

  17. Developing a set of consensus indicators to support maternity service quality improvement: using Core Outcome Set methodology including a Delphi process.

    Science.gov (United States)

    Bunch, K J; Allin, B; Jolly, M; Hardie, T; Knight, M

    2018-05-16

    To develop a core metric set to monitor the quality of maternity care. Delphi process followed by a face-to-face consensus meeting. English maternity units. Three representative expert panels: service designers, providers and users. Maternity care metrics judged important by participants. Participants were asked to complete a two-phase Delphi process, scoring metrics from existing local maternity dashboards. A consensus meeting discussed the results and re-scored the metrics. In all, 125 distinct metrics across six domains were identified from existing dashboards. Following the consensus meeting, 14 metrics met the inclusion criteria for the final core set: smoking rate at booking; rate of birth without intervention; caesarean section delivery rate in Robson group 1 women; caesarean section delivery rate in Robson group 2 women; caesarean section delivery rate in Robson group 5 women; third- and fourth-degree tear rate among women delivering vaginally; rate of postpartum haemorrhage of ≥1500 ml; rate of successful vaginal birth after a single previous caesarean section; smoking rate at delivery; and proportion of babies born at term with an Apgar score <7. Achieving consensus on core metrics for monitoring the quality of maternity care. © 2018 The Authors. BJOG: An International Journal of Obstetrics and Gynaecology published by John Wiley & Sons Ltd on behalf of Royal College of Obstetricians and Gynaecologists.
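A score-based Delphi inclusion rule of the general kind described can be sketched as below; the 1-9 scale, the score threshold and the agreement level are assumptions for illustration, not the study's actual criteria.

```python
def delphi_consensus(scores_by_metric, threshold=7, agreement=0.7):
    """Illustrative Delphi-style filter: a metric enters the core set when
    at least `agreement` of panellists rate it >= `threshold`."""
    core = []
    for metric, scores in scores_by_metric.items():
        # Fraction of panellists whose score meets the threshold
        if sum(s >= threshold for s in scores) / len(scores) >= agreement:
            core.append(metric)
    return core
```

In practice such a rule would be applied per round, with metrics near the consensus boundary carried forward for re-scoring at the face-to-face meeting.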

  18. A new methodology for cost-effectiveness studies of domestic radon remediation programmes: Quality-adjusted life-years gained within Primary Care Trusts in Central England

    International Nuclear Information System (INIS)

    Coskeran, Thomas; Denman, Antony; Phillips, Paul; Gillmore, Gavin; Tornberg, Roger

    2006-01-01

    Radon is a naturally occurring radioactive gas, high levels of which are associated with geological formations such as those found in Northamptonshire and North Oxfordshire in the UK. The UK's National Radiological Protection Board has designated both districts as radon Affected Areas. Radiation levels due to radon, therefore, exceed 200 Bq m⁻³, the UK's domestic Action Level, in over one percent of domestic properties. Because of radon's radioactivity, exposure to the gas can potentially cause lung cancer, and has been linked to some 2000 deaths a year in the UK. Consequently, when radiation levels exceed the Action Level, remediation against radon's effects is recommended to householders. This study examines the cost-effectiveness of remediation measures in Northamptonshire and North Oxfordshire by estimating cost per quality-adjusted life-year gained in four Primary Care Trusts, organisations that play a key public health policy role in the UK's National Health Service. The study is the first to apply this approach to estimating the cost-effectiveness of radon remediation programmes. Central estimates of cost per quality-adjusted life-year in the four Primary Care Trusts range from £6,143 to £10,323. These values, when assessed against generally accepted criteria, suggest the remediation programmes in the trusts were cost-effective. Policy suggestions based on the estimates, and designed to improve cost-effectiveness further, are proposed for the four Primary Care Trusts and the UK's National Health Service.
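The headline figure is a cost-effectiveness ratio: programme cost divided by quality-adjusted life-years gained, judged against a willingness-to-pay threshold. A minimal sketch of that arithmetic only; the study's own estimates involve discounting and remediation-specific modelling not shown here, and the £20,000/QALY threshold below is an assumption, not taken from the study.

```python
def cost_per_qaly(programme_cost, qalys_gained):
    """Cost-effectiveness ratio: cost (£) per quality-adjusted
    life-year (QALY) gained by the programme."""
    if qalys_gained <= 0:
        raise ValueError("QALYs gained must be positive")
    return programme_cost / qalys_gained

def is_cost_effective(ratio, threshold=20_000):
    """Compare a cost-per-QALY ratio against a willingness-to-pay
    threshold (hypothetical value here)."""
    return ratio <= threshold
```

Ratios below the threshold, like the central estimates reported for the four trusts, would be judged cost-effective under such a criterion.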

  19. Development and Validation of an Agency for Healthcare Research and Quality Indicator for Mortality After Congenital Heart Surgery Harmonized With Risk Adjustment for Congenital Heart Surgery (RACHS-1) Methodology.

    Science.gov (United States)

    Jenkins, Kathy J; Koch Kupiec, Jennifer; Owens, Pamela L; Romano, Patrick S; Geppert, Jeffrey J; Gauvreau, Kimberlee

    2016-05-20

    The National Quality Forum previously approved a quality indicator for mortality after congenital heart surgery developed by the Agency for Healthcare Research and Quality (AHRQ). Several parameters of the validated Risk Adjustment for Congenital Heart Surgery (RACHS-1) method were included, but others differed. As part of the National Quality Forum endorsement maintenance process, developers were asked to harmonize the 2 methodologies. Parameters that were identical between the 2 methods were retained. AHRQ's Healthcare Cost and Utilization Project State Inpatient Databases (SID) 2008 were used to select optimal parameters where differences existed, with a goal to maximize model performance and face validity. Inclusion criteria were not changed and included all discharges for patients <18 years of age. The harmonized model includes procedure risk group, age (0-28 days, 29-90 days, 91-364 days, 1-17 years), low birth weight (500-2499 g), other congenital anomalies (Clinical Classifications Software 217, except for 758.xx), multiple procedures, and transfer-in status. Among 17 945 eligible cases in the SID 2008, the c statistic for model performance was 0.82. In the SID 2013 validation data set, the c statistic was 0.82. Risk-adjusted mortality rates by center ranged from 0.9% to 4.1% (5th-95th percentile). Congenital heart surgery programs can now obtain national benchmarking reports by applying AHRQ Quality Indicator software to hospital administrative data, based on the harmonized RACHS-1 method, with high discrimination and face validity. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
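The c statistic used above to report discrimination is the concordance probability: the chance that a randomly chosen death received a higher predicted risk than a randomly chosen survivor. A pure-Python sketch of the usual pairwise definition (illustrative; not the AHRQ Quality Indicator software):

```python
def c_statistic(predictions, outcomes):
    """Concordance (c) statistic: proportion of (event, non-event) pairs
    in which the event case has the higher predicted risk; ties count 1/2.
    `outcomes` are 1 for events (e.g. death) and 0 for non-events."""
    events = [p for p, y in zip(predictions, outcomes) if y == 1]
    non_events = [p for p, y in zip(predictions, outcomes) if y == 0]
    concordant = ties = 0
    for e in events:
        for n in non_events:
            if e > n:
                concordant += 1
            elif e == n:
                ties += 1
    total = len(events) * len(non_events)
    return (concordant + 0.5 * ties) / total
```

A value of 0.5 indicates no discrimination and 1.0 perfect separation; the 0.82 reported for the harmonized model sits well into the "good discrimination" range.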

  20. Guidance on Conducting and REporting DElphi Studies (CREDES) in palliative care: Recommendations based on a methodological systematic review.

    Science.gov (United States)

    Jünger, Saskia; Payne, Sheila A; Brine, Jenny; Radbruch, Lukas; Brearley, Sarah G

    2017-09-01

    The Delphi technique is widely used for the development of guidance in palliative care, having impact on decisions with relevance for patient care. To systematically examine the application of the Delphi technique for the development of best practice guidelines in palliative care. A methodological systematic review was undertaken using the databases PubMed, CINAHL, Web of Science, Academic Search Complete and EMBASE. Original articles (English language) were included when reporting on empirical studies that had used the Delphi technique to develop guidance for good clinical practice in palliative care. Data extraction included a quality appraisal on the rigour in conduct of the studies and the quality of reporting. A total of 30 empirical studies (1997-2015) were considered for full-text analysis. Considerable differences were identified regarding the rigour of the design and the reporting of essential process and outcome parameters. Furthermore, discrepancies regarding the use of terms for describing the method were observed, for example, concerning the understanding of a 'round' or a 'modified Delphi study'. Substantial variation was found concerning the quality of the study conduct and the transparency of reporting of Delphi studies used for the development of best practice guidance in palliative care. Since credibility of the resulting recommendations depends on the rigorous use of the Delphi technique, there is a need for consistency and quality both in the conduct and reporting of studies. To allow a critical appraisal of the methodology and the resulting guidance, a reporting standard for Conducting and REporting of DElphi Studies (CREDES) is proposed.

  1. Exploring the use of grounded theory as a methodological approach to examine the 'black box' of network leadership in the national quality forum.

    Science.gov (United States)

    Hoflund, A Bryce

    2013-01-01

    This paper describes how grounded theory was used to investigate the "black box" of network leadership in the creation of the National Quality Forum. Scholars are beginning to recognize the importance of network organizations and are in the embryonic stages of collecting and analyzing data about network leadership processes. Grounded theory, with its focus on deriving theory from empirical data, offers researchers a distinctive way of studying little-known phenomena and is therefore well suited to exploring network leadership processes. Specifically, this paper provides an overview of grounded theory, a discussion of the appropriateness of grounded theory to investigating network phenomena, a description of how the research was conducted, and a discussion of the limitations and lessons learned from using this approach.

  2. Using an innovative mixed method methodology to investigate the appropriateness of a quantitative instrument in an African context: Antiretroviral treatment and quality of life.

    Science.gov (United States)

    Greeff, Minrie; Chepuka, Lignet M; Chilemba, Winnie; Chimwaza, Angela F; Kululanga, Lucy I; Kgositau, Mabedi; Manyedi, Eva; Shaibu, Sheila; Wright, Susan C D

    2014-01-01

    The relationship between quality of life (QoL) and antiretroviral treatment (ART) has mainly been studied using quantitative scales that are often not appropriate for use in other contexts and that do not take people's lived experiences into consideration. Sub-Saharan Africa has the highest incidence of HIV and AIDS, yet there is a paucity of research on QoL. This research report gives an account of the use of a mixed method convergent parallel design as a novel approach to evaluating an instrument's context specificity, appropriateness and usefulness in a context other than the one for which it was designed. Data were collected through a qualitative exploration of the experiences of QoL of people living with HIV or AIDS (PLHA) in Africa since being on ART, as well as through quantitative measurements obtained with the HIV/AIDS-targeted quality of life (HAT-QoL) instrument. The study was conducted in three African countries. Permission and ethical approval to conduct the study were obtained. Purposive voluntary sampling was used to recruit PLHA through mediators working in community-based HIV/AIDS organisations and health clinics. Interviews were analysed through open coding, and the quantitative data through descriptive statistics and Cronbach's alpha coefficient. A much wider range and richness of experiences were expressed than is measured by the HAT-QoL instrument. Although an effective instrument for use in the USA, it was found not to be sensitive, appropriate or useful in an African context in its present form. The recommendations focus on adapting the instrument using the data from the in-depth interviews, or on developing a context-sensitive instrument that could measure the QoL of PLHA in Africa.
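    The abstract reports internal consistency via Cronbach's alpha coefficient. As a minimal sketch of that computation (this is the standard textbook formula; the function name and data layout are illustrative, not taken from the study):

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)
```

    When the items are perfectly correlated, alpha evaluates to 1; values near 0 indicate items that do not hang together as a scale.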

  3. From everyday communicative figurations to rigorous audience news repertoires: A mixed method approach to cross-media news consumption

    Directory of Open Access Journals (Sweden)

    Christian Kobbernagel

    2016-06-01

    In the last couple of decades there has been an unprecedented explosion of news media platforms and formats, as a succession of digital and social media have joined the ranks of legacy media. We live in a ‘hybrid media system’ (Chadwick, 2013), in which people build their cross-media news repertoires from the ensemble of old and new media available. This article presents an innovative mixed-method approach, with considerable explanatory power, to the exploration of patterns of news media consumption. The approach tailors Q-methodology in the direction of a qualitative study of news consumption, in which a card-sorting exercise serves to translate the participants’ news media preferences into a form that enables the researcher to undertake a rigorous factor-analytical construction of their news consumption repertoires. This interpretive, factor-analytical procedure, which results in the building of six audience news repertoires in Denmark, also preserves the qualitative thickness of the participants’ verbal accounts of the communicative figurations of their day-in-the-life with the news media.
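    The factor-analytical step that Q-methodology typically starts from correlates participants (not items) and extracts factors from that person-by-person matrix. A minimal numpy sketch of the unrotated extraction (function name and details are illustrative, not taken from the article):

```python
import numpy as np

def q_factor_loadings(q_sorts: np.ndarray, n_factors: int = 2) -> np.ndarray:
    """Unrotated principal-component loadings of participants on factors.

    q_sorts: (n_statements, n_participants) array of card rankings.
    In Q-methodology the *participants* are the variables being correlated.
    """
    corr = np.corrcoef(q_sorts, rowvar=False)        # person-by-person correlations
    eigvals, eigvecs = np.linalg.eigh(corr)          # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:n_factors]    # keep the largest n_factors
    return eigvecs[:, order] * np.sqrt(np.clip(eigvals[order], 0.0, None))
```

    Participants who load highly on the same factor share a news repertoire; in practice a rotation (e.g. varimax) is usually applied to these loadings before interpretation.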

  4. Development of interface between MCNP-FISPACT-MCNP (IPR-MFM) based on rigorous two step method

    International Nuclear Information System (INIS)

    Shaw, A.K.; Swami, H.L.; Danani, C.

    2015-01-01

    In this work we present the development of an interface tool between MCNP-FISPACT-MCNP (MFM), based on the Rigorous Two Step method, for shutdown dose rate (SDDR) calculations. MFM links the MCNP radiation transport code and the FISPACT inventory code through a coupling scheme with three steps. In the first step, it picks the neutron spectrum and total flux from the MCNP output file for use as input parameters for FISPACT. In the second step, it prepares the FISPACT input files using the irradiation history, neutron flux and neutron spectrum, and then executes them. The third step extracts the decay gammas from the FISPACT output file and prepares an MCNP input file for decay gamma transport, followed by execution of that file and estimation of the SDDR. The MFM methodology and flow scheme are detailed here. The programming language PYTHON was chosen for the development of the coupling scheme. A complete MCNP-FISPACT-MCNP loop has been developed to handle simplified geometrical problems. For validation of the MFM interface, a manual cross-check was performed, which shows good agreement. The MFM interface has also been validated against the existing MCNP-D1S method for a simple geometry with a 14 MeV cylindrical neutron source. (author)
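    The three-step coupling can be sketched in Python, the language the authors chose. The file formats below are deliberately simplified stand-ins (the real MCNP tally and FISPACT output formats are far more involved), so every function name and parsing rule here is hypothetical:

```python
def extract_spectrum(tally_lines):
    """Step 1 (sketch): pull (energy, relative flux) pairs from a simplified tally dump."""
    pairs = []
    for line in tally_lines:
        parts = line.split()
        if len(parts) == 2:  # keep only two-column data lines
            pairs.append((float(parts[0]), float(parts[1])))
    return pairs

def fispact_flux_card(spectrum, total_flux):
    """Step 2 (sketch): renormalise the relative spectrum to the absolute total flux."""
    norm = sum(flux for _, flux in spectrum)
    return [(energy, flux / norm * total_flux) for energy, flux in spectrum]

def extract_decay_gammas(fispact_lines, keyword="GAMMA"):
    """Step 3 (sketch): collect decay-gamma source intensities for the photon run."""
    return [float(line.split()[-1]) for line in fispact_lines if line.startswith(keyword)]
```

    A driver script would chain these: run MCNP, parse the spectrum, write and run the FISPACT case, then emit the gamma source definition for the second MCNP transport run.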

  5. [A new formula for the measurement of rigor mortis: the determination of the FRR-index (author's transl)].

    Science.gov (United States)

    Forster, B; Ropohl, D; Raule, P

    1977-07-05

    The manual examination of rigor mortis as currently practised, with its often subjective evaluation, frequently produces highly inaccurate deductions. It is therefore desirable that such inaccuracies be replaced by objective measurement of rigor mortis at the extremities. To that purpose, a method is described which can also be applied in on-the-spot investigations, and a new formula for the determination of rigor mortis indices (FRR) is introduced.

  6. The quality of quality of life publications in the spinal literature: are we getting any better?

    Science.gov (United States)

    Street, John; Lenehan, Brian; Fisher, Charles

    2009-11-01

    Criteria for methodological quality have been widely accepted in many fields of surgical practice; these include the criteria of Velanovich and of Gill and Feinstein. No such analysis of the spine surgery literature has ever been reported. This study is a systematic review of quality of life (QOL) publications to determine whether the recent interest in QOL measurement following spinal surgery has been accompanied by an improvement in the quality of the papers published. The archives of the journals Journal of Neurosurgery: Spine, Spine, Journal of Spinal Disorders & Techniques, European Spine Journal, and The Spine Journal for the years 2000-2004 inclusive were examined, and all publications reporting QOL outcomes were analyzed. Each paper was scored according to the criteria of Velanovich and of Gill and Feinstein, and the methodological quality of these manuscripts, along with any time-dependent changes, was determined. During the study period, the total number of articles published increased by 36%, while the number of QOL articles increased by 102%. According to the Velanovich criteria, there was a statistically significant improvement in the quality of the publications over the study period (p = 0.0394). In 2000, only 27% of outcome measures were disease specific, 77% were valid, and 77% were appropriate for the study design. In 2004, 43% were disease specific, 88% were valid, and 89% were appropriate. In 2000, 53% of studies used appropriate statistical analysis, compared with 100% and 96% for 2003 and 2004, respectively. There was no demonstrable improvement in the fulfillment of the more rigorous Gill and Feinstein criteria for any of the 5 journals over the study period. The authors' study illustrates a moderate improvement in the quality of these publications over the study period, but much methodological improvement is still required.

  7. Measurements of the degree of development of rigor mortis as an indicator of stress in slaughtered pigs.

    Science.gov (United States)

    Warriss, P D; Brown, S N; Knowles, T G

    2003-12-13

    The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.

  8. The use of a lot quality assurance sampling methodology to assess and manage primary health interventions in conflict-affected West Darfur, Sudan.

    Science.gov (United States)

    Pham, Kiemanh; Sharpe, Emily Chambers; Weiss, William M; Vu, Alexander

    2016-01-01

    Organizations working in conflict-affected areas need to monitor and evaluate their programs; however, this is often difficult given the logistical challenges of conflict areas. Lot quality assurance sampling (LQAS) may be a suitable method of assessing programs in these situations. We conducted a secondary data analysis of information collected during Medair's routine program management functions. Medair's service area in West Darfur, Sudan was divided into seven supervisory areas. Using the available population information, a sampling frame was developed, and interviews were conducted with randomly selected caretakers of children in each supervisory area every six months over 19 months. A survey instrument with questions on key indicators for immunizations and maternal, newborn, and child health was used for the interviews. Based on Medair's goals for each indicator, decision rules were calculated; these decision rules determined which supervisory areas and indicators performed adequately in each assessment period. Pearson's chi-squared tests, adjusted for the survey design using STATA "svy: tab" commands, were used to detect overall differences in coverage in this analysis. The coverage of tetanus toxoid vaccination among pregnant women increased from 47.2 to 69.7% (p = 0.046), and births attended by a skilled health professional increased from 35.7 to 52.7% (p = 0.025) from the first to the last assessment periods. Measles vaccinations declined from 72.0 to 54.1% (p = 0.046). The estimated coverage for the proportion of women receiving a postpartum dose of vitamin A (54.7 to 61.3%, p = 0.44), pregnant women receiving a clean delivery kit (54.6 to 47.1%, p = 0.49), and pentavalent vaccinations (49.7 to 42.1%, p = 0.28) did not change significantly. LQAS proved a feasible method for Medair staff to evaluate and optimize primary health programs.
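    The decision rules the abstract mentions come from binomial probabilities. A small sketch of how a standard LQAS decision rule can be derived (this is the generic textbook construction, not Medair's exact procedure; names are illustrative):

```python
from math import comb

def binom_cdf(k: int, n: int, p: float) -> float:
    """P(X <= k) for X ~ Binomial(n, p), computed exactly from the pmf."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_decision_rule(n: int, p_upper: float, alpha: float = 0.10) -> int:
    """Largest pass threshold d such that an area truly at coverage p_upper
    fails (fewer than d successes in n interviews) with probability <= alpha."""
    for d in range(n, -1, -1):
        if binom_cdf(d - 1, n, p_upper) <= alpha:
            return d
    return 0
```

    For the classic sample size of 19 and an 80% coverage benchmark, this reproduces the widely tabulated decision rule of 13: a supervisory area "passes" the indicator if at least 13 of 19 respondents report it.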

  9. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  10. Methodological basis for the optimization of a marine sea-urchin embryo test (SET) for the ecological assessment of coastal water quality.

    Science.gov (United States)

    Saco-Alvarez, Liliana; Durán, Iria; Ignacio Lorenzo, J; Beiras, Ricardo

    2010-05-01

    The sea-urchin embryo test (SET) has frequently been used as a rapid, sensitive, and cost-effective biological tool for marine monitoring worldwide, but the selection of a sensitive, objective, and automatically readable endpoint, a stricter quality control to guarantee optimum handling and biological material, and the identification of confounding factors that interfere with the response have hampered its widespread routine use. Size increase in a minimum of n=30 individuals per replicate, either normal larvae or earlier developmental stages, was preferred as the test endpoint over observer-dependent, discontinuous responses. Control size increase after 48 h incubation at 20 °C must meet an acceptability criterion of 218 µm. In order to avoid false positives, minimums of 32‰ salinity, pH 7 and 2 mg/L oxygen, and a maximum of 40 µg/L NH3 (NOEC), are required in the incubation media. For in situ testing, size increase rates must be corrected on a degree-day basis using 12 °C as the developmental threshold. Copyright 2010 Elsevier Inc. All rights reserved.
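    The acceptability criteria and the degree-day correction quoted above lend themselves to a small QC helper. This is an illustrative sketch of those thresholds, not code from the study (function names are hypothetical):

```python
def control_acceptable(size_increase_um: float) -> bool:
    """Control size increase after 48 h at 20 °C must reach 218 µm."""
    return size_increase_um >= 218.0

def medium_acceptable(salinity_ppt: float, ph: float,
                      oxygen_mg_l: float, nh3_ug_l: float) -> bool:
    """Avoid false positives: salinity >= 32 per mille, pH >= 7,
    oxygen >= 2 mg/L, un-ionised ammonia <= 40 µg/L."""
    return (salinity_ppt >= 32 and ph >= 7
            and oxygen_mg_l >= 2 and nh3_ug_l <= 40)

def degree_day_corrected_rate(size_increase_um: float, hours: float,
                              temp_c: float, threshold_c: float = 12.0) -> float:
    """Growth per degree-day above the 12 °C developmental threshold (in situ tests)."""
    degree_days = max(temp_c - threshold_c, 0.0) * hours / 24.0
    return size_increase_um / degree_days if degree_days > 0 else float("nan")
```

    A replicate would be used for assessment only when both acceptability checks pass; the degree-day rate then puts in situ exposures at different temperatures on a common scale.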

  11. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  12. Formalizing the ISDF Software Development Methodology

    OpenAIRE

    Mihai Liviu DESPA

    2015-01-01

    The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of wha...

  13. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    The paper depicts the ISDF software development methodology, emphasizing quality management and the software development lifecycle. The ISDF methodology was built especially for innovative software development projects and was developed empirically, by trial and error, in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user's, project owner's and project manager's points of view. The main components of a software development methodology are identified: a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation-oriented software development methodology is emphasized by highlighting the shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application, with the ALHPA application used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security; in order to properly assess each quality component, a dedicated indicator is built, and a template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  14. Use of the Rigor Mortis Process as a Tool for Better Understanding of Skeletal Muscle Physiology: Effect of the Ante-Mortem Stress on the Progression of Rigor Mortis in Brook Charr (Salvelinus fontinalis).

    Science.gov (United States)

    Diouf, Boucar; Rioux, Pierre

    1999-01-01

    Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…

  15. Observational methodology in sport sciences

    Directory of Open Access Journals (Sweden)

    M. Teresa Anguera

    2013-11-01

    This paper reviews the conceptual framework, the key literature and the methods (observation tools, such as category systems and field formats, and coding software, etc.) that should be followed when conducting research from the perspective of observational methodology. The observational designs used by the authors’ research group over the last twenty years are discussed, and the procedures for analysing data and assessing their quality are described. Mention is also made of the latest methodological trends in this field, such as the use of mixed methods.

  16. A rigorous pole representation of multilevel cross sections and its practical applications

    International Nuclear Information System (INIS)

    Hwang, R.N.

    1987-01-01

    In this article a rigorous method for representing the multilevel cross sections and its practical applications are described. It is a generalization of the rationale suggested by de Saussure and Perez for the s-wave resonances. A computer code WHOPPER has been developed to convert the Reich-Moore parameters into the pole and residue parameters in momentum space. Sample calculations have been carried out to illustrate that the proposed method preserves the rigor of the Reich-Moore cross sections exactly. An analytical method has been developed to evaluate the pertinent Doppler-broadened line shape functions. A discussion is presented on how to minimize the number of pole parameters so that the existing reactor codes can be best utilized
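    The abstract gives no equations, but the flavour of a pole-residue representation in momentum space can be written schematically as follows (a generic illustrative form, hedged because the actual parameterization in the article may differ):

```latex
% Illustrative momentum-space pole expansion for reaction type x:
% with u = \sqrt{E}, complex poles p_j and residues r_j^{(x)},
\sigma_x(E) \;=\; \frac{1}{E}\,\operatorname{Re}\sum_{j}
   \frac{r_j^{(x)}}{p_j - \sqrt{E}}
```

    The practical appeal of such a form is that Doppler broadening of each simple-pole term reduces to evaluating well-studied complex line-shape functions, which is what the article's broadening treatment exploits.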

  17. Rigorous simulations of a helical core fiber by the use of transformation optics formalism.

    Science.gov (United States)

    Napiorkowski, Maciej; Urbanczyk, Waclaw

    2014-09-22

    We report for the first time on rigorous numerical simulations of a helical-core fiber using a full vectorial method based on the transformation optics formalism. We modeled the dependence of the circular birefringence of the fundamental mode on the helix pitch and analyzed the birefringence increase caused by the mode displacement induced by the core twist. Furthermore, we analyzed the complex field evolution versus helix pitch in the first-order modes, including polarization and intensity distribution. Finally, we show that the rigorous vectorial method predicts the confinement loss of the guided modes better than approximate methods based on equivalent in-plane bending models.

  18. Estimation of the time since death--reconsidering the re-establishment of rigor mortis.

    Science.gov (United States)

    Anders, Sven; Kunz, Michaela; Gehl, Axel; Sehner, Susanne; Raupach, Tobias; Beck-Bornholdt, Hans-Peter

    2013-01-01

    In forensic medicine, there is little defined data on the phenomenon of re-establishment of rigor mortis after mechanical loosening, which is used in establishing time since death in forensic casework and is thought to occur up to 8 h post-mortem. Nevertheless, the method is widely described in textbooks on forensic medicine. We examined 314 joints (elbow and knee) of 79 deceased at defined time points up to 21 h post-mortem (hpm). Data were analysed using a random intercept model. Here, we show that re-establishment occurred in 38.5% of joints at 7.5 to 19 hpm. The maximum time span for the re-establishment of rigor mortis therefore appears to be 2.5-fold longer than previously thought. These findings have a major impact on the estimation of time since death in forensic casework.

  19. Derivation of basic equations for rigorous dynamic simulation of cryogenic distillation column for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Kinoshita, Masahiro; Naruse, Yuji

    1981-08-01

    The basic equations are derived for rigorous dynamic simulation of cryogenic distillation columns for hydrogen isotope separation. The model accounts for such factors as differences in latent heat of vaporization among the six isotopic species of molecular hydrogen, decay heat of tritium, heat transfer through the column wall and nonideality of the solutions. Provision is also made for simulation of columns with multiple feeds and multiple sidestreams. (author)
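    The kind of stage-wise balances such a dynamic model rests on can be sketched as follows (a generic dynamic-distillation component and energy balance, not the authors' exact equations; all symbols are illustrative):

```latex
% Component balance on stage n for isotopic species i (H2, HD, HT, D2, DT, T2):
\frac{d\,(M_n x_{n,i})}{dt}
  = L_{n-1}\,x_{n-1,i} + V_{n+1}\,y_{n+1,i}
    - L_n\,x_{n,i} - V_n\,y_{n,i} + F_n\,z_{n,i}

% Energy balance with liquid/vapour enthalpies h_n, H_n, whose species dependence
% carries the differing latent heats, plus tritium decay heat and wall heat leak:
\frac{d\,(M_n h_n)}{dt}
  = L_{n-1}\,h_{n-1} + V_{n+1}\,H_{n+1}
    - L_n\,h_n - V_n\,H_n + Q_n^{\mathrm{dec}} + Q_n^{\mathrm{wall}}
```

    The factors the abstract lists enter through these terms: species-dependent latent heats and solution nonideality through the enthalpies and vapour-liquid equilibrium, tritium decay heat through $Q_n^{\mathrm{dec}}$, and wall heat transfer through $Q_n^{\mathrm{wall}}$; multiple feeds and sidestreams add further $F_n$ and draw terms.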

  20. Rigor mortis development in turkey breast muscle and the effect of electrical stunning.

    Science.gov (United States)

    Alvarado, C Z; Sams, A R

    2000-11-01

    Rigor mortis development in turkey breast muscle and the effect of electrical stunning on this process are not well characterized. Some electrical stunning procedures have been known to inhibit postmortem (PM) biochemical reactions, thereby delaying the onset of rigor mortis in broilers. Therefore, this study was designed to characterize rigor mortis development in stunned and unstunned turkeys. A total of 154 turkey toms in two trials were conventionally processed at 20 to 22 wk of age. Turkeys were either stunned with a pulsed direct current (500 Hz, 50% duty cycle) at 35 mA (40 V) in a saline bath for 12 seconds or left unstunned as controls. At 15 min and 1, 2, 4, 8, 12, and 24 h PM, pectoralis samples were collected to determine pH, R-value, L* value, sarcomere length, and shear value. In Trial 1, the samples obtained for pH, R-value, and sarcomere length were divided into surface and interior samples. There were no significant differences between the surface and interior samples among any parameters measured. Muscle pH significantly decreased over time in stunned and unstunned birds through 2 h PM. The R-values increased to 8 h PM in unstunned birds and 24 h PM in stunned birds. The L* values increased over time, with no significant differences after 1 h PM for the controls and 2 h PM for the stunned birds. Sarcomere length increased through 2 h PM in the controls and 12 h PM in the stunned fillets. Cooked meat shear values decreased through the 1 h PM deboning time in the control fillets and 2 h PM in the stunned fillets. These results suggest that stunning delayed the development of rigor mortis through 2 h PM, but had no significant effect on the measured parameters at later time points, and that deboning turkey breasts at 2 h PM or later will not significantly impair meat tenderness.