WorldWideScience

Sample records for improvement measurement methodology

  1. IMPROVEMENT OF THE MEASUREMENT METHODOLOGY FOR HORIZONTAL IRREGULARITIES IN PLAN

    Directory of Open Access Journals (Sweden)

    O. M. Patlasov

    2015-08-01

    Full Text Available Purpose. Across the track superstructure (TSS) there are structures for which the standard approach to deciding on their future operation is not entirely correct or acceptable. In particular, this concerns track sections whose geometric parameters change rather quickly: the radius of curvature, the angle of rotation, and the like. Examples of such TSS portions include crossovers, whose connecting part substantially changes curvature over a rather short length. Estimating the position of such a structure in plan on the basis of the existing technique (by the difference in adjacent versines) is virtually impossible. It is therefore proposed to supplement and improve the methodology for assessing the position of a curve in plan based on the difference in adjacent versines. Methodology. The possible options for measuring horizontal curves in plan were analyzed. The most adequate method, which does not contradict the existing one with respect to the possibility of applying established standards, was determined. Ease of measurement and calculation was taken into account. Findings. Qualitative and quantitative verification of the proposed and existing methods showed very good agreement of the measurement results. This gives grounds to assert that the methodology can be recommended to track maintenance workers for assessing horizontal irregularities in plan, not only for curves but also within the connecting part of switch crossovers. Originality. The existing method for evaluating the geometric position of curves in plan was improved. It does not create new regulations; all results are evaluated against existing norms. Practical value. The proposed technique makes it possible, without creating a new regulatory framework, to build on the existing one while expanding the boundaries of its application. This method can be used not only for ordinary curves…
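
    The assessment described in this record rests on the mid-chord offset (versine): on a circular curve of radius R, a chord of length c gives a constant versine f = c²/(8R), so differences between adjacent versines flag irregularities. The following minimal Python sketch illustrates the idea; the chord length, versine values and tolerance are invented, not the paper's data or norms.

      import numpy as np

      # Hypothetical versines (mid-chord offsets) measured at successive
      # stations; on a circular curve of radius R a chord of length c gives
      # a constant versine f = c**2 / (8 * R). All numbers are invented.
      chord = 20.0                                                # m
      versines = np.array([36.0, 37.0, 35.0, 52.0, 36.0, 35.0])  # mm

      # Local radius implied by each versine, R = c**2 / (8 * f):
      radii = chord**2 / (8 * versines / 1000.0)                  # m
      print("implied radii (m):", np.round(radii))

      # Differences of adjacent versines: near zero on a uniform curve,
      # large where the alignment is irregular. Within the connecting part
      # of a crossover the curvature itself changes over a short length,
      # which is why the paper adapts the plain-curve criterion.
      tolerance = 10.0  # mm, placeholder limit, not a normative value
      for i, d in enumerate(np.diff(versines)):
          flag = "CHECK" if abs(d) > tolerance else "ok"
          print(f"stations {i}-{i + 1}: delta versine = {d:+.1f} mm [{flag}]")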

  2. Suspended matter concentrations in coastal waters: Methodological improvements to quantify individual measurement uncertainty

    Science.gov (United States)

    Röttgers, Rüdiger; Heymann, Kerstin; Krasemann, Hajo

    2014-12-01

    Measurements of total suspended matter (TSM) concentration and the discrimination of the particulate inorganic (PIM) and organic matter fractions by loss-on-ignition methods are susceptible to significant and contradictory bias errors from: (a) retention of sea salt in the filter (despite washing with deionized water), and (b) filter material loss during washing and combustion procedures. Several methodological procedures have been described to avoid or correct the errors associated with these biases, but no analysis of the final uncertainty of the overall mass concentration determination has yet been performed. Typically, the exact values of these errors are unknown and can only be estimated. Measurements were performed in coastal and estuarine waters of the German Bight that allowed the individual error for each sample to be determined with respect to a systematic mass offset. This was achieved by using different volumes of the sample and analyzing the mass over volume relationship by linear regression. The results showed that the variation in the mass offset is much larger than expected (mean mass offset: 0.85 ± 0.84 mg, range: -2.4 to 7.5 mg) and that it often leads to rather large relative errors even when TSM concentrations were high. Similarly large variations were found for the mass offset in PIM measurements. Correction with a mean offset determined from procedural control filters reduced the maximum error substantially; with individual offset correction, relative errors of only a few percent were obtained. The approach proposed here can determine the individual determination error for each sample, is independent of bias errors, can be used for TSM and PIM determination, and allows individual quality control for samples from coastal and estuarine waters. It should be possible to use the approach in oceanic or fresh water environments as well. The possibility of individual quality control will allow mass-specific optical properties to be determined with…
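
    The core of the method described here is a linear regression of filter mass gain against filtered volume: the slope estimates the TSM concentration and the intercept estimates the per-sample filter mass offset. A minimal sketch with invented numbers:

      import numpy as np
      from scipy import stats

      # Hypothetical data for one station: several subsamples of different
      # volume filtered onto pre-weighed filters; mass gain in mg.
      volume = np.array([0.25, 0.5, 1.0, 1.5, 2.0])     # L, invented
      mass_gain = np.array([1.9, 3.1, 5.4, 7.6, 10.1])  # mg, invented

      # Linear model: mass_gain = offset + concentration * volume
      fit = stats.linregress(volume, mass_gain)

      print(f"TSM concentration (slope):      {fit.slope:.2f} mg/L")
      print(f"filter mass offset (intercept): {fit.intercept:.2f} mg")
      print(f"standard error of slope:        {fit.stderr:.2f} mg/L")

    Because the offset is estimated per sample rather than taken as a batch mean, its large sample-to-sample variation no longer biases the concentration estimate.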

  3. Methodology of Pilot Performance Measurements

    Directory of Open Access Journals (Sweden)

    Peter Kalavsky

    2017-04-01

    Full Text Available The article is devoted to the development of a methodology for measuring pilot performance under real flight conditions. It provides basic information on a research project carried out to obtain new knowledge regarding the training and education of pilots. The introduction focuses on the analytical part of the project and its outputs in terms of the current state of the art. A detailed view is given of the issue of measuring pilot performance under the specific conditions of the cockpit or the flight simulator. The article then focuses on the two selected and developed methods of measuring pilot performance, in terms of the indicators evaluated, the conditions to be met when conducting the research, and the procedures of the pilot performance measurement methodology.

  4. Improving Learning Outcome Using Six Sigma Methodology

    Science.gov (United States)

    Tetteh, Godson A.

    2015-01-01

    Purpose: The purpose of this research paper is to apply the Six Sigma methodology to identify the attributes of a lecturer that will help improve a student's prior knowledge of a discipline from an initial "x" per cent knowledge to a higher "y" per cent of knowledge. Design/methodology/approach: The data collection method…

  5. IMPROVING THE METHODOLOGY OF FINANCIAL RISK-MANAGEMENT

    OpenAIRE

    E. Kachalova

    2016-01-01

    The article examines vital issues in improving the methodology of financial risk management. The author reveals the economic essence of the concept of «financial risk-management» and outlines methodological approaches for the efficient management of risks within the risk-management system in Russia.

  6. Precision Measurement Physics and Its Methodology

    Institute of Scientific and Technical Information of China (English)

    Chao-hui YE; Jia-ming LI; Jun LUO

    2009-01-01

    Precision Measurement Physics deals with the frontier problems in science, and plays a multi-disciplinary and fundamental role in strongly advancing the sciences. It is well known that an improvement of measuring precision in physics by an order of magnitude often implies a new or unknown effect to be explored, and consequently even a new physical law to be established. There is no doubt that the development of modern physics is closely related to precision measurements. During the past few decades, the methodologies and physics of precision measurements have achieved tremendous breakthroughs, and have therefore extended our understanding of the physical world. In addition, some technologies developed in this process have been applied in our daily life. We can expect Precision Measurement Physics to make more significant progress toward verifying the basic laws of physics, and to find more practical applications in the future.

  7. Using Q Methodology in Quality Improvement Projects.

    Science.gov (United States)

    Tiernon, Paige; Hensel, Desiree; Roy-Ehri, Leah

    Q methodology consists of a philosophical framework and procedures to identify subjective viewpoints that may not be well understood, but its use in nursing is still quite limited. We describe how Q methodology can be used in quality improvement projects to better understand local viewpoints that act as facilitators or barriers to the implementation of evidence-based practice. We describe the use of Q methodology to identify nurses' attitudes about the provision of skin-to-skin care after cesarean birth. Copyright © 2017 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses. Published by Elsevier Inc. All rights reserved.

  8. Productivity improvement of high end CNC machines by DMAIC methodology

    OpenAIRE

    Veeresh Bhusnur; Dr. Bhimasen Soragaon; Hemanth Kumar C

    2017-01-01

    This research mainly emphasizes productivity improvement through the application of DMAIC (Define, Measure, Analyze, Improve, Control), a sub-methodology of Six Sigma. It shows the application of Six Sigma at Auma India Pvt. Ltd. to reduce the cycle times and set-up times of high-end CNC machines. At Auma, one of the most critical problems is that the existing production rate cannot meet customer demands. This work focused on improving the production rate of CNC machines ...

  9. Developing enterprise collaboration: a methodology to implement and improve interoperability

    Science.gov (United States)

    Daclin, Nicolas; Chen, David; Vallespir, Bruno

    2016-06-01

    The aim of this article is to present a methodology for guiding enterprises to implement and improve interoperability. This methodology is based on three components: a framework of interoperability which structures specific solutions of interoperability and is composed of abstraction levels, views and approaches dimensions; a method to measure interoperability including interoperability before (maturity) and during (operational performances) a partnership; and a structured approach defining the steps of the methodology, from the expression of an enterprise's needs to implementation of solutions. The relationship which consistently relates these components forms the methodology and enables developing interoperability in a step-by-step manner. Each component of the methodology and the way it operates is presented. The overall approach is illustrated in a case study example on the basis of a process between a given company and its dealers. Conclusions and future perspectives are given at the end of the article.

  10. On improving research methodology in clinical trials.

    Science.gov (United States)

    Berger, Vance W; Matthews, J Rosser; Grosch, Eric N

    2008-06-01

    Research plays a vital role within biomedicine. Scientifically appropriate research provides a basis for appropriate medical decisions; conversely, inappropriate research may lead to flawed 'best medical practices' which, when followed, contribute to avoidable morbidity and mortality. Although an all-encompassing definition of 'appropriate medical research' is beyond the scope of this article, the concept clearly entails (among other things) that research methods be continually revised and updated as better methods become available. Despite the advent of evidence-based medicine, many research methods have become 'standard' even though there are legitimate scientific reasons to question the conclusions reached by such methods. We first illustrate prominent examples of inappropriate (yet regimented) research methods that are in widespread use. Second, as a way to improve the situation, we suggest a model of research that relies on standardized statistical analyses that individual researchers must consider as a default, but are free to challenge when they can marshal sufficient scientific evidence to demonstrate that the challenge is warranted. Third, we characterize the current system as analogous to 'unnatural selection' in the biological world and argue that our proposed model of research will enable 'natural' to replace 'unnatural' selection in the choice of research methodologies. Given the pervasiveness of inappropriate research methods, we believe that there are strong scientific and ethical reasons to create such a system, which, if properly designed, will both facilitate creativity and ensure methodological rigor while protecting the public at large from the threats posed by poor medical treatment decisions resulting from flawed research methodology.

  11. Improved methodology for generating controlled test atmospheres.

    Science.gov (United States)

    Miller, R R; Letts, R L; Potts, W J; McKenna, M J

    1980-11-01

    Improved methodology has been developed for generating controlled test atmospheres. Vaporization of volatile liquids is accomplished in a 28 mm (O.D.) glass J-tube in conjunction with a compressed air flameless heat torch, a pressure-sensitive switch, and a positive displacement piston pump. The vaporization system has been very reliable with a variety of test materials in studies ranging from a few days to several months. The J-tube vaporization assembly minimizes the possibility of thermal decomposition of the test material and affords a better margin of safety when vaporizing potentially explosive materials.

  12. A methodology for combining multiple commercial data sources to improve measurement of the food and alcohol environment: applications of geographical information systems

    Directory of Open Access Journals (Sweden)

    Dara D. Mendez

    2014-11-01

    Full Text Available Commercial data sources have been increasingly used to measure and locate community resources. We describe a methodology for combining and comparing the differences in commercial data on the food and alcohol environment. We used data from two commercial databases (InfoUSA and Dun&Bradstreet) for 2003 and 2009 to obtain information on food and alcohol establishments, and developed a matching process using computer algorithms and manual review, applying ArcGIS to geocode addresses, standard industrial classification and North American industry classification taxonomy for type of establishment, and establishment name. We constructed population- and area-based density measures (e.g. grocery stores) and assessed differences across data sources, using ArcGIS to map the densities. The matching process resulted in 8,705 and 7,078 unique establishments for 2003 and 2009, respectively. There were more establishments captured in the combined dataset than by relying on one data source alone, and the additional establishments captured ranged from 1,255 to 2,752 in 2009. The correlations for the density measures between the two data sources were highest for alcohol outlets (r = 0.75 and 0.79 for per capita and area, respectively) and lowest for grocery stores/supermarkets (r = 0.32 for both). This process of applying geographical information systems to combine multiple commercial data sources and develop measures of the food and alcohol environment captured more establishments than relying on one data source alone. This replicable methodology was found to be useful for understanding the food and alcohol environment when local or public data are limited.
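
    The matching step combines two establishment lists and counts cross-source matches only once; the study used geocoded addresses (ArcGIS), industry codes, name comparison and manual review. A simplified Python sketch of the name-plus-industry-code part, using only the standard library's difflib for fuzzy name similarity; all establishment records and the population figure are invented:

      import difflib

      infousa = [("JOES GROCERY", "5411"), ("MAIN ST TAVERN", "5813")]
      dnb     = [("JOE'S GROCERY INC", "5411"), ("CORNER LIQUOR", "5921")]

      def normalize(name):
          # Strip punctuation and legal suffixes before comparing names.
          for token in (" INC", " LLC", "'", "."):
              name = name.replace(token, "")
          return name.upper().strip()

      def is_match(a, b, threshold=0.85):
          # Same industry code and highly similar standardized name.
          return a[1] == b[1] and difflib.SequenceMatcher(
              None, normalize(a[0]), normalize(b[0])).ratio() >= threshold

      # Union of the two sources, counting cross-source matches only once.
      combined = list(infousa)
      for est in dnb:
          if not any(is_match(est, have) for have in combined):
              combined.append(est)

      population = 25_000  # hypothetical tract population
      density_per_10k = len(combined) / population * 10_000
      print(f"{len(combined)} unique establishments, "
            f"{density_per_10k:.2f} per 10,000 residents")

    Per-source density measures built this way can then be compared across databases, e.g. with Pearson's r as reported in the abstract.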

  13. Methodological issues in cytokine measurement in schizophrenia

    Directory of Open Access Journals (Sweden)

    Maju Mathew Koola

    2016-01-01

    Full Text Available There is mounting evidence that inflammation is a major factor in the pathophysiology of schizophrenia. Inflammatory status is commonly ascertained by measuring peripheral cytokine concentrations. One issue concerning research on inflammation and schizophrenia relates to assay methodology. Enzyme-linked immunosorbent assay (ELISA) is the most widely used and the gold standard method for measuring cytokine concentrations, but it has a number of limitations. Both ELISA and multiplex assays are limited by their inability to distinguish between bioactive and inactive molecules and by matrix and heterophilic (auto-)antibody interference. Multiplex assays, when combined with gene expression analysis and flow cytometry techniques such as fluorescence-activated cell sorting, may be useful to detect abnormalities in specific immune pathways. Peripheral blood mononuclear cell cultures, used to evaluate in vitro lipopolysaccharide-induced cytokine production, may be a better technology than measuring cytokines in serum. The purpose of this paper is to shed light on major methodological issues that need to be addressed in order to advance the study of cytokines in schizophrenia. We make a few recommendations on how to address these issues.

  14. Towards methodological improvement in the Spanish studies

    Directory of Open Access Journals (Sweden)

    Beatriz Amante García

    2012-09-01

    Full Text Available The European Higher Education Area (EHEA) has triggered many changes in the new degrees at Spanish universities, mainly in terms of methodology and assessment. However, making such changes a success requires coordination within the teaching staff and the use of active methodologies that enhance and encourage students' participation in all the activities carried out in the classroom, above all in formative and summative assessment, where students become responsible for their own learning process (López-Pastor, 2009; Torre, 2008). In this second issue of JOTSE we have included several teaching innovation experiences related to the methodological and assessment changes mentioned above.

  15. How Six Sigma Methodology Improved Doctors' Performance

    Science.gov (United States)

    Zafiropoulos, George

    2015-01-01

    Six Sigma methodology was used in a District General Hospital to assess the effect of the introduction of an educational programme to limit unnecessary admissions. The performance of the doctors involved in the programme was assessed. Ishikawa Fishbone and 5 S's were initially used and Pareto analysis of their findings was performed. The results…

  16. Directional reflectance characterization facility and measurement methodology

    Science.gov (United States)

    McGuckin, B. T.; Haner, D. A.; Menzies, R. T.; Esproles, C.; Brothers, A. M.

    1996-08-01

    A precision reflectance characterization facility, constructed specifically for the measurement of the bidirectional reflectance properties of Spectralon panels planned for use as in-flight calibrators on the NASA Multiangle Imaging Spectroradiometer (MISR) instrument, is described. The incident linearly polarized radiation is provided at three laser wavelengths: 442, 632.8, and 859.9 nm. Each beam is collimated when incident on the Spectralon. The illuminated area of the panel is viewed with a silicon photodetector that revolves around the panel (360°) on a 30-cm boom extending from a common rotational axis. The reflected radiance detector signal is ratioed with the signal from a reference detector to minimize the effect of amplitude instabilities in the laser sources. This and other measures adopted to reduce noise have resulted in a bidirectional reflection function (BRF) calibration facility with a precision of 0.002 for a BRF measurement at the 1σ confidence level. The Spectralon test piece panel is held in a computer-controlled three-axis rotational assembly capable of a full 360° rotation in the horizontal plane and 90° in the vertical. The angular positioning system has a repeatability and resolution of 0.001°. Design details and an outline of the measurement methodology are presented.

  17. On Improving the Experiment Methodology in Pedagogical Research

    Science.gov (United States)

    Horakova, Tereza; Houska, Milan

    2014-01-01

    The paper shows how the methodology for a pedagogical experiment can be improved by including a pre-research stage. If the experiment takes the form of a test procedure, an improvement in methodology can be achieved using, for example, the methods of statistical and didactic analysis of tests that are traditionally used in other areas, i.e.…

  18. A Dynamic Methodology for Improving the Search Experience

    OpenAIRE

    2006-01-01

    In the early years of modern information retrieval, the fundamental way in which we understood and evaluated search performance was by measuring precision and recall. In recent decades, however, models of evaluation have expanded to incorporate the information-seeking task and the quality of its outcome, as well as the value of the information to the user. We have developed a systems engineering-based methodology for improving the whole search experience. The approach focuses on understanding...

  19. Informationally complete measurements from compressed sensing methodology

    Science.gov (United States)

    Kalev, Amir; Riofrio, Carlos; Kosut, Robert; Deutsch, Ivan

    2015-03-01

    Compressed sensing (CS) is a technique to faithfully estimate an unknown signal from relatively few data points when the measurement samples satisfy a restricted isometry property (RIP). Recently this technique has been ported to quantum information science to perform tomography with a substantially reduced number of measurement settings. In this work we show that the constraint that a physical density matrix is positive semidefinite provides a rigorous connection between the RIP and the informational completeness (IC) of a POVM used for state tomography. This enables us to construct IC measurements that are robust to noise using tools provided by the CS methodology. The exact recovery no longer hinges on a particular convex optimization program; solving any optimization, constrained on the cone of positive matrices, effectively results in a CS estimation of the state. From a practical point of view, we can therefore employ fast algorithms developed to handle large dimensional matrices for efficient tomography of quantum states of a large dimensional Hilbert space. Supported by the National Science Foundation.
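
    The practical claim here is that the positive-semidefinite (PSD) constraint alone makes any data-consistent fit behave like a compressed-sensing estimator. A toy single-qubit Python sketch of that idea (not the authors' algorithm): projected gradient descent on a least-squares misfit to simulated Pauli expectation values, projecting onto PSD trace-one matrices each step. The state, data and step size are invented; the trace renormalization used in the projection is a simple heuristic, not the exact simplex projection of eigenvalues.

      import numpy as np

      # Pauli observables for one qubit (a toy measurement setting).
      I2 = np.eye(2)
      X = np.array([[0, 1], [1, 0]], dtype=complex)
      Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
      Z = np.array([[1, 0], [0, -1]], dtype=complex)
      ops = [X, Y, Z]

      # Simulated expectation values of an invented (pure) true state.
      true_rho = np.array([[0.9, 0.3], [0.3, 0.1]], dtype=complex)
      data = np.array([np.trace(O @ true_rho).real for O in ops])

      def project_psd_trace1(rho):
          # Symmetrize, clip negative eigenvalues, renormalize the trace.
          rho = (rho + rho.conj().T) / 2
          w, v = np.linalg.eigh(rho)
          w = np.clip(w, 0, None)
          if w.sum() == 0:
              return I2 / 2
          return (v * (w / w.sum())) @ v.conj().T

      rho = I2 / 2          # start from the maximally mixed state
      for _ in range(500):  # projected gradient on the least-squares misfit
          residual = np.array([np.trace(O @ rho).real for O in ops]) - data
          grad = sum(r * O for r, O in zip(residual, ops))
          rho = project_psd_trace1(rho - 0.1 * grad)

      print(np.round(rho.real, 3))   # converges close to true_rho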

  1. Methodology for high accuracy contact angle measurement.

    Science.gov (United States)

    Kalantarian, A; David, R; Neumann, A W

    2009-12-15

    A new version of axisymmetric drop shape analysis (ADSA) called ADSA-NA (ADSA-no apex) was developed for measuring interfacial properties for drop configurations without an apex. ADSA-NA facilitates contact angle measurements on drops with a capillary protruding into the drop. Thus a much simpler experimental setup, not involving formation of a complete drop from below through a hole in the test surface, may be used. The contact angles of long-chained alkanes on a commercial fluoropolymer, Teflon AF 1600, were measured using the new method. A new numerical scheme was incorporated into the image processing to improve the location of the contact points of the liquid meniscus with the solid substrate to subpixel resolution. The images acquired in the experiments were also analyzed by a different drop shape technique called theoretical image fitting analysis-axisymmetric interfaces (TIFA-AI). The results were compared with literature values obtained by means of the standard ADSA for sessile drops with the apex. Comparison of the results from ADSA-NA with those from TIFA-AI and ADSA reveals that, with different numerical strategies and experimental setups, contact angles can be measured with an accuracy of less than 0.2 degrees. Contact angles and surface tensions measured from drops with no apex, i.e., by means of ADSA-NA and TIFA-AI, were considerably less scattered than those from complete drops with apex. ADSA-NA was also used to explore sources of improvement in contact angle resolution. It was found that using an accurate value of surface tension as an input enhances the accuracy of contact angle measurements.

  2. Defect reduction methodologies: pellicle yield improvement

    Science.gov (United States)

    Daugherty, Susan V.

    1993-03-01

    The pelliclization process at Intel during the first half of 1991 was not in control. Weekly process yield was trending downward, and the range of the weekly yield during that time frame was greater than 40%. A focused effort in process yield improvement, that started in the second half of 1991 and continued through 1992, brought process yield up an average of 20%, and reduced the range of the process yield to 20 - 25%. This paper discusses the continuous process improvement guidelines that are being followed to reduce variations/defects in the pelliclization process. Teamwork tools, such as Pareto charts, fishbone diagrams, and simple experiments, prioritize efforts and help find the root cause of the defects. Best known methods (BKM), monitors, PMs, and excursion control aid in the elimination and prevention of defects. Monitoring progress and repeating the whole procedure are the final two guidelines. The benefits from the use of the continuous process improvement guidelines and tools can be seen in examples of the actions, impacts, and results for the last half of 1991 and the first half of 1992.

  3. PM10 source measurement methodology: Field studies

    Energy Technology Data Exchange (ETDEWEB)

    Farthing, W.E.; Martin, R.S.; Dawes, S.S.; Williamson, A.D.

    1989-05-01

    Two candidate measurement methods, Constant Sampling Rate (CSR) and Exhaust Gas Recycle (EGR), have been developed to measure emissions of in-stack PM-10, particulate matter with aerodynamic diameter less than 10 micrometers. Two field tests were performed at the clinker cooler exhaust of a Portland cement plant to quantify the precision and comparability of these techniques. In addition, accuracy was determined for total particulate measurement by comparison to Method 17. In the first test, collocated CSR and EGR sampling trains were operated in parallel with two Method 17 trains; in the second test, two CSR trains and one EGR train were operated in parallel with two Method 17 trains. The operating procedures used for the CSR and EGR trains are described in detail. In measurements of PM-10 and total particulate matter, the precision of both the CSR and EGR techniques was found to be of the same magnitude as that of Method 17 (approximately 5%). A small bias was found between CSR and EGR PM-10 results (15%) and between EGR and Method 17 total particulate matter (10%). Although small, these observed differences, combined with the results of laboratory studies reported elsewhere, led to a recommendation to increase the length of the sampling nozzles. This modification improved cyclone performance and is incorporated into the nozzle geometries described in the application guides for CSR and EGR.

  4. Application of Action Research Methodology in Improving the ...

    African Journals Online (AJOL)

    Application of Action Research Methodology in Improving the Processing Quality of ... rice in local markets are putting many farmers and women processors out of business. The paper shares results of an action research process that led to the ...

  5. Using Six Sigma and Lean methodologies to improve OR throughput.

    Science.gov (United States)

    Fairbanks, Catharine B

    2007-07-01

    Improving patient flow in the perioperative environment is challenging, but it has positive implications for both staff members and the facility. One facility in Vermont improved patient throughput by incorporating Six Sigma and Lean methodologies for patients undergoing elective procedures. The results of the project were significantly improved patient flow and increased teamwork and pride among perioperative staff members.

  6. Improving Training in Methodology Enriches the Science of Psychology

    Science.gov (United States)

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2009-01-01

    Replies to the comment "Ramifications of increased training in quantitative methodology" by Herbert Zimiles on the current authors' original article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America". The…

  7. Stochastic Optimization of Supply Chain Risk Measures –a Methodology for Improving Supply Security of Subsidized Fuel Oil in Indonesia

    Directory of Open Access Journals (Sweden)

    Adinda Yuanita

    2015-08-01

    Full Text Available Monte Carlo simulation-based methods for the stochastic optimization of risk measures are required to solve complex problems in the supply security of subsidized fuel oil in Indonesia. Distribution of subsidized fuel in Indonesia faces severe constraints: the country has the fourth largest population in the world, more than 250,000,000 people, 66.5% of them of productive age, spread over more than 17,000 islands, with the population centered around the nation's capital. Overcoming these constraints requires a measurable and integrated risk analysis with a monitoring system for the supply security of subsidized fuel. Given the complexity of this issue, uncertainty and probability heavily affected this research. The research therefore performed Monte Carlo sampling-based stochastic simulation optimization with the state-of-the-art "FIRST" parameters, combined with sensitivity analysis, to determine the priority of integrated risk mitigation handling, so that the new model design from this research may yield faster risk mitigation. The results identified innovative ideas of risk-based audit on supply chain risk management and new FIRST (Fairness, Independence, Reliable, Sustainable, Transparent) parameters on risk measures. In addition, the integrated risk analysis confirmed the innovative level of priority through sensitivity analysis. Moreover, the findings showed that the new risk mitigation time was 60% faster than the original risk mitigation time.
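
    The paper's FIRST parameters and model are its own; the generic machinery it relies on, Monte Carlo estimation of a risk measure plus sensitivity analysis to rank mitigations, can be sketched as follows. The stage names, exponential delay distributions and the 20% mitigation figure are all invented for illustration.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_delay(n, port_scale=2.0, sea_scale=1.5, depot_scale=1.0):
          # Total resupply delay (days) = port handling + sea transport +
          # depot handling, each modeled as an exponential term (assumed).
          return (rng.exponential(port_scale, n)
                  + rng.exponential(sea_scale, n)
                  + rng.exponential(depot_scale, n))

      def cvar(samples, alpha=0.95):
          # Expected shortfall: mean of the worst (1 - alpha) tail.
          tail = np.sort(samples)[int(alpha * len(samples)):]
          return tail.mean()

      base = cvar(simulate_delay(100_000))
      print(f"baseline CVaR(95%) of delay: {base:.2f} days")

      # One-at-a-time sensitivity: shrink each stage by 20% and re-estimate,
      # ranking the stages by how much they reduce the risk measure.
      for name, kwargs in [("port",  {"port_scale": 1.6}),
                           ("sea",   {"sea_scale": 1.2}),
                           ("depot", {"depot_scale": 0.8})]:
          reduced = cvar(simulate_delay(100_000, **kwargs))
          print(f"mitigating {name:5s}: CVaR -> {reduced:.2f} "
                f"({base - reduced:+.2f} days)")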

  8. [Improving inpatient pharmacotherapeutic process by Lean Six Sigma methodology].

    Science.gov (United States)

    Font Noguera, I; Fernández Megía, M J; Ferrer Riquelme, A J; Balasch I Parisi, S; Edo Solsona, M D; Poveda Andres, J L

    2013-01-01

    Lean Six Sigma methodology has been used to improve care processes, eliminate waste, reduce costs, and increase patient satisfaction. Objective: to analyse the results obtained with Lean Six Sigma methodology in the diagnosis and improvement of the inpatient pharmacotherapy process during structural and organisational changes in a tertiary hospital. Setting: a 1,000-bed tertiary hospital. Design: prospective observational study. The define, measure, analyse, improve and control (DMAIC) phases were deployed from March to September 2011. An initial project charter was updated as results were obtained. Patients: 131 patients with treatments prescribed within 24h after admission and with 4 drugs. Main outcome measures: safety indicators (medication errors) and efficiency indicators (complaints and time delays). Results: the proportion of patients with a medication error was reduced from 61.0% (25/41 patients) to 55.7% (39/70 patients) in four months. The percentage of errors (relative to the opportunities for error) decreased in the different phases of the process. Prescription: from 5.1% (19/372 opportunities) to 3.3% (19/572 opportunities); preparation: from 2.7% (14/525 opportunities) to 1.3% (11/847 opportunities); and administration: from 4.9% (16/329 opportunities) to 3.0% (13/433 opportunities). Nursing complaints decreased from 10.0% (2119/21038 patients) to 5.7% (1779/31097 patients). The estimated economic impact was 76,800 euros saved. An improvement in the pharmacotherapeutic process and a positive economic impact were observed, as well as enhanced patient safety and efficiency of the organisation. Standardisation and professional training are future Lean Six Sigma candidate projects. Copyright © 2012 SECA. Published by Elsevier España. All rights reserved.
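
    The error percentages quoted are errors per opportunity. A quick arithmetic check using only the counts quoted above, with the relative reduction per phase added:

      phases = {
          "prescription":   ((19, 372), (19, 572)),
          "preparation":    ((14, 525), (11, 847)),
          "administration": ((16, 329), (13, 433)),
      }

      for phase, ((e0, n0), (e1, n1)) in phases.items():
          before, after = e0 / n0, e1 / n1
          drop = (before - after) / before
          print(f"{phase:14s}: {before:.1%} -> {after:.1%} "
                f"(relative reduction {drop:.0%})")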

  9. A Dynamic Methodology for Improving the Search Experience

    Directory of Open Access Journals (Sweden)

    Marcia D. Kerchner

    2006-06-01

    Full Text Available In the early years of modern information retrieval, the fundamental way in which we understood and evaluated search performance was by measuring precision and recall. In recent decades, however, models of evaluation have expanded to incorporate the information-seeking task and the quality of its outcome, as well as the value of the information to the user. We have developed a systems engineering-based methodology for improving the whole search experience. The approach focuses on understanding users’ information-seeking problems, understanding who has the problems, and applying solutions that address these problems. This information is gathered through ongoing analysis of site-usage reports, satisfaction surveys, Help Desk reports, and a working relationship with the business owners.

  10. Measuring to improve.

    Science.gov (United States)

    Klein, R; Bobbitt, M

    1995-01-01

    Rush Prudential Health Plans, a managed care company located in Chicago, Illinois, is implementing a service quality improvement process across the three products it markets in the Chicago area: The Anchor Plan (a primarily staff model HMO), The Affiliates Plan (a network model HMO), and The Plus Plan (a point of service plan). In 1994, the company instituted an annual member satisfaction research study, conducted across the three plans, and began building a link between external customer requirements and internal operations. The research process consisted of three stages: determining external customer requirements, translating these customer-defined "symptoms" into underlying root causes, and developing a service quality improvement action plan. Rush Prudential determined that traditional "report card" surveys would not meet their goals for the information measurement process. A detailed diagnostic telephone survey was used to provide a picture of the entire clinical encounter, from scheduling an appointment through the time a member left the physician's office.

  11. Rating methodological quality: toward improved assessment and investigation.

    Science.gov (United States)

    Moyer, Anne; Finney, John W

    2005-01-01

    Assessing methodological quality is considered essential in deciding what investigations to include in research syntheses and in detecting potential sources of bias in meta-analytic results. Quality assessment is also useful in characterizing the strengths and limitations of the research in an area of study. Although numerous instruments to measure research quality have been developed, they have lacked empirically-supported components. In addition, different summary quality scales have yielded different findings when they were used to weight treatment effect estimates for the same body of research. Suggestions for developing improved quality instruments include: distinguishing distinct domains of quality, such as internal validity, external validity, the completeness of the study report, and adherence to ethical practices; focusing on individual aspects, rather than domains of quality; and focusing on empirically-verified criteria. Other ways to facilitate the constructive use of quality assessment are to improve and standardize the reporting of research investigations, so that the quality of studies can be more equitably and thoroughly compared, and to identify optimal methods for incorporating study quality ratings into meta-analyses.

  12. Sustainable Food Security Measurement: A Systemic Methodology

    Science.gov (United States)

    Findiastuti, W.; Singgih, M. L.; Anityasari, M.

    2017-04-01

    Sustainable food security measures how a region provides food for its people without endangering the environment. In Indonesia, food security is officially measured in the Food Security and Vulnerability Atlas (FSVA). With regard to sustainable food security policy, however, this measurement has not encompassed the environmental aspect, leaving a lack of environmental information for adjusting subsequent strategy. This study aimed to assess sustainable food security by encompassing both the food security and the environmental aspect using systemic eco-efficiency. Starting from the existing indicator of cereal production level, total emission was generated as the environmental indicator by constructing a Causal Loop Diagram (CLD). A stock-flow diagram was then used to develop a systemic simulation model. The model was demonstrated for five Indonesian provinces. The results showed a difference between the food security rankings with and without the environmental aspect.
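
    A stock-flow model of the kind described accumulates production and emission stocks from flow rates and compares them as an eco-efficiency ratio. A minimal Euler-integration sketch; every rate and factor below is invented, not the study's calibration:

      # Minimal stock-flow sketch: cereal production adds to cumulative
      # output, farming adds to cumulative emissions;
      # eco-efficiency = output / emission.
      years = 10
      dt = 1.0

      production_rate = 5.0e6   # tonnes/year (invented)
      emission_factor = 0.4     # t CO2e per tonne of cereal (invented)
      growth = 0.02             # annual production growth (invented)

      output_stock, emission_stock = 0.0, 0.0
      for _ in range(int(years / dt)):
          output_stock += production_rate * dt
          emission_stock += production_rate * emission_factor * dt
          production_rate *= (1 + growth * dt)

      print(f"cumulative output:    {output_stock:.3e} t")
      print(f"cumulative emissions: {emission_stock:.3e} t CO2e")
      print(f"eco-efficiency:       {output_stock / emission_stock:.2f}")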

  13. Guide for prioritizing power plant productivity improvement projects: handbook of availability improvement methodology

    Energy Technology Data Exchange (ETDEWEB)

    1981-09-15

    As part of its program to help improve electrical power plant productivity, the Department of Energy (DOE) has developed a methodology for evaluating productivity improvement projects. This handbook presents a simplified version of this methodology called the Availability Improvement Methodology (AIM), which provides a systematic approach for prioritizing plant improvement projects. Also included in this handbook is a description of data taking requirements necessary to support the AIM methodology, benefit/cost analysis, and root cause analysis for tracing persistent power plant problems. In applying the AIM methodology, utility engineers should be mindful that replacement power costs are frequently greater for forced outages than for planned outages. Equivalent availability includes both. A cost-effective ranking of alternative plant improvement projects must discern between those projects which will reduce forced outages and those which might reduce planned outages. As is the case with any analytical procedure, engineering judgement must be exercised with respect to results of purely mathematical calculations.

  14. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    Energy Technology Data Exchange (ETDEWEB)

    Tarifeño-Saldivia, Ariel, E-mail: atarifeno@cchen.cl, E-mail: atarisal@gmail.com; Pavez, Cristian; Soto, Leopoldo [Comisión Chilena de Energía Nuclear, Casilla 188-D, Santiago (Chile); Center for Research and Applications in Plasma Physics and Pulsed Power, P4, Santiago (Chile); Departamento de Ciencias Fisicas, Facultad de Ciencias Exactas, Universidad Andres Bello, Republica 220, Santiago (Chile); Mayer, Roberto E. [Instituto Balseiro and Centro Atómico Bariloche, Comisión Nacional de Energía Atómica and Universidad Nacional de Cuyo, San Carlos de Bariloche R8402AGP (Argentina)

    2014-01-15

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
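
    A simplified version of the estimation step described here: pulse-mode calibration yields the charge distribution of single neutron events; the burst yield is then the integrated charge divided by the mean single-event charge, with the uncertainty propagated from the single-event charge spread and the Poisson fluctuation of the count. The paper's full statistical model is more detailed; all numbers in this Python sketch are invented.

      import numpy as np

      rng = np.random.default_rng(1)

      # Pulse-mode calibration: charges (pC) of many single-neutron pulses.
      single_event_q = rng.gamma(shape=4.0, scale=0.5, size=5000)  # invented
      q_mean = single_event_q.mean()
      q_std = single_event_q.std(ddof=1)

      # Burst measurement: total accumulated charge when individual pulses
      # pile up and cannot be resolved in time.
      Q_total = 3.7e3  # pC, hypothetical integrated burst charge

      # Estimated number of detected events; variance combines the
      # single-event charge spread with Poisson fluctuation of N itself:
      # Var(N_hat) = N * (1 + (q_std / q_mean)**2).
      N_hat = Q_total / q_mean
      sigma_N = np.sqrt(N_hat * (1 + (q_std / q_mean) ** 2))

      print(f"mean single-event charge:  {q_mean:.2f} pC")
      print(f"estimated detected events: {N_hat:.0f} +/- {sigma_N:.0f}")

    The detected count then converts to a neutron yield through the calibrated detection efficiency of the moderated counter.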

  15. Calibration methodology for proportional counters applied to yield measurements of a neutron burst.

    Science.gov (United States)

    Tarifeño-Saldivia, Ariel; Mayer, Roberto E; Pavez, Cristian; Soto, Leopoldo

    2014-01-01

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.

  16. [Improving practice and organisation of care: methodology for systematic reviews].

    Science.gov (United States)

    Zaugg, Vincent; Savoldelli, Virginie; Sabatier, Brigitte; Durieux, Pierre

    2014-01-01

    The number of intervention studies designed to improve quality of care is increasing exponentially, making it difficult to access all available information on a given subject. Systematic reviews are tools that provide health professionals with comprehensive and objective information. This article describes the main phases of a systematic review: formulating the research question, search and selection of studies, data extraction and analysis, assessment of the methodological quality of studies, and synthesis of the results. Interventions designed to improve professional practices and organisation of care have specific characteristics that determine the methodology of systematic reviews. For example, the often substantial heterogeneity between populations, organisations, and intervention settings among studies must be taken into account, which makes meta-analysis more difficult. Knowledge on specific features of systematic reviews designed to improve quality of care is essential to ensure a good review of the literature, or to evaluate the level of evidence of published systematic reviews.

  17. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  18. Improved rapid prototyping methodology for MPEG-4 IC development

    Science.gov (United States)

    Tang, Clive K. K.; Moseler, Kathy; Levi, Sami

    1998-12-01

    One important factor in deciding the success of a new consumer product or integrated circuit is minimized time-to-market. A rapid prototyping methodology that encompasses algorithm development in the hardware design phase will have great impact on reducing time-to-market. In this paper, a proven hardware design methodology and a novel top-down design methodology based on Frontier Design's DSP Station tool are described. The proven methodology was used during development of the MC149570 H.261/H.263 video codec manufactured by Motorola. This paper discusses an improvement to this method to create an integrated environment for both system and hardware development, thereby further reducing time-to-market. The software tool chosen is the DSP Station tool by Frontier Design. The rich features of the DSP Station tool will be described, and it will then be shown how these features may be useful in designing from algorithm to silicon. How this methodology may be used in the development of a new MPEG-4 Video Communication ASIC will be outlined. A brief comparison with a popular tool, the Signal Processing WorkSystem tool by Cadence, will also be given.

  19. Measuring the effect of improvement in methodological techniques on data collection in the Gharbiah population-based cancer registry in Egypt: Implications for other Low- and Middle-Income Countries.

    Science.gov (United States)

    Smith, Brittney L; Ramadan, Mohamed; Corley, Brittany; Hablas, Ahmed; Seifeldein, Ibrahim A; Soliman, Amr S

    2015-12-01

    The purpose of this study was to describe and quantify procedures and methods that maximized the efficiency of the Gharbiah Population-based Cancer Registry (GPCR), the only population-based cancer registry in Egypt. The procedures and measures included a locally developed software program to translate names from Arabic to English, a new national ID number for demographic and occupational information, and linkage of cancer cases to the new electronic mortality records of the Ministry of Health. Data were compiled from the 34,058 cases in the registry for the years 1999-2007. Cases and registry variables covering demographic and clinical information were reviewed by year to assess trends associated with each new method or procedure during the study period. The introduction of the name translation software, in conjunction with other demographic variables, increased the identification of detected duplicates from 23.4% to 78.1%. Use of the national ID increased the proportion of cases with occupation information from 27% to 89%. Records with complete mortality information increased from 18% to 43%. The proportion of cases identified from death certificates only decreased from 9.8% to 4.7%. Overall, the study revealed that introducing and utilizing local and culture-specific methodological changes, software, and electronic non-cancer databases had a significant impact on data quality and completeness. This study may have translational implications for improving the quality of cancer registries in LMICs, given the emerging advances in electronic databases and the utilization of health software and computerization of data.

  20. DEVELOPMENT OF PETROLEUM RESIDUA SOLUBILITY MEASUREMENT METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    Per Redelius

    2006-03-01

    In the present study an existing spectrophotometry system was upgraded to provide high-resolution ultraviolet (UV), visible (Vis), and near infrared (NIR) analyses of test solutions to measure the relative solubilities of petroleum residua dissolved in eighteen test solvents. Test solutions were prepared by dissolving ten percent petroleum residue in a given test solvent, agitating the mixture, followed by filtration and/or centrifugation to remove insoluble materials. These solutions were finally diluted with a good solvent resulting in a supernatant solution that was analyzed by spectrophotometry to quantify the degree of dissolution of a particular residue in the suite of test solvents that were selected. Results obtained from this approach were compared with spot-test data (to be discussed) obtained from the cosponsor.

  1. Application of Bow-tie methodology to improve patient safety.

    Science.gov (United States)

    Abdi, Zhaleh; Ravaghi, Hamid; Abbasi, Mohsen; Delgoshaei, Bahram; Esfandiari, Somayeh

    2016-05-09

    Purpose - The purpose of this paper is to apply Bow-tie methodology, a proactive risk assessment technique based on a systemic approach, for prospective analysis of the risks threatening patient safety in the intensive care unit (ICU). Design/methodology/approach - Bow-tie methodology was used by a multidisciplinary team to manage clinical risks threatening patient safety in the ICU. The Bow-tie analysis was conducted on incidents related to high-alert medications, ventilator-associated pneumonia, catheter-related blood stream infection, urinary tract infection, and unplanned extubation. Findings - In total, 48 potential adverse events were analysed. The causal factors were identified and classified into relevant categories. The number and effectiveness of existing preventive and protective barriers were examined for each potential adverse event. The adverse events were evaluated against the risk criteria, and a set of interventions was proposed with the aim of improving the existing barriers or implementing new ones. A number of recommendations were implemented in the ICU, taking their feasibility into account. Originality/value - The application of Bow-tie methodology led to practical recommendations to eliminate or control the hazards identified. It also contributed to a better understanding of the hazard prevention and protection required for safe operations in clinical settings.

  2. QUALITY IMPROVEMENT IN MULTIRESPONSE EXPERIMENTS THROUGH ROBUST DESIGN METHODOLOGY

    Directory of Open Access Journals (Sweden)

    M. Shilpa

    2012-06-01

    Full Text Available Robust design methodology aims at reducing the variability of product performance in the presence of noise factors. Experiments involving the simultaneous optimization of more than one quality characteristic are known as multiresponse experiments and are used in the development and improvement of industrial processes and products. In this paper, robust design methodology is applied to optimize the process parameters of a particular operation in a rotary driving shaft manufacturing process. The three important quality characteristics of the shaft considered here are of the types nominal-the-best, smaller-the-better and fraction defective. Simultaneous optimization of these responses is carried out by identifying the control parameters and conducting the experimentation using an L9 orthogonal array.
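
    In Taguchi-style robust design, each L9 run is scored with a signal-to-noise (S/N) ratio matched to the response type; the three types named above map to standard formulas, with fraction defective commonly handled via the omega (p-type) transform. A short sketch with invented replicate data for a single run:

      import numpy as np

      y = np.array([9.8, 10.1, 10.0, 9.9])  # replicates for one run (invented)

      # Smaller-the-better: SN = -10 log10( mean(y^2) )
      sn_smaller = -10 * np.log10(np.mean(y**2))

      # Nominal-the-best: SN = 10 log10( ybar^2 / s^2 )
      sn_nominal = 10 * np.log10(y.mean()**2 / y.var(ddof=1))

      # Fraction defective p (omega transform): SN = -10 log10( p/(1-p) )
      p = 0.03                               # invented defect fraction
      sn_fraction = -10 * np.log10(p / (1 - p))

      print(f"S/N smaller-the-better: {sn_smaller:.2f} dB")
      print(f"S/N nominal-the-best:   {sn_nominal:.2f} dB")
      print(f"S/N fraction defective: {sn_fraction:.2f} dB")

    Each of the nine runs receives an S/N value per response, and the control-parameter levels that maximize S/N (or a combined index across responses) are selected.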

  3. A methodology to measure the degree of managerial innovation

    Directory of Open Access Journals (Sweden)

    Mustafa Batuhan Ayhan

    2014-01-01

    Full Text Available Purpose: The main objective of this study is to introduce the concept of managerial innovation and to propose a quantitative methodology to measure the degree of managerial innovation capability by analyzing the evolution of the techniques used for management functions. Design/methodology/approach: The methodology focuses on the different techniques used for each management function, namely Planning, Organizing, Leading, Controlling and Coordinating. These functions are studied and the different techniques used for them are listed. Since the techniques used for these management functions evolve in time due to technological and social changes, a methodology is required to measure the degree of managerial innovation capability. This capability is measured through an analysis that points out which techniques are used for each of these functions. Findings: To check the validity and applicability of this methodology, it is implemented in a manufacturing company. Based on the results of the implementation, enhancements are suggested to the company for each function so that it can survive in changing managerial conditions. Research limitations/implications: The primary limitation of this study is the implementation area. Although the study was implemented in just a single manufacturing company, the same methodology can be applied to measure the managerial innovation capabilities of other manufacturing companies. Moreover, the model is ready to be adapted to different sectors, although it was mainly prepared for the manufacturing sector. Originality/value: Although innovation management is widely studied, managerial innovation is a new concept, introduced to measure the capability to respond to the changes that occur in managerial functions. In brief, this methodology aims to be a pioneer in the field of managerial innovation regarding the evolution of management functions. It is therefore expected to lead to more studies inspecting the progress of…
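
    One simple way to turn "which technique is used per function" into a score, purely illustrative and not the paper's actual scale: rank each function's current technique against an ordered list of technique generations and average across the five functions. The technique lists and the firm's choices below are invented.

      # Hypothetical technique "generations" per management function,
      # ordered oldest to newest, plus the technique currently in use.
      functions = {
          "planning":     (["budgeting", "MBO", "scenario planning"], "MBO"),
          "organizing":   (["functional", "matrix", "network"], "matrix"),
          "leading":      (["transactional", "transformational"],
                           "transactional"),
          "controlling":  (["inspection", "SPC", "balanced scorecard"],
                           "SPC"),
          "coordinating": (["meetings", "ERP", "collaborative platforms"],
                           "ERP"),
      }

      scores = {}
      for fn, (generations, current) in functions.items():
          # Position of the current technique relative to the newest one.
          scores[fn] = (generations.index(current) + 1) / len(generations)

      degree = sum(scores.values()) / len(scores)
      for fn, s in scores.items():
          print(f"{fn:12s}: {s:.2f}")
      print(f"degree of managerial innovation: {degree:.2f}")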

  4. Measuring Bone Metabolism with Fluoride PET: Methodological Considerations.

    Science.gov (United States)

    Apostolova, Ivayla; Brenner, Winfried

    2010-07-01

    In recent years the more widespread availability of PET systems and the development of hybrid PET/computed tomography (CT) imaging, allowing improved morphologic characterization of sites with increased tracer uptake, have improved the accuracy of diagnosis and strengthened the role of 18F-fluoride PET for quantitative assessment of bone pathology. This article reviews the role of 18F-fluoride PET in the skeleton, with a focus on (1) the underlying physiologic and pathophysiological processes of different conditions of bone metabolism and (2) methodological aspects of the quantitative measurement of 18F-fluoride kinetics. Recent comparative studies have demonstrated that 18F-fluoride PET and, to an even greater extent, PET/CT are more accurate than 99mTc-bisphosphonate single-photon emission CT for the identification of malignant and benign lesions of the skeleton. Quantitative 18F-fluoride PET has been shown to be valuable for direct non-invasive assessment of bone metabolism and for monitoring response to therapy.

  5. Innovative methodologies and technologies for thermal energy release measurement.

    Science.gov (United States)

    Marotta, Enrica; Peluso, Rosario; Avino, Rosario; Belviso, Pasquale; Caliro, Stefano; Carandente, Antonio; Chiodini, Giovanni; Mangiacapra, Annarita; Petrillo, Zaccaria; Sansivero, Fabio; Vilardo, Giuseppe; Marfe, Barbara

    2016-04-01

    Volcanoes exchange heat, gases and other fluids between the interior of the Earth and its atmosphere, influencing processes both at the surface and above it. This work is devoted to improving knowledge of the parameters that control the anomalies in heat flux and chemical species emissions associated with the diffuse degassing processes of volcanic and hydrothermal zones. We are studying and developing innovative medium-range remote sensing technologies to measure the variations through time of heat flux and chemical emissions, in order to sharpen the definition of the activity state of a volcano and allow a better assessment of the related hazard and risk mitigation. The current methodologies used to measure heat flux (i.e. CO2 flux or temperature gradient) are either inefficient or ineffective, and are unable to detect short- to medium-term (days to months) variation trends in the heat flux. Remote sensing of these parameters will allow measurements faster than the already accredited methods, making it both more effective and more efficient in case of emergency, and usable for quick routine monitoring. We are currently developing a method based on drone-borne IR cameras to measure the ground surface temperature which, in a purely conductive regime, is directly correlated to the shallow temperature gradient. The use of flying drones will allow a quick mapping of areas with thermal anomalies and a measurement of their temperature at distances on the order of hundreds of meters. Further development of remote sensing will be pursued through the use, on flying drones, of multispectral and/or hyperspectral sensors and UV scanners, in order to detect the amount of chemical species released into the atmosphere.
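
    In the purely conductive regime the abstract assumes, the link between the shallow temperature gradient and heat flux is Fourier's law, q = k dT/dz. A back-of-envelope sketch with invented values (the conductivity, temperatures and depth below are placeholders, not measurements from this work):

      # Conductive heat flux from a shallow thermal gradient: q = k * dT/dz
      k = 0.8           # W/(m K), invented soil thermal conductivity
      T_surface = 45.0  # deg C, e.g. from a drone-borne IR camera
      T_depth = 60.0    # deg C at 0.3 m depth (invented)
      dz = 0.3          # m

      q = k * (T_depth - T_surface) / dz
      print(f"conductive heat flux: {q:.1f} W/m^2")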

  6. Proposing methodology pattern for measuring public value of IT projects

    Directory of Open Access Journals (Sweden)

    Dinko Kancijan

    2011-06-01

    Full Text Available The assessment of the acceptability and value of IT projects in the public sector, especially when the projects feature qualitative value along with monetary value, is a complex problem. There are, however, methodologies around the world that support organizations in the decision-making process when projects are being chosen. The paper surveys three methodologies for assessing the public value of IT projects: the American Value Measuring Methodology, the French MAREVA, and the German WiBe. A comparison of their approaches to assessing the public value of IT projects was thus made. The Analytic Hierarchy Process, a method for multicriteria analysis of alternatives, is briefly presented. Using a hierarchy criteria model that respects all the basic characteristics such a methodology should feature according to Gartner, a pattern for an IT project public value measurement methodology is proposed. Contributions relative to the surveyed methodology patterns include the screening out of projects with little value contribution compared to the existing situation, the assessment of the acceptability of risk through the hierarchical structure of a project's value, and the aggregation of the values of the separate PVIT dimensions.
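
    The Analytic Hierarchy Process mentioned above derives criteria weights from a pairwise-comparison matrix via its principal eigenvector and checks judgment consistency with a consistency ratio. A minimal Python sketch; the three criteria and the Saaty-scale judgments are invented:

      import numpy as np

      # Pairwise comparisons (Saaty scale) among three value criteria,
      # e.g. financial value, service quality, risk acceptability (invented).
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = int(np.argmax(eigvals.real))
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()

      # Consistency ratio CI / RI, with RI = 0.58 for a 3x3 matrix (Saaty).
      lambda_max = eigvals.real[k]
      ci = (lambda_max - len(A)) / (len(A) - 1)
      cr = ci / 0.58

      print("criteria weights:", np.round(weights, 3))
      print(f"consistency ratio: {cr:.3f} (below 0.10 is acceptable)")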

  7. Bootstrapping Q Methodology to Improve the Understanding of Human Perspectives.

    Science.gov (United States)

    Zabala, Aiora; Pascual, Unai

    2016-01-01

    Q is a semi-qualitative methodology to identify typologies of perspectives. It is appropriate to address questions concerning diverse viewpoints, plurality of discourses, or participation processes across disciplines. Perspectives are interpreted based on rankings of a set of statements. These rankings are analysed using multivariate data reduction techniques in order to find similarities between respondents. Discussing the analytical process and looking for progress in Q methodology is becoming increasingly relevant. While its use is growing in social, health and environmental studies, the analytical process has received little attention in the last decades and it has not benefited from recent statistical and computational advances. Specifically, the standard procedure provides overall and arguably simplistic variability measures for perspectives and none of these measures are associated to individual statements, on which the interpretation is based. This paper presents an innovative approach of bootstrapping Q to obtain additional and more detailed measures of variability, which helps researchers understand better their data and the perspectives therein. This approach provides measures of variability that are specific to each statement and perspective, and additional measures that indicate the degree of certainty with which each respondent relates to each perspective. This supplementary information may add or subtract strength to particular arguments used to describe the perspectives. We illustrate and show the usefulness of this approach with an empirical example. The paper provides full details for other researchers to implement the bootstrap in Q studies with any data collection design.
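
    A minimal sketch of the general idea, assuming respondents are resampled with replacement and the data-reduction step is a plain first principal component (the authors' procedure is more elaborate and Q-specific):

        # Sketch: bootstrap variability of statement scores in a Q-like
        # analysis. Resamples respondents with replacement and recomputes
        # the first principal component; illustrative, not the authors' code.
        import numpy as np

        rng = np.random.default_rng(0)

        def first_component_scores(rankings):
            """Rows = respondents, columns = statements; returns statement
            scores on the first principal component."""
            x = rankings - rankings.mean(axis=0)
            _, _, vt = np.linalg.svd(x, full_matrices=False)
            return vt[0]

        def bootstrap_statement_sd(rankings, n_boot=1000):
            n = rankings.shape[0]
            draws = []
            for _ in range(n_boot):
                sample = rankings[rng.integers(0, n, size=n)]
                v = first_component_scores(sample)
                # Fix sign indeterminacy so bootstrap draws are comparable
                draws.append(v * np.sign(v[np.argmax(np.abs(v))]))
            return np.std(draws, axis=0)  # per-statement variability

        # Hypothetical data: 20 respondents ranking 10 statements from -2 to 2
        data = rng.integers(-2, 3, size=(20, 10)).astype(float)
        print(bootstrap_statement_sd(data))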

  8. A fuzzy QFD methodology to improve logistics service

    Directory of Open Access Journals (Sweden)

    Siamak Noori

    2014-06-01

    Full Text Available Customer service is increasingly being recognized as a source of competitive advantage. The keys to providing effective customer service are determining customer needs accurately and meeting and exceeding those needs in a consistent manner. Companies should adopt a strategic, proactive focus on customer service based on understanding logistics processes and designing the logistics system to meet customer needs. This paper proposes an approach based on quality function deployment (QFD) for ranking strategic actions to improve logistics service. The paper addresses the issue of how to deploy the house of quality (HOQ) to effectively and efficiently improve logistics processes and thus customer satisfaction. For data collection, fuzzy logic is used to deal with the ill-defined nature of the qualitative linguistic judgments required in the proposed HOQ. The methodology has been tested by means of a real case application concerning an Iranian company operating in the manufacturing industry.
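
    As an illustration of how such linguistic judgments are commonly handled in fuzzy QFD work, the sketch below aggregates triangular fuzzy numbers and defuzzifies by centroid; the linguistic scale and weights are hypothetical, not the paper's HOQ data.

        # Sketch: aggregating linguistic judgments as triangular fuzzy
        # numbers (low, mid, high); illustrative only.

        SCALE = {  # hypothetical linguistic scale
            "low":    (0.0, 0.0, 0.3),
            "medium": (0.2, 0.5, 0.8),
            "high":   (0.7, 1.0, 1.0),
        }

        def weighted_fuzzy_mean(judgments, weights):
            """Component-wise weighted mean of triangular fuzzy numbers."""
            total = sum(weights)
            return tuple(
                sum(w * SCALE[j][i] for j, w in zip(judgments, weights)) / total
                for i in range(3)
            )

        def defuzzify(tfn):
            """Centroid of a triangular fuzzy number."""
            return sum(tfn) / 3.0

        # Three experts rate one HOQ relationship, equal weights
        tfn = weighted_fuzzy_mean(["medium", "high", "high"], [1, 1, 1])
        print(tfn, defuzzify(tfn))  # (0.53, 0.83, 0.93), ~0.77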

  9. Reservoir continuous process improvement six sigma methodology implementation

    Energy Technology Data Exchange (ETDEWEB)

    Wannamaker, A.L.

    1996-12-01

    The six sigma methodology adopted by AlliedSignal Inc. for implementing continuous improvement activity was applied to a new manufacturing assignment for Federal Manufacturing & Technologies (FM&T). The responsibility for reservoir development/production was transferred from Rocky Flats to FM&T. Pressure vessel fabrication was new to this facility. No fabrication history for this type of product existed in-house. Statistical tools such as process mapping, failure mode and effects analysis, and design of experiments were used to define and fully characterize the machine processes to be used in reservoir production. Continuous improvement with regard to operating efficiencies and product quality is an ongoing activity at FM&T.

  10. Methodologies for Improved Tag Cloud Generation with Clustering

    DEFF Research Database (Denmark)

    Leginus, Martin; Dolog, Peter; Lage, Ricardo Gomes

    2012-01-01

    Tag clouds are useful means for navigation in the social web systems. Usually the systems implement the tag cloud generation based on tag popularity, which is not always the best method. In this paper we propose methodologies on how to combine clustering into the tag cloud generation to improve coverage and overlap. We study several clustering algorithms to generate tag clouds. We show that by extending cloud generation based on tag popularity with clustering we slightly improve coverage. We also show that if the cloud is generated by clustering independently of the tag popularity baseline we … better on bibsonomy due to its specific focus. The best performing is the hierarchical clustering.

  11. The New Embedded System Design Methodology For Improving Design Process Performance

    CERN Document Server

    Abdurohman, Maman; Sutikno, Sarwono; Sasongko, Arif

    2010-01-01

    Time-to-market pressure and the productivity gap force vendors and researchers to improve embedded system design methodology. The currently used design method, Register Transfer Level (RTL), is no longer adequate for the necessities of embedded system design, and a new methodology is needed to address its shortcomings. In this paper, a new methodology for the hardware embedded system modeling process is designed to improve design process performance using Transaction Level Modeling (TLM). TLM is a design concept at a higher level of abstraction than the RTL model. Parameters measured include design process time and accuracy of design. The RTL models were implemented using the Avalon and Wishbone buses, both System-on-Chip buses. Performance improvement is measured by comparing the TLM and RTL model processes. The experiment results show that performance improvements for Avalon RTL using the new design methodology are 1.03 for 3-tiers, 1.47 for 4-tiers and 1.69 for 5-tiers. Performance improvements for Wishbone RTL are 1.12 for 3-tiers, 1.17 for 4-tiers and 1...

  12. Lean methodology for performance improvement in the trauma discharge process.

    Science.gov (United States)

    O'Mara, Michael Shaymus; Ramaniuk, Aliaksandr; Graymire, Vickie; Rozzell, Monica; Martin, Stacey

    2014-07-01

    High-volume, complex services such as trauma and acute care surgery are at risk for inefficiency. Lean process improvement can reduce health care waste. Lean allows a structured look at processes not easily amenable to analysis. We applied lean methodology to the current state of communication and discharge planning on an urban trauma service, citing areas for improvement. A lean process mapping event was held. The process map was used to identify areas for immediate analysis and intervention, defining metrics for the stakeholders. After intervention, new performance was assessed by direct data evaluation. The process was completed with an analysis of effect, and plans were made for addressing future focus areas. The primary area of concern identified was interservice communication. Changes centering on a standardized morning report structure reduced the proportion of consult questions left unanswered from 67% to 34% (p = 0.0021). Physical therapy rework was reduced from 35% to 19% (p = 0.016). Patients admitted to units not designated to the trauma service had 1.6 times longer stays (p < …). The lean process lasted 8 months, and three areas for new improvement were identified: (1) off-unit patients; (2) patients with length of stay of more than 15 days, who contribute disproportionately to overall length of stay; and (3) miscommunication around patient education at discharge. Lean process improvement is a viable means of health care analysis. When applied to a trauma service with 4,000 admissions annually, lean identifies areas ripe for improvement. Our inefficiencies surrounded communication and patient localization. Strategies arising from the input of all stakeholders led to real solutions for communication through a face-to-face morning report and identified areas for ongoing improvement. This focuses resource use and identifies areas for improvement of throughput in care delivery.

  13. Distributed collaborative team effectiveness: measurement and process improvement

    Science.gov (United States)

    Wheeler, R.; Hihn, J.; Wilkinson, B.

    2002-01-01

    This paper describes a measurement methodology developed for assessing the readiness, and identifying opportunities for improving the effectiveness, of distributed collaborative design teams preparing to conduct a concurrent design session.

  14. Advanced quantitative measurement methodology in physics education research

    Science.gov (United States)

    Wang, Jing

    The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods and the interpretation of the results obtained by these methods should be connected to the educational background. In this connecting process, issues of educational models are often raised. Many widely used statistical methods do not make assumptions about the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are also other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumptions and parameter estimation, and are complicated mathematically. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods under physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. The dissertation includes three

  15. A new methodology to measure the running biomechanics of amputees.

    Science.gov (United States)

    Wilson, James Richard; Asfour, Shihab; Abdelrahman, Khaled Zakaria; Gailey, Robert

    2009-09-01

    We present a new methodology to measure the running biomechanics of amputees. This methodology combines the use of a spring-mass model and a symmetry index, two standard techniques in the biomechanics literature, but not yet used in concert to evaluate amputee biomechanics. The methodology was examined in the context of a pilot study of two transtibial amputee sprinters and showed biomechanically quantifiable changes for small adjustments in prosthetic prescription. Vertical ground reaction forces were measured in several trials for two transtibial amputees running at constant speed. A spring-mass model was used in conjunction with a symmetry index to observe the effect of varying prosthetic height and stiffness on running biomechanics. All spring-mass variables were significantly affected by changes in prosthetic prescription among the two subjects tested (p < 0.05). When prosthetic height was changed, both subjects showed significant differences in Δy(max), Δl and contact time (t(c)) on the prosthetic limb and in k(vert) and k(leg) on the sound limb. The symmetry indices calculated for spring-mass variables were all significantly affected by changes in prosthetic prescription for the male subject, and all but that for peak force (F(peak)) for the female subject. This methodology is a straightforward tool for evaluating the effect of changes to prosthetic prescription.
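
    The two ingredients named here are standard in the running literature: a vertical stiffness from the spring-mass model and a between-limb symmetry index. A sketch of how they might be computed, with all parameter values hypothetical rather than the study's data:

        # Sketch: vertical stiffness (spring-mass model) and a common
        # between-limb symmetry index; values are hypothetical.

        def vertical_stiffness(f_peak_n, delta_y_max_m):
            """k_vert = F_peak / Δy_max, in N/m."""
            return f_peak_n / delta_y_max_m

        def symmetry_index(x_prosthetic, x_sound):
            """Percent asymmetry; 0 means perfect symmetry."""
            return 100.0 * (x_prosthetic - x_sound) / (0.5 * (x_prosthetic + x_sound))

        k_pros = vertical_stiffness(1800.0, 0.045)   # prosthetic limb
        k_sound = vertical_stiffness(2000.0, 0.040)  # sound limb
        print(k_pros, k_sound, symmetry_index(k_pros, k_sound))  # ~ -22%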

  16. Measurement of Overall Performance Effectiveness in Setup Improvement

    Directory of Open Access Journals (Sweden)

    Shye-Nee Low

    2014-01-01

    Full Text Available This study aimed to improve the setup process of injection molding machines by using a developed setup improvement methodology. Overall performance effectiveness (OPE) was used to evaluate the setup improvement. The developed setup improvement methodology was tested in a case study. A 50.1% reduction in setup time was attained by the developed methodology, and significant time savings were achieved with minimum investment. Comparisons between before and after the improvement implementation were conducted through OPE to verify the improvement. In terms of OPE, the setup performance in the case study reached an acceptable value of 60.45%. The setup process performance of the developed setup improvement methodology was judged in terms of effectiveness. Results therefore indicate that OPE measurement is an effective way to analyze the efficiency of a single setup process.

  17. Improved Methodology for Parameter Inference in Nonlinear, Hydrologic Regression Models

    Science.gov (United States)

    Bates, Bryson C.

    1992-01-01

    A new method is developed for the construction of reliable marginal confidence intervals and joint confidence regions for the parameters of nonlinear, hydrologic regression models. A parameter power transformation is combined with measures of the asymptotic bias and asymptotic skewness of maximum likelihood estimators to determine the transformation constants which cause the bias or skewness to vanish. These optimized constants are used to construct confidence intervals and regions for the transformed model parameters using linear regression theory. The resulting confidence intervals and regions can be easily mapped into the original parameter space to give close approximations to likelihood method confidence intervals and regions for the model parameters. Unlike many other approaches to parameter transformation, the procedure does not use a grid search to find the optimal transformation constants. An example involving the fitting of the Michaelis-Menten model to velocity-discharge data from an Australian gauging station is used to illustrate the usefulness of the methodology.
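
    For orientation, a sketch of the baseline that such transformation-based intervals improve upon: an ordinary nonlinear fit of the Michaelis-Menten model with naive asymptotic intervals from the estimated covariance. The data here are synthetic, and the intervals shown are the simple linearized ones, not the paper's optimized-transformation intervals.

        # Sketch: fitting v = Vmax * q / (K + q) to velocity-discharge style
        # data and reporting naive asymptotic 95% intervals; synthetic data.
        import numpy as np
        from scipy.optimize import curve_fit

        def michaelis_menten(q, vmax, k):
            return vmax * q / (k + q)

        rng = np.random.default_rng(1)
        q = np.linspace(1.0, 50.0, 25)
        v = michaelis_menten(q, 2.5, 8.0) + rng.normal(0, 0.05, q.size)

        popt, pcov = curve_fit(michaelis_menten, q, v, p0=[1.0, 1.0])
        se = np.sqrt(np.diag(pcov))
        for name, est, s in zip(["Vmax", "K"], popt, se):
            print(f"{name}: {est:.3f} +/- {1.96 * s:.3f} (approx 95% CI)")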

  18. Using Lean Six Sigma Methodology to Improve Quality of the Anesthesia Supply Chain in a Pediatric Hospital.

    Science.gov (United States)

    Roberts, Renée J; Wilson, Ashley E; Quezado, Zenaide

    2017-03-01

    Six Sigma and Lean methodologies are effective quality improvement tools in many health care settings. We applied the DMAIC methodology (define, measure, analyze, improve, control) to address deficiencies in our pediatric anesthesia supply chain. We defined supply chain problems by mapping existing processes and soliciting comments from those involved. We used daily distance walked by anesthesia technicians and number of callouts for missing supplies as measurements that we analyzed before and after implementing improvements (anesthesia cart redesign). We showed improvement in the metrics after those interventions were implemented, and those improvements were sustained and thus controlled 1 year after implementation.

  19. Methodology matters: measuring urban spatial development using alternative methods

    OpenAIRE

    Daniel E Orenstein; Amnon Frenkel; Faris Jahshan

    2014-01-01

    The effectiveness of policies implemented to prevent urban sprawl has been a contentious issue among scholars and practitioners for at least two decades. While disputes range from the ideological to the empirical, regardless of the subject of dispute, participants must bring forth reliable data to buttress their claims. In this study we discuss several sources of complexity inherent in measuring sprawl. We then exhibit how methodological decisions can lead to disparate results regarding the q...

  20. A symbolic methodology to improve disassembly process design.

    Science.gov (United States)

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  1. Bone drilling methodology and tool based on position measurements.

    Science.gov (United States)

    Díaz, Iñaki; Gil, Jorge Juan; Louredo, Marcos

    2013-11-01

    Bone drilling, despite being a very common procedure in hospitals around the world, becomes very challenging when performed close to organs such as the cochlea or when depth control is critical for avoiding damage to surrounding tissue. To date, several mechatronic prototypes have been proposed to assist surgeons by automatically detecting bone layer transitions and breakthroughs. However, none of them is currently accurate enough to be part of the surgeon's standard equipment. The present paper shows a test bench specially designed to evaluate prior methodologies and analyze their drawbacks. Afterward, a new layer detection methodology with improved performance is described and tested. Finally, the prototype of a portable mechatronic bone drill that takes advantage of the proposed detection algorithm is presented.

  2. THE QUALITY IMPROVEMENT OF PRIMER PACKAGING PROCESS USING SIX SIGMA METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Prima Ditahardiyani

    2008-01-01

    Full Text Available The implementation of Six Sigma has become a common theme in many organizations. This paper presents the Six Sigma methodology and its implementation in the primer packaging process of a Cranberry drink. The DMAIC (Define, Measure, Analyze, Improve and Control) approach is used to analyze and to improve the primer packaging process, which had high variability and defect output. After the improvement, the results showed an increased sigma level. However, the increase is not significant and world-standard quality has not yet been achieved. Therefore, the implementation of Six Sigma in the primer packaging process of the Cranberry drink still leaves room for further research.
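
    By way of illustration, the sigma level referred to above is conventionally derived from the defect rate with the customary 1.5-sigma shift; a sketch with hypothetical before/after defect counts (this convention is standard Six Sigma practice, not a value from the paper):

        # Sketch: long-term sigma level from a defect rate, using the
        # conventional 1.5-sigma shift; counts are hypothetical.
        from scipy.stats import norm

        def sigma_level(defects, opportunities):
            dpmo = 1e6 * defects / opportunities
            return norm.ppf(1 - dpmo / 1e6) + 1.5

        print(sigma_level(1500, 100000))  # ~3.67 before improvement
        print(sigma_level(400, 100000))   # ~4.15 after improvement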

  3. Lean methodology: an evidence-based practice approach for healthcare improvement.

    Science.gov (United States)

    Johnson, Pauline M; Patterson, Claire J; O'Connell, Mary P

    2013-12-10

    Lean methodology, an evidence-based practice approach adopted from Toyota, is grounded on the pillars of respect for people and continuous improvement. This article describes the use of Lean methodology to improve healthcare outcomes for patients with community-acquired pneumonia. Nurse practitioners and other clinicians should be knowledgeable about this methodology and become leaders in Lean transformation.

  4. Methodological proposal for the definition of improvement strategies in logistics of SME

    National Research Council Canada - National Science Library

    Yeimy Liseth Becerra

    2014-01-01

    A methodological proposal for defining strategies of improvement in logistics of SMEs is presented as a means to fulfill a specific objective of the project Methodological design on storage logistics...

  5. Spectral reflectance measurement methodologies for TUZ Golu field campaign

    CSIR Research Space (South Africa)

    Boucher, Y

    2011-07-01

    Full Text Available MEASUREMENT METHODOLOGIES FOR TUZ GOLU FIELD CAMPAIGN. Y. Boucher, F. Viallefont, A. Deadman, N. Fox, I. Behnert, D. Griffith, P. Harris, D. Helder, E. Knaeps, L. Leigh, Y. Li, H. Ozen, F. Ponzoni, S. Sterckx (affiliations include Onera - The French Aerospace... ... A uncertainty [4]. Thus, the variation of the reflectance between the different points is a combination of the variation at small scale and at the scale of the sampling grid, typically between 20 m and 40 m. This strategy has been chosen by Onera...

  6. Measurements of Gluconeogenesis and Glycogenolysis: A Methodological Review.

    Science.gov (United States)

    Chung, Stephanie T; Chacko, Shaji K; Sunehag, Agneta L; Haymond, Morey W

    2015-12-01

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to determine gluconeogenesis is by measuring the incorporation of deuterium from the body water pool into newly formed glucose. However, several techniques using radioactive and stable-labeled isotopes have been used to quantitate the contribution and regulation of gluconeogenesis in humans. Each method has its advantages, methodological assumptions, and set of propagated errors. In this review, we examine the strengths and weaknesses of the most commonly used stable isotopes methods to measure gluconeogenesis in vivo. We discuss the advantages and limitations of each method and summarize the applicability of these measurements in understanding normal and pathophysiological conditions.

  7. Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.

    Science.gov (United States)

    Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun

    2017-02-15

    Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) is used to identify and solve problems. The laboratory turnaround time for individual tests, total delay time in the sample reception area, and percentage of steps involving risks of medical errors and biological hazards in the overall process are measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.

  8. Systematic Review of the Application of Lean and Six Sigma Quality Improvement Methodologies in Radiology.

    Science.gov (United States)

    Amaratunga, Thelina; Dobranowski, Julian

    2016-09-01

    Preventable yet clinically significant rates of medical error remain systemic, while health care spending is at a historic high. Industry-based quality improvement (QI) methodologies show potential for utility in health care and radiology because they use an empirical approach to reduce variability and improve workflow. The aim of this review was to systematically assess the literature with regard to the use and efficacy of Lean and Six Sigma (the most popular of the industrial QI methodologies) within radiology. MEDLINE, the Allied & Complementary Medicine Database, Embase Classic + Embase, Health and Psychosocial Instruments, and the Ovid HealthStar database, alongside the Cochrane Library databases, were searched in June 2015. Empirical studies in peer-reviewed journals were included if they assessed the use of Lean, Six Sigma, or Lean Six Sigma with regard to their ability to improve a variety of quality metrics in a radiology-centered clinical setting. Of the 278 articles returned, 23 studies were suitable for inclusion. Of these, 10 assessed Six Sigma, 7 assessed Lean, and 6 assessed Lean Six Sigma. The diverse range of measured outcomes can be organized into 7 common aims: cost savings, reducing appointment wait time, reducing in-department wait time, increasing patient volume, reducing cycle time, reducing defects, and increasing staff and patient safety and satisfaction. All of the included studies demonstrated improvements across a variety of outcomes. However, there were high rates of systematic bias and imprecision as per the Grading of Recommendations Assessment, Development and Evaluation guidelines. Lean and Six Sigma QI methodologies have the potential to reduce error and costs and improve quality within radiology. However, there is a pressing need to conduct high-quality studies in order to realize the true potential of these QI methodologies in health care and radiology. Recommendations on how to improve the quality of the literature are proposed.

  9. Toward a new methodology for measuring the threshold Shields number

    Science.gov (United States)

    Rousseau, Gauthier; Dhont, Blaise; Ancey, Christophe

    2016-04-01

    A number of bedload transport equations involve the threshold Shields number (corresponding to the threshold of incipient motion for particles resting on the streambed). Different methods have been developed for determining this threshold Shields number; they usually assume that the initial streambed is plane prior to sediment transport. Yet, there are many instances in real-world scenarios, in which the initial streambed is not free of bed forms. We are interested in developing a new methodology for determining the threshold of incipient motion in gravel-bed streams in which smooth bed forms (e.g., anti-dunes) develop. Experiments were conducted in a 10-cm wide, 2.5-m long flume, whose initial inclination was 3%. Flows were supercritical and fully turbulent. The flume was supplied with water and sediment at fixed rates. As bed forms developed and migrated, and sediment transport rates exhibited wide fluctuations, measurements had to be taken over long times (typically 10 hr). Using a high-speed camera, we recorded the instantaneous bed load transport rate at the outlet of the flume by taking top-view images. In parallel, we measured the evolution of the bed slope, water depth, and shear stress by filming through a lateral window of the flume. These measurements allowed for the estimation of the space and time-averaged slope, from which we deduced the space and time-averaged Shields number under incipient bed load transport conditions. In our experiments, the threshold Shields number was strongly dependent on streambed morphology. Experiments are under way to determine whether taking the space and time average of incipient motion experiments leads to a more robust definition of the threshold Shields number. If so, this new methodology will perform better than existing approaches at measuring the threshold Shields number.
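
    A sketch of the underlying definitions, assuming the bed shear stress is estimated from the depth-slope product and the space- and time-averaged quantities described above are used; all numbers are hypothetical:

        # Sketch: Shields number from the depth-slope product, the standard
        # definitions behind the averaged estimate described above.
        RHO_W, RHO_S, G = 1000.0, 2650.0, 9.81  # water and sediment densities, gravity

        def shields_number(depth_m, slope, d50_m):
            tau = RHO_W * G * depth_m * slope           # bed shear stress (Pa)
            return tau / ((RHO_S - RHO_W) * G * d50_m)  # dimensionless

        # Time-averaged depth 0.02 m, slope 3%, d50 = 6 mm gravel
        print(shields_number(0.02, 0.03, 0.006))  # ~0.06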

  10. Methodological improvements of geoid modelling for the Austrian geoid computation

    Science.gov (United States)

    Kühtreiber, Norbert; Pail, Roland; Wiesenhofer, Bernadette; Pock, Christian; Wirnsberger, Harald; Hofmann-Wellenhof, Bernhard; Ullrich, Christian; Höggerl, Norbert; Ruess, Diethard; Imrek, Erich

    2010-05-01

    The geoid computation method of Least Squares Collocation (LSC) is usually applied in connection with the remove-restore technique. The basic idea is to remove, before applying LSC, not only the long-wavelength gravity field effect represented by the global gravity field model, but also the high-frequency signals, which are mainly related to topography, by applying a topographic-isostatic reduction. In the current Austrian geoid solution, an Airy-Heiskanen model with a standard density of 2670 kg/m3 was used. A close investigation of the absolute error structure of this solution reveals some correlations with topography, which may be explained by these simplified assumptions. One parameter of the remove-restore process to be investigated in this work is the depth of the reference surface of isostatic compensation, the Mohorovicic discontinuity (Moho). The recently compiled European plate Moho depth model, which is based on 3D-seismic tomography and other geophysical measurements, is used instead of the reference surface derived from the Airy-Heiskanen isostatic model. Additionally, the standard density of 2670 kg/m3 is replaced by a laterally variable (surface) density model. The impact of these two significant modifications of the geophysical conception of the remove-restore procedure on the Austrian geoid solution is investigated and analyzed in detail. In the current Austrian geoid solution the above described remove-restore concept was used in a first step to derive a pure gravimetric geoid and to predict the geoid height for 161 GPS/levelling points. The difference between measured and predicted geoid heights shows a long-wavelength structure. These systematic distortions are commonly attributed to inconsistencies in the datum, distortions of the orthometric height system, and systematic GPS errors. In order to cope with this systematic term, a polynomial of degree 3 was fitted to the difference of predicted geoid heights and GPS

  11. Methodological aspects of EEG and Body dynamics measurements during motion.

    Directory of Open Access Journals (Sweden)

    Pedro eReis

    2014-03-01

    Full Text Available EEG involves the recording, analysis, and interpretation of voltages recorded on the human scalp that originate from brain grey matter. EEG is one of the most popular methods for studying and understanding the processes that underlie behavior, because it is relatively cheap, easy to wear, lightweight, and has high temporal resolution. In terms of behavior, this encompasses actions, such as movements, that are performed in response to the environment. However, there are methodological difficulties when recording EEG during movement, such as movement artifacts. Thus, most studies about the human brain have examined activations during static conditions. This article attempts to compile and describe relevant methodological solutions that have emerged for measuring body and brain dynamics during motion. These descriptions cover suggestions on how to avoid and reduce motion artifacts, as well as hardware, software and techniques for synchronously recording EEG, EMG, kinematics, kinetics and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps and methods for the determination of real/custom electrode positions. In the end we conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks.

  12. Coupling new technologies and methodologies for performance improvement.

    Science.gov (United States)

    Monge, Paul

    2007-01-01

    Radiology is a pivotal part of the patient's experience within a healthcare organization and has traditionally embraced new technologies. It is now time to embrace new management methodologies. With the changing winds in reimbursement, activity-based methods (ABC and ABM) will assist us to maximize our resources, reduce costs, and increase our efficiencies to maintain the quality of care. We have embraced new technologies, but we have implemented them on top of old processes. Without embracing new methodologies we may never maximize our new technology.

  13. Measuring user experience in digital gaming: theoretical and methodological issues

    Science.gov (United States)

    Takatalo, Jari; Häkkinen, Jukka; Kaistinen, Jyrki; Nyman, Göte

    2007-01-01

    There are innumerable concepts, terms and definitions for user experience. Few of them have a solid empirical foundation. In trying to understand user experience in interactive technologies such as computer games and virtual environments, reliable and valid concepts are needed for measuring relevant user reactions and experiences. Here we present our approach to creating both theoretically and methodologically sound methods for quantification of the rich user experience in different digital environments. Our approach is based on the idea that the experience received from content presented with a specific technology is always the result of a complex psychological interpretation process, whose components should be understood. The main aim of our approach is to grasp the complex and multivariate nature of the experience and make it measurable. We present our two basic measurement frameworks, which have been developed and tested on a large data set (n=2182). The 15 measurement scales extracted from these models are applied to digital gaming with a head-mounted display and a table-top display. The results show how it is possible to map between experience, technology variables and the background of the user (e.g., gender). This approach can help to optimize, for example, the contents for specific viewing devices or viewing situations.

  14. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    Science.gov (United States)

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that though narrative inquiry as a research methodology entails free conversations and personal stories, yet it should not be totally free and fictional as it has to conform to some recognized standards used for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  15. An improved methodology for precise geoid/quasigeoid modelling

    Science.gov (United States)

    Nesvadba, Otakar; Holota, Petr

    2016-04-01

    The paper describes recent development of a computational procedure useful for precise local quasigeoid modelling. The overall methodology is primarily based on a solution of the so-called gravimetric boundary value problem for an ellipsoidal domain (exterior to an oblate spheroid), which means that gravity disturbances on the ellipsoid are used as input data. The problem of the difference between the Earth's topography and the chosen ellipsoidal surface is solved iteratively, by analytical continuation of the gravity disturbances to the computational ellipsoid. The methodology covers an interpolation technique for the discrete gravity data, which, considering an a priori adopted covariance function, provides the best linear unbiased estimate of the respective quantity; a numerical integration technique developed on the surface of the ellipsoid in the spectral domain; an iterative procedure of analytical continuation in ellipsoidal coordinates; remove and restore of the atmospheric masses; an estimate of the far-zones contribution (in the case of regional data coverage); and the restore step from the obtained disturbing gravity potential to the target height anomaly. All the computational steps of the procedure are modest in the consumption of compute resources, thus the methodology can be used on a common personal computer, free of any accuracy or resolution penalty. Finally, the performance of the developed methodology is demonstrated on real-case examples related to the territories of France (Auvergne regional quasigeoid) and the Czech Republic.
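
    The interpolation step described above (best linear unbiased estimation under an adopted covariance function) is classical least-squares collocation, s_hat = C_st (C_tt + D)^-1 t. A generic one-dimensional sketch, with an assumed Gaussian covariance model and synthetic data, not the authors' implementation:

        # Sketch: generic least-squares collocation prediction.
        import numpy as np

        def gaussian_cov(d, c0=1.0, corr_len=10.0):
            return c0 * np.exp(-(d / corr_len) ** 2)

        def lsc_predict(x_obs, t_obs, x_new, noise_var=0.01):
            d_tt = np.abs(x_obs[:, None] - x_obs[None, :])
            d_st = np.abs(x_new[:, None] - x_obs[None, :])
            c_tt = gaussian_cov(d_tt) + noise_var * np.eye(len(x_obs))  # C_tt + D
            c_st = gaussian_cov(d_st)                                   # C_st
            return c_st @ np.linalg.solve(c_tt, t_obs)

        x_obs = np.array([0.0, 5.0, 12.0, 20.0])  # observation positions (km)
        t_obs = np.array([0.3, 0.5, 0.1, -0.2])   # e.g. gravity disturbances
        print(lsc_predict(x_obs, t_obs, np.array([8.0, 15.0])))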

  16. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Xiao-Ying; Yao, Juan; He, Hua; Glantz, Clifford S.; Booth, Alexander E.

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  17. Toward Improved Understanding of Food Security: A Methodological Examination Based in Rural South Africa

    Science.gov (United States)

    Kirkland, Tracy; Kemp, Robert S.; Hunter, Lori M.; Twine, Wayne S.

    2014-01-01

    Accurate measurement of household food security is essential to generate adequate information on the proportion of households experiencing food insecurity, especially in areas or regions vulnerable to food shortages and famine. This manuscript offers a methodological examination of three commonly used indicators of household food security – experience of hunger, dietary diversity, and coping strategies. Making use of data from the Agincourt Health and Demographic Surveillance Site in rural South Africa, we examine the association between the indicators themselves to improve understanding of the different insight offered by each food security “lens.” We also examine how the choice of indicator shapes the profile of vulnerable households, with results suggesting that dietary diversity scores may not adequately capture broader food insecurity. Concluding discussion explores programmatic and policy implications as related to methodological choices. PMID:25414598

  18. Toward Improved Understanding of Food Security: A Methodological Examination Based in Rural South Africa.

    Science.gov (United States)

    Kirkland, Tracy; Kemp, Robert S; Hunter, Lori M; Twine, Wayne S

    2013-03-01

    Accurate measurement of household food security is essential to generate adequate information on the proportion of households experiencing food insecurity, especially in areas or regions vulnerable to food shortages and famine. This manuscript offers a methodological examination of three commonly used indicators of household food security - experience of hunger, dietary diversity, and coping strategies. Making use of data from the Agincourt Health and Demographic Surveillance Site in rural South Africa, we examine the association between the indicators themselves to improve understanding of the different insight offered by each food security "lens." We also examine how the choice of indicator shapes the profile of vulnerable households, with results suggesting that dietary diversity scores may not adequately capture broader food insecurity. Concluding discussion explores programmatic and policy implications as related to methodological choices.

  19. Difficult to measure constructs: conceptual and methodological issues concerning participation and environmental factors.

    Science.gov (United States)

    Whiteneck, Gale; Dijkers, Marcel P

    2009-11-01

    For rehabilitation and disability research, participation and environment are 2 crucial constructs that have been placed center stage by the International Classification of Functioning, Disability and Health (ICF). However, neither construct is adequately conceptualized by the ICF, and both are difficult to measure. This article addresses conceptual and methodologic issues related to these ICF constructs, and recommends an improved distinction between activities and participation, as well as elaboration of environment. A division of the combined ICF categories for activity and participation into 2 separate taxonomies is proposed to guide future research. The issue of measuring participation from objective and subjective perspectives is examined, and maintaining these distinct conceptual domains in the measurement of participation is recommended. The methodological issues contributing to the difficulty of measuring participation are discussed, including potential dimensionality, alternative metrics, and the appropriateness of various measurement models. For environment, the need for theory to focus research on those aspects of the environment that interact with individuals' impairments and functional limitations in affecting activities and participation is discussed, along with potential measurement models for those aspects. The limitations resulting from reliance on research participants as reporters on their own environment are set forth. Addressing these conceptual and methodological issues is required before the measurement of participation and environmental factors can advance and these important constructs can be used more effectively in rehabilitation and disability observational research and trials.

  20. "HY-CHANGE": AN HYBRID METHODOLOGY FOR CONTINUOUS PERFORMANCE IMPROVEMENT OF MANUFACTURING PROCESSES

    OpenAIRE

    Dassisti, Michele

    2010-01-01

    Abstract: A hybrid methodology for Continuous Performance Improvement (CPI) is presented, basically founded on the joint recourse to Business Process Reengineering (BPR) and Continuous Quality Improvement (CQI) principles and tools. The methodology (called HY-CHANGE) is conceived as a logical and technical support to the decision maker. It results in a number of recursive phases, where the rational and synchronous...

  1. Measurement of Quality of Life I. A Methodological Framework

    Directory of Open Access Journals (Sweden)

    Soren Ventegodt

    2003-01-01

    Full Text Available Despite the widespread acceptance of quality of life (QOL) as the ideal guideline in healthcare and clinical research, serious conceptual and methodological problems continue to plague this area. In an attempt to remedy this situation, we propose seven criteria that a quality-of-life concept must meet to provide a sound basis for investigation by questionnaire. The seven criteria or desiderata are: (1) an explicit definition of quality of life; (2) a coherent philosophy of human life from which the definition is derived; (3) a theory that operationalizes the philosophy by specifying unambiguous, nonoverlapping, and jointly exhaustive questionnaire items; (4) response alternatives that permit a fraction-scale interpretation; (5) technical checks of reproducibility; (6) meaningfulness to investigators, respondents, and users; and (7) an overall aesthetic appeal of the questionnaire. These criteria have guided the design of a validated 5-item generic, global quality-of-life questionnaire (QOL5), and a validated 317-item generic, global quality-of-life questionnaire (SEQOL), administered to a well-documented birth cohort of 7,400 Danes born in 1959-1961, as well as to a reference sample of 2,500 Danes. Presented in outline, the underlying integrative quality-of-life (IQOL) theory is a meta-theory. To illustrate the seven criteria at work, we show the extent to which they are satisfied by one of the eight component theories. Next, two sample results of our investigation are presented: satisfaction with one's sex life has the expected covariation with one's quality of life, and so does mother's smoking during pregnancy, albeit to a much smaller extent. It is concluded that the methodological framework presented has proved helpful in designing a questionnaire that is capable of yielding acceptably valid and reliable measurements of global and generic quality of life.

  2. Measuring persistence: A literature review focusing on methodological issues

    Energy Technology Data Exchange (ETDEWEB)

    Wolfe, A.K.; Brown, M.A.; Trumble, D.

    1995-03-01

    This literature review was conducted as part of a larger project to produce a handbook on the measurement of persistence. The past decade has marked the development of the concept of persistence and a growing recognition that the long-term impacts of demand-side management (DSM) programs warrant careful assessment. Although increasing attention has been paid to the topic of persistence, no clear consensus has emerged either about its definition or about the methods most appropriate for its measurement and analysis. This project strives to fill that gap by reviewing the goals, terminology, and methods of past persistence studies. It was conducted from the perspective of a utility that seeks to acquire demand-side resources and is interested in their long-term durability; it was not conducted from the perspective of the individual consumer. Over 30 persistence studies, articles, and protocols were examined for this report. The review begins by discussing the underpinnings of persistence studies: namely, the definitions of persistence and the purposes of persistence studies. Then it describes issues relevant to both the collection and analysis of data on the persistence of energy and demand savings. Findings from persistence studies are also summarized. Throughout the review, four studies are used repeatedly to illustrate different methodological and analytical approaches to persistence so that readers can track the data collection, data analysis, and findings of a set of comprehensive studies that represent alternative approaches.

  3. A patient-centered methodology that improves the accuracy of prognostic predictions in cancer.

    Directory of Open Access Journals (Sweden)

    Mohammed Kashani-Sabet

    Full Text Available Individualized approaches to prognosis are crucial to effective management of cancer patients. We developed a methodology to assign individualized 5-year disease-specific death probabilities to 1,222 patients with melanoma and to 1,225 patients with breast cancer. For each cancer, three risk subgroups were identified by stratifying patients according to initial stage, and prediction probabilities were generated based on the factors most closely related to 5-year disease-specific death. Separate subgroup probabilities were merged to form a single composite index, and its predictive efficacy was assessed by several measures, including the area (AUC) under its receiver operating characteristic (ROC) curve. The patient-centered methodology achieved an AUC of 0.867 in the prediction of 5-year disease-specific death, compared with 0.787 using the AJCC staging classification alone. When applied to breast cancer patients, it achieved an AUC of 0.907, compared with 0.802 using the AJCC staging classification alone. A prognostic algorithm produced from a randomly selected training subsample of 800 melanoma patients preserved 92.5% of its prognostic efficacy (as measured by AUC) when the same algorithm was applied to a validation subsample containing the remaining patients. Finally, the tailored prognostic approach enhanced the identification of high-risk candidates for adjuvant therapy in melanoma. These results describe a novel patient-centered prognostic methodology with improved predictive efficacy when compared with AJCC stage alone in two distinct malignancies drawn from two separate populations.

  4. The Assessment Methodology RADAR – A Theoretical Approach of a Methodology for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence

    Directory of Open Access Journals (Sweden)

    Cristina Raluca Popescu

    2015-05-01

    Full Text Available In the paper “The Assessment Methodology RADAR – A Theoretical Approach of a Methodology for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence” the authors present the basic features of the assessment methodology RADAR that is designed to coordinate the efforts to improve the organizational processes in order to achieve excellence.

  5. Developing a lean measurement system to enhance process improvement

    Directory of Open Access Journals (Sweden)

    Lewis P.

    2013-01-01

    Full Text Available A key ingredient to underpin process improvement is a robust, reliable, repeatable measurement system. Process improvement activity needs to be supported by accurate and precise data because effective decision making, within process improvement activity, demands the use of “hard” data. One of the oldest and most established process improvement methods is Deming's Plan-Do-Check-Act (PDCA) cycle, which is reliant on the check phase, a measurement activity where data is being gathered and evaluated. Recent expansions of the PDCA, such as the Six-Sigma Define-Measure-Analyse-Improve-Control (DMAIC) methodology, place significant importance upon measurement. The DMAIC cycle incorporates the regimented requirement for the inclusion of measurement system analysis (MSA) into the breakthrough strategy. The call for MSA within the DMAIC cycle is to provide the improvement activity with a robust measurement system that will ensure a pertinent level of data during any validation process. The Lean methodology is heavily centred on the removal of the seven Mudas (wastes) from a manufacturing process: defects, overproduction, transportation, waiting, inventory, motion and processing. The application of lean, particularly within the manufacturing industry, has led to a perception that measurement is a waste within a manufacturing process because measurement processes identify defective products. The metrologists' pursuit of measurement excellence could be construed as a hindrance by the “cost down” demands being perpetrated from the same organisation's lean policy. So what possible benefits does enforcing the regimes of the lean and quality philosophies upon the measurement process have, and how does this ultimately enhance the process improvement activity? The key fundamental to embed with any process improvement is the removal of waste. The process improvement techniques embedded within lean and quality concepts are extremely powerful practices in the

  6. Problems in Different Measuring and Assessment the Modulus of Deformation Using the Czech and German Methodologies

    Directory of Open Access Journals (Sweden)

    M. Lidmila

    2006-01-01

    Full Text Available Comparative laboratory and in-situ measurements were used to establish the relationships between the static moduli of deformation calculated under the ED methodology and the DB methodology. The measurements proved that the moduli of deformation determined in accordance with the two methodologies cannot be substituted for each other. 

  7. Methodology for the use of proportional counters in pulsed fast neutron yield measurements

    CERN Document Server

    Tarifeño-Saldivia, Ariel; Pavez, Cristian; Soto, Leopoldo

    2011-01-01

    This paper introduces in full detail a methodology for the measurement of neutron yield and the necessary efficiency calibration, to be applied to the intensity measurement of neutron bursts where individual neutrons are not resolved in time, for any given moderated neutron proportional counter array. The method allows efficiency calibration employing the detection of neutrons arising from an isotopic neutron source. A full statistical study of the procedure is described, taking into account contributions arising from counting statistics, pile-up statistics of real detector pulse-height spectra, and background fluctuations. The useful information is extracted from the net waveform area of the signal arising from the electric charge accumulated inside the detector tube. The detection limit is improved; therefore, this detection system can be used for the detection of low-emission pulsed neutron sources with pulse durations from nanoseconds upward. The application of the methodology to detection systems to be...

  8. Developing a methodology for sustainable production of improved ...

    African Journals Online (AJOL)

    Mo

    Serere Agricultural and Animal Production Research Institute (SAARI). ... method based on the three livestock improvement projects funded by COARD project where farmers were .... nucleus herds of Ankole, Nganda and Small East African.

  9. Aqueduct: a methodology to measure and communicate global water risks

    Science.gov (United States)

    Gassert, Francis; Reig, Paul

    2013-04-01

    The Aqueduct Water Risk Atlas (Aqueduct) is a publicly available, global database and interactive tool that maps indicators of water related risks for decision makers worldwide. Aqueduct makes use of the latest geo-statistical modeling techniques to compute a composite index and translate the most recently available hydrological data into practical information on water related risks for companies, investors, and governments alike. Twelve global indicators are grouped into a Water Risk Framework designed in response to the growing concerns from private sector actors around water scarcity, water quality, climate change, and increasing demand for freshwater. The Aqueduct framework organizes indicators into three categories of risk that bring together multiple dimensions of water related risk into comprehensive aggregated scores and includes indicators of water stress, variability in supply, storage, flood, drought, groundwater, water quality and social conflict, addressing both spatial and temporal variation in water hazards. Indicators are selected based on relevance to water users, availability and robustness of global data sources, and expert consultation, and are collected from existing datasets or derived from a Global Land Data Assimilation System (GLDAS) based integrated water balance model. Indicators are normalized using a threshold approach, and composite scores are computed using a linear aggregation scheme that allows for dynamic weighting to capture users' unique exposure to water hazards. By providing consistent scores across the globe, the Aqueduct Water Risk Atlas enables rapid comparison across diverse aspects of water risk. Companies can use this information to prioritize actions, investors to leverage financial interest to improve water management, and governments to engage with the private sector to seek solutions for more equitable and sustainable water governance. The Aqueduct Water Risk Atlas enables practical applications of scientific data
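
    As a rough illustration of the threshold normalization and dynamically weighted linear aggregation described above, consider the following sketch; the indicators, thresholds, and weights are hypothetical, not Aqueduct's published values.

        # Sketch: threshold normalization onto a 0-5 scale and a weighted
        # linear aggregation of indicator scores; all values illustrative.

        def normalize(value, thresholds):
            """Map a raw indicator onto a 0-5 scale by threshold bins."""
            for score, upper in enumerate(thresholds):
                if value <= upper:
                    return float(score)
            return 5.0

        def composite_score(indicators, weights):
            """Weighted linear aggregation of normalized indicator scores."""
            total = sum(weights.values())
            return sum(indicators[k] * w for k, w in weights.items()) / total

        scores = {
            "water_stress": normalize(0.45, [0.1, 0.2, 0.4, 0.8, 1.0]),
            "drought":      normalize(0.30, [0.1, 0.25, 0.5, 0.75, 0.9]),
        }
        # A user more exposed to scarcity weights stress higher ("dynamic weighting")
        print(composite_score(scores, {"water_stress": 2.0, "drought": 1.0}))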

  10. Clock measurements to improve the geopotential determination

    Science.gov (United States)

    Lion, Guillaume; Panet, Isabelle; Delva, Pacôme; Wolf, Peter; Bize, Sébastien; Guerlin, Christine

    2017-04-01

    Comparisons between optical clocks with an accuracy and stability approaching 10^-18 in terms of relative frequency shift are opening new perspectives for the direct determination of the geopotential at a centimeter-level accuracy in geoid height. However, detailed quantitative estimates of the possible improvement in geoid determination when adding such clock measurements to existing data have so far been lacking. In this context, the present work aims at evaluating the contribution of this new kind of direct measurement in determining the geopotential at high spatial resolution (10 km). We consider the Massif Central area, marked by smooth, moderate-altitude mountains and volcanic plateaus leading to variations of the gravitational field over a range of spatial scales. In this type of region, the scarcity of gravity data is an important limitation in deriving accurate high-resolution geopotential models. We summarize our methodology to assess the contribution of clock data to the geopotential recovery, in combination with ground gravity measurements. We sample synthetic gravity and disturbing potential data from a spherical harmonic geopotential model and a topography model, up to 10 km resolution; we also build a potential control grid. From the synthetic data, we estimate the disturbing potential by least-squares collocation. Finally, we assess the quality of the reconstructed potential by comparing it to that of the control grid. We show that adding only a few clock data points reduces the reconstruction bias significantly and improves the standard deviation by a factor of 3. We discuss the role of different parameters, such as the effect of data coverage and data quality on these results, the trade-off between the measurement noise level and the number of data, and the optimization of the clock data network.
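
    The physics behind this approach is the gravitational redshift relation Δf/f ≈ ΔW/c², which ties a clock comparison directly to a geopotential difference; the sketch below shows why a 10^-18 fractional frequency uncertainty corresponds to roughly 1 cm in height. The local gravity value is illustrative.

        # Sketch: height resolution of chronometric leveling,
        # delta_f / f ~ delta_W / c^2 and delta_W ~ g * delta_h.
        C = 299_792_458.0  # speed of light (m/s)
        G_LOCAL = 9.81     # local gravity (m/s^2), illustrative

        def height_resolution(frac_freq_uncertainty):
            """Height difference resolvable by a clock comparison (m)."""
            delta_w = frac_freq_uncertainty * C**2  # geopotential difference (m^2/s^2)
            return delta_w / G_LOCAL

        print(height_resolution(1e-18))  # ~0.009 m, i.e. about 1 cm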

  11. A methodology for measurements of basic parameters in a xDSL system

    Science.gov (United States)

    Brito, Edson, Jr.; de Souza, Lamartine V.; Patrício, Éder T.; Castro, Agostinho L. S.; P. dos S. Cavalcante, Gervásio; Costa, João Crisóstomo W. A.; Ericson, Klas; Lindqvist, Fredrik; Rius I Riu, Jaume

    2006-10-01

    In order to qualify a subscriber loop for xDSL transmission, basic parameters like the transfer function, the scattering parameter S11, and the characteristic impedance should be known. The aim of this paper is to present a test methodology for measuring these basic parameters. The characteristic impedance is measured by the open/short method and is compared with the terminated measurement method defined in IEC (International Electrotechnical Commission) 61156-1. The transfer function and scattering parameter S11 of a DSL loop are also measured on a real cable. The methodology is based on measurements of a 0.4 mm, 10-pair, balanced twisted-pair cable, 1400 m in length. To improve the analysis of the results, we compared the measurements from the real cable with results from wireline simulators. The measurement of xDSL copper-loop parameters is done in an infrastructure set up in the LABIT (Technological Innovation in Telecommunications Lab) at UFPA (Federal University of Para), which consists of wireline simulators, a precision impedance analyzer, and a network analyzer. The results show a difference between the measurements performed with real cables and wireline simulators for the transfer function parameter. The characteristic impedance obtained by both methods presented quite similar results.
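
    For the open/short method named above, the characteristic impedance follows from the standard transmission-line identity Z0 = sqrt(Zoc x Zsc), where Zoc and Zsc are the input impedances measured with the far end open and shorted. A minimal sketch with invented impedance readings (the paper's measured values are not reproduced here):

        import numpy as np

        f = 300e3                        # measurement frequency in Hz (hypothetical)
        z_open = 12.0 - 240.0j           # input impedance with far end open (ohms)
        z_short = 35.0 + 150.0j          # input impedance with far end shorted (ohms)

        z0 = np.sqrt(z_open * z_short)   # Z0 = sqrt(Zoc * Zsc)
        print(f"|Z0| = {abs(z0):.1f} ohm at {f / 1e3:.0f} kHz")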

  12. A Three-Step Methodology to Improve Domestic Energy Efficiency

    NARCIS (Netherlands)

    Molderink, Albert; Bakker, Vincent; Bosman, M.G.C.; Hurink, Johann L.; Smit, Gerardus Johannes Maria

    2010-01-01

    Increasing energy prices and the greenhouse effect are leading to more awareness of the energy efficiency of electricity supply. During the last years, a lot of technologies have been developed to improve this efficiency. Next to large-scale technologies such as wind turbine parks, domestic technologies are developed…

  13. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  14. Methodology for Measuring the Complexity of Enterprise Information Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-07-01

    Full Text Available The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from a few exceptions, do not attempt to simplify it at all. This article presents the author's own methodology for managing complexity, which can be used to complement any other methodology and which helps limit the growth of complexity. It introduces its own definition and metric of complexity, defined as the sum of the entities of the individual UML models of the given system, selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology to manage information system complexity and to verify it in practice on a real-life SAP implementation project.
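
    The metric itself is deliberately simple: count the entities in each UML model retained by the MMDIS selection and sum them. A toy illustration with hypothetical model names and counts (the article's actual model set is not reproduced here):

        # Hypothetical entity counts per UML model of some system.
        uml_models = {
            "use_case": 14,      # actors + use cases
            "class": 52,         # classes + associations
            "activity": 31,      # actions + control nodes
            "deployment": 9,     # nodes + artifacts
        }
        complexity = sum(uml_models.values())   # the article's complexity metric
        print(f"system complexity = {complexity} entities")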

  15. METHODOLOGICAL ESSENTIAL PRINCIPLES OF REGIONAL INVESTMENT ACTIVITY FINANCING MECHANISM IMPROVEMENT

    OpenAIRE

    V.V. Morozov

    2005-01-01

    The article formulates and develops strategic principles and the main directions for improving the financing mechanism of regional investment activity. Contemporary conditions are analyzed, the relevant factors are researched, priority directions are defined, and suggestions on the better use of investment sources are worked out; on this basis, suggestions for activating the investment process in territorial systems are developed.

  16. Improving the methodology for assessing natural hazard impacts

    Science.gov (United States)

    Patwardhan, Anand; Sharma, Upasna

    2005-07-01

    The impacts of natural hazards such as cyclones have been conventionally measured through changes in human, social and economic capital, typically represented by stock variables such as population, built property and public infrastructure, livestock, agricultural land, etc. This paper develops an alternative approach that seeks to detect and quantify impacts as changes in flow variables. In particular, we explore whether changes in annual agricultural output, when measured at an appropriate spatial level, could be used to measure impacts associated with tropical cyclones in coastal regions of India. We believe that such an approach may have a number of benefits from a policy perspective, particularly with regard to the debate between relief versus recovery as disaster management strategies. A focus on flow variables is also likely to be more relevant and useful in developing countries; the maintenance of economic activity directly affects livelihood and is perhaps of greater importance than loss of built property or other physical capital.

  17. Improving Teaching Effectiveness through the Application of SPC Methodology

    Science.gov (United States)

    Cadden, David; Driscoll, Vincent; Thompson, Mark

    2008-01-01

    One method used extensively to aid in determining instruction effectiveness is Student Evaluations of Instruction (SEI). This paper examines the use of Statistical Process Control (SPC) charts as a way to correctly measure teaching effectiveness. The field studying SEIs has produced a significant literature. It is not surprising that there is…

  19. Residency Training: Quality improvement projects in neurology residency and fellowship: applying DMAIC methodology.

    Science.gov (United States)

    Kassardjian, Charles D; Williamson, Michelle L; van Buskirk, Dorothy J; Ernste, Floranne C; Hunderfund, Andrea N Leep

    2015-07-14

    Teaching quality improvement (QI) is a priority for residency and fellowship training programs. However, many medical trainees have had little exposure to QI methods. The purpose of this study is to review a rigorous and simple QI methodology (define, measure, analyze, improve, and control [DMAIC]) and demonstrate its use in a fellow-driven QI project aimed at reducing the number of delayed and canceled muscle biopsies at our institution. DMAIC was utilized. The project aim was to reduce the number of delayed muscle biopsies to 10% or less within 24 months. Baseline data were collected for 12 months. These data were analyzed to identify root causes for muscle biopsy delays and cancellations. Interventions were developed to address the most common root causes. Performance was then remeasured for 9 months. Baseline data were collected on 97 of 120 muscle biopsies during 2013. Twenty biopsies (20.6%) were delayed. The most common causes were scheduling too many tests on the same day and lack of fasting. Interventions aimed at patient education and biopsy scheduling were implemented. The effect was to reduce the number of delayed biopsies to 6.6% (6/91) over the next 9 months. Familiarity with QI methodologies such as DMAIC is helpful to ensure valid results and conclusions. Utilizing DMAIC, we were able to implement simple changes and significantly reduce the number of delayed muscle biopsies at our institution. © 2015 American Academy of Neurology.
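
    As a quick illustration of the size of the reported effect — this check is not part of the paper — a two-proportion z-test on the stated counts (20/97 delayed at baseline vs. 6/91 after the interventions) can be coded directly:

        from math import sqrt, erf

        def two_prop_ztest(x1, n1, x2, n2):
            p1, p2 = x1 / n1, x2 / n2
            p = (x1 + x2) / (n1 + n2)     # pooled proportion
            z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
            p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
            return z, p_two_sided

        z, p = two_prop_ztest(20, 97, 6, 91)
        print(f"z = {z:.2f}, p = {p:.4f}")   # significance of the reduction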

  20. Improved methodology for developing cost uncertainty models for naval vessels

    OpenAIRE

    Brown, Cinda L.

    2008-01-01

    The purpose of this thesis is to analyze the probabilistic cost model currently in use by NAVSEA 05C to predict cost uncertainty in naval vessel construction and to develop a method that better predicts the ultimate cost risk. The data used to develop the improved approach is collected from analysis of the CG(X) class ship by NAVSEA 05C. The NAVSEA 05C cost risk factors are reviewed and analyzed to determine if different factors are better cost predictors. The impact of data elicitation, t...

  1. Process improvement by cycle time reduction through Lean Methodology

    Science.gov (United States)

    Siva, R.; patan, Mahamed naveed khan; lakshmi pavan kumar, Mane; Purusothaman, M.; pitchai, S. Antony; Jegathish, Y.

    2017-05-01

    In the present world, every customer needs products delivered on time and with good quality, and every industry is striving to satisfy its customers' requirements. An aviation concern is trying to accomplish continuous improvement in all its projects; in this project, the maintenance service for the customer is analyzed. The maintenance part service is split into four levels. Of these, three levels are done in service shops, while the fourth level falls under the customer's privilege to change the parts in their aircraft engines at their own location. An enhancement of electronics initial provisioning (eIP) is done for the fourth level. Customers request service shops to meet their requirements through a Recommended Spare Parts List (RSPL) generated via eIP. Completing this RSPL for one customer takes a cycle time of 61.5 hours, which is very high. By mapping the current-state VSM and the takt time, a future-state improvement can be made to reduce the cycle time using Lean tools such as Poka-Yoke, Jidoka, 5S, and Muda elimination.
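
    The takt-time comparison that drives the future-state analysis is a one-line calculation: takt time is available working time divided by customer demand, and a cycle time above takt means the process cannot keep up. The numbers below are hypothetical, not the study's:

        weekly_available_hours = 40.0    # analyst working time per week (assumed)
        weekly_rspl_requests = 4         # customer demand per week (assumed)

        takt_hours = weekly_available_hours / weekly_rspl_requests
        cycle_hours = 61.5               # reported cycle time per RSPL

        print(f"takt = {takt_hours:.1f} h, cycle = {cycle_hours} h")
        if cycle_hours > takt_hours:
            print("cycle time exceeds takt time -> process cannot meet demand")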

  2. Recent improvements in the methodology of neutron imaging

    Indian Academy of Sciences (India)

    Eberhard H Lehmann

    2008-10-01

    The focus of this article is on further improvements of methods in neutron imaging: increased spatial resolution for microtomography and options for energy-selective neutron imaging. Before going into details, some common statements are given with respect to the state of the art in neutron imaging. A relation to X-ray methods is mentioned, where complementary results are obtained. The potential for energy selection is of particular interest for future installations at the new pulsed sources based on spallation (SNS, J-PARC, ISIS-TS2). First results from preliminary studies look very promising for future materials and industrial research. Therefore, statements about the set-up of the best possible imaging systems are included in the article.

  3. The methodologies and instruments of vehicle particulate emission measurement for current and future legislative regulations

    Science.gov (United States)

    Otsuki, Yoshinori; Nakamura, Hiroshi; Arai, Masataka; Xu, Min

    2015-09-01

    Since the health risks associated with fine particles whose aerodynamic diameters are smaller than 2.5 μm were first proven, regulations restricting particulate matter (PM) mass emissions from internal combustion engines have become increasingly severe. Accordingly, the gravimetric method of PM mass measurement is facing its lower limit of detection as the emissions from vehicles are further reduced. For example, variation in the adsorption of gaseous components such as hydrocarbons from unburned fuel and lubricant oil, and the presence of agglomerated particles, which are not directly generated in engine combustion but are re-entrained from the walls of sampling pipes, can cause uncertainty in measurement. PM mass measurement systems and methodologies have been continuously refined in order to improve measurement accuracy. As an alternative metric, the Particle Measurement Programme (PMP) within the United Nations Economic Commission for Europe (UNECE) developed a solid particle number measurement method in order to improve the sensitivity of particulate emission measurement from vehicles. Consequently, particle number (PN) limits were implemented in European regulations from 2011. Recently, portable emission measurement systems (PEMS) for in-use vehicle emission measurements are also attracting attention, currently in North America and Europe, and real-time PM mass and PN instruments are under evaluation.

  4. Quality improvement methodologies increase autologous blood product administration.

    Science.gov (United States)

    Hodge, Ashley B; Preston, Thomas J; Fitch, Jill A; Harrison, Sheilah K; Hersey, Diane K; Nicol, Kathleen K; Naguib, Aymen N; McConnell, Patrick I; Galantowicz, Mark

    2014-03-01

    Whole blood from the heart-lung (bypass) machine may be processed through a cell salvaging device (i.e., cell saver [CS]) and subsequently administered to the patient during cardiac surgery. It was determined at our institution that CS volume was being discarded. A multidisciplinary team consisting of anesthesiologists, perfusionists, intensive care physicians, quality improvement (QI) professionals, and bedside nurses met to determine the challenges surrounding autologous blood delivery in its entirety. A review of cardiac surgery patients' charts (n = 21) was conducted for analysis of CS waste. After identification of practices that were leading to CS waste, interventions were designed and implemented. Fishbone diagram, key driver diagram, Plan-Do-Study-Act (PDSA) cycles, and data collection forms were used throughout this QI process to track and guide progress regarding CS waste. Of patients under 6 kg (n = 5), 80% had wasted CS blood before interventions, whereas those patients larger than 36 kg (n = 8) had 25% wasted CS before interventions. Seventy-five percent of patients under 6 kg who had wasted CS blood received packed red blood cell transfusions in the cardiothoracic intensive care unit within 24 hours of their operation. After data collection and didactic education sessions (PDSA Cycle I), CS blood volume waste was reduced to 5% in all patients. Identification and analysis of the root cause followed by implementation of education, training, and management of change (PDSA Cycle II) resulted in successful use of 100% of all CS blood volume.

  5. Improving the Measurement of Poverty.

    Science.gov (United States)

    Hutto, Nathan; Waldfogel, Jane; Kaushal, Neeraj; Garfinkel, Irwin

    2011-03-01

    This study estimates 2007 national poverty rates using an approach largely conceptualized by a 1995 National Academy of Sciences panel and similar to the supplemental poverty measure that will soon be produced by the U.S. Census Bureau. The study uses poverty thresholds based on expenditures for shelter, food, clothing, and utilities, as well as a measure of family income that includes earnings, cash transfers, near-cash benefits, tax credits, and tax payments. The measure also accounts for child care, work, and out-of-pocket medical expenses; variation in regional cost of living; and mortgage-free homeownership. Under this method, the rate of poverty is estimated to be higher than the rate calculated in the traditional manner, rising from 12.4 percent in the official measure to 16 percent in the new measure; the rate of child poverty is more than 3 percentage points higher, and elderly poverty is nearly 7 points higher.

  6. Proposition of Improved Methodology in Creep Life Extrapolation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Woo Gon; Park, Jae Young; Jang, Jin Sung [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    To design SFRs for a 60-year operation, it is desirable to have experimental creep-rupture data for Gr. 91 steel close to 20 y, or at least rupture lives significantly higher than 10^5 h. This requirement arises from the fact that, for creep design, a factor of 3 for extrapolation is considered appropriate. However, obtaining experimental data close to 20 y would be expensive and would also take considerable time. Therefore, reliable creep life extrapolation techniques become necessary for a safe design life of 60 y. In addition, it is appropriate to obtain experimental long-term creep-rupture data in the range 10^5 to 2x10^5 h to improve the reliability of extrapolation. In the present investigation, a new function of hyperbolic sine ('sinh') form for the master curve in time-temperature parameter (TTP) methods was proposed to accurately extrapolate the long-term creep rupture stress of Gr. 91 steel. The constant values used for each parametric equation were optimized on the basis of the creep rupture data. Average stress values predicted for up to 60 y were evaluated and compared with those of the French nuclear design code RCC-MRx. The results showed that the 'sinh' master curve had wider acceptance, with good flexibility in the low stress ranges beyond the experimental data. It was clarified that the 'sinh' function is more reasonable for creep life extrapolation than the polynomial forms that have been used conventionally until now.
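
    One plausible reading of the proposed master curve — a sketch only, with synthetic data and illustrative constants rather than the paper's Gr. 91 values — is to regress a time-temperature parameter such as the Larson-Miller parameter, LMP = T(C + log10 tr), on a 'sinh'-based stress function:

        import numpy as np
        from scipy.optimize import curve_fit

        def master_curve(stress, a, b, c):
            # 'sinh'-form master curve: LMP = a + b * ln(sinh(c * stress))
            return a + b * np.log(np.sinh(c * stress))

        stress = np.array([200.0, 160.0, 120.0, 90.0, 60.0])    # MPa (synthetic)
        temp_K = np.array([823.0, 823.0, 873.0, 873.0, 923.0])
        rupture_h = np.array([1.2e3, 8.5e3, 4.0e3, 3.1e4, 9.0e4])
        lmp = temp_K * (20.0 + np.log10(rupture_h)) / 1000.0    # C = 20 assumed

        params, _ = curve_fit(master_curve, stress, lmp, p0=(25.0, -1.0, 0.01),
                              bounds=([-np.inf, -np.inf, 1e-4], [np.inf, np.inf, 1.0]))
        print(params)   # fitted curve then extrapolates life at low stress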

  7. A Multi-Methodology for improving Adelaide's Groundwater Management

    Science.gov (United States)

    Batelaan, Okke; Banks, Eddie; Batlle-Aguilar, Jordi; Breciani, Etienne; Cook, Peter; Cranswick, Roger; Smith, Stan; Turnadge, Chris; Partington, Daniel; Post, Vincent; Pool Ramirez, Maria; Werner, Adrian; Xie, Yueqing; Yang, Yuting

    2015-04-01

    Groundwater is a strategic and vital resource in South Australia, playing a crucial role in sustaining a healthy environment as well as supporting industries and economic development. In the Adelaide metropolitan region ten different aquifer units have been identified, extending to more than 500 m below sea level. Although salinity within most of these aquifers is variable, water suitable for commercial, irrigation and/or potable use is predominantly found in the deeper Tertiary aquifers. Groundwater currently contributes only 9,000 ML/yr of Adelaide's total water consumption of 216,000 ML/yr, while in the Northern Adelaide Plains 17,000 ML/yr is used. However, major industries, market gardeners, golf courses, and local councils are highly dependent on this resource. Despite the recent rapid expansion in managed aquifer recharge, and the potential for increased extraction of groundwater, particularly for commercial and irrigation supplies, little is known about the sources and ages of Adelaide's groundwater. The aim of this study is therefore to provide a robust conceptualisation of Adelaide's groundwater system. The study focuses on three important knowledge gaps: 1. Does groundwater flow from the Adelaide Hills into the sedimentary aquifers on the plains? 2. What is the potential for encroachment of seawater if groundwater extraction increases? 3. How isolated are the different aquifers, or does water leak from one to the other? A multi-tool approach has been used to improve the conceptual understanding of groundwater flow processes, including the installation of new groundwater monitoring wells from the hills to the coast, an extensive groundwater sampling campaign of new and existing wells for chemistry and environmental tracer analysis, and the development of a regional-scale numerical model rigorously tested under different scenario conditions. The model allows quantification of otherwise hardly quantifiable quantities such as flow across fault zones and…

  8. THE ASSESSMENT METHODOLOGIES PTELR, ADRI AND CAE – THREE METHODOLOGIES FOR COORDINATING THE EFFORTS TO IMPROVE THE ORGANIZATIONAL PROCESSES TO ACHIEVE EXCELLENCE

    OpenAIRE

    Cristina Raluca POPESCU; Gheorghe N. Popescu

    2015-01-01

    In the paper “The Assessment Methodologies PTELR, ADRI and CAE – Three Methodologies for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence” the authors present the basic features of the assessment methodologies PTELR, ADRI and CAE that are designed to coordinate the efforts to improve the organizational processes in order to achieve excellence. In the first part of the paper (the introduction of the paper), the authors present the general background concer...

  9. Methodology to Improving Energy Efficiency of Residential Historic Buildings in St. Petersburg

    Directory of Open Access Journals (Sweden)

    Murgul Vera

    2016-01-01

    Full Text Available The paper contains an analysis of the goals and objectives of improving the energy efficiency of residential buildings, as well as a methodology for selecting energy-efficient modernization measures for historic buildings. The priority objective of this study is the energy efficiency of residential historic buildings as a tool to improve the quality of the human environment. The development of renewable energy technologies is presented as an alternative to energy saving. If the purpose of improving energy efficiency is taken to be improvement of the quality of the human environment (from the level of the living quarters up to global environmental sustainability), the alternative to saving traditional energy resources can be their replacement by energy from renewable sources, even lavish spending of which does not harm the environment. All energy saving should be focused primarily on man-made environments (industrial processes, heating systems, etc.); the anthropogenic environment (the living environment) should get the maximum energy for the stable provision of a quality microclimate.

  10. Theoretical and methodological considerations in the measurement of spasticity

    NARCIS (Netherlands)

    Burridge, J.H.; Wood, D.E.; Hermens, H.J.; Voerman, G.E.; Johnson, G.R.; Wijck, van F.; Platz, T.; Gregoric, M.; Hitchcock, R.A.D.; Pandyan, A.D.

    2005-01-01

    Purpose: To discuss the measurement of spasticity in the clinical and research environments, make recommendations based on the SPASM reviews of biomechanical, neurophysiological and clinical methods of measuring spasticity, and indicate future developments of measurement tools. Method: Using the results…

  11. VAR Methodology Used for Exchange Risk Measurement and Prevention

    Directory of Open Access Journals (Sweden)

    Florentina Balu

    2006-05-01

    Full Text Available In this article we discuss one of the modern risk measurement techniques: Value-at-Risk (VaR). Currently, central banks in major money centers, under the auspices of the BIS Basle Committee, adopt the VaR system to evaluate the market risk of their supervised banks. Bank regulators ask all commercial banks to report VaRs computed with their internal models. Value at Risk (VaR) is a powerful tool for assessing market risk, but it also imposes a challenge. Its power is its generality. Unlike market risk metrics such as the Greeks, duration and convexity, or beta, which are applicable to only certain asset categories or certain sources of market risk, VaR is general: it is based on the probability distribution of a portfolio's market value. VaR calculates the maximum loss expected (or worst-case scenario) on an investment over a given time period and at a specified degree of confidence. There are three methods by which VaR can be calculated: historical simulation, the variance-covariance method, and Monte Carlo simulation. The variance-covariance method is easiest because you need to estimate only two factors: average return and standard deviation. However, it assumes returns are well-behaved according to the symmetrical normal curve and that historical patterns will repeat into the future. Historical simulation improves the accuracy of the VaR calculation, but requires more computational data; it also assumes that "past is prologue". Monte Carlo simulation is complex, but has the advantage of allowing users to tailor ideas about future patterns that depart from historical patterns.
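
    A minimal sketch of the variance-covariance method described above, assuming normally distributed returns; the portfolio value and return parameters are invented for illustration:

        from math import sqrt
        from statistics import NormalDist

        portfolio_value = 1_000_000.0    # EUR (hypothetical)
        mu_daily = 0.0004                # mean daily return (hypothetical)
        sigma_daily = 0.012              # daily volatility (hypothetical)
        horizon_days = 10
        confidence = 0.99

        z = NormalDist().inv_cdf(confidence)
        var = portfolio_value * (z * sigma_daily * sqrt(horizon_days)
                                 - mu_daily * horizon_days)
        print(f"{horizon_days}-day VaR at {confidence:.0%}: EUR {var:,.0f}")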

  12. VAR Methodology Used for Exchange Risk Measurement and Prevention

    Directory of Open Access Journals (Sweden)

    Ion Stancu

    2006-03-01

    Full Text Available In this article we discuss one of the modern risk measurement techniques: Value-at-Risk (VaR). Currently, central banks in major money centers, under the auspices of the BIS Basle Committee, adopt the VaR system to evaluate the market risk of their supervised banks. Bank regulators ask all commercial banks to report VaRs computed with their internal models. Value at Risk (VaR) is a powerful tool for assessing market risk, but it also imposes a challenge. Its power is its generality. Unlike market risk metrics such as the Greeks, duration and convexity, or beta, which are applicable to only certain asset categories or certain sources of market risk, VaR is general: it is based on the probability distribution of a portfolio's market value. VaR calculates the maximum loss expected (or worst-case scenario) on an investment over a given time period and at a specified degree of confidence. There are three methods by which VaR can be calculated: historical simulation, the variance-covariance method, and Monte Carlo simulation. The variance-covariance method is easiest because you need to estimate only two factors: average return and standard deviation. However, it assumes returns are well-behaved according to the symmetrical normal curve and that historical patterns will repeat into the future. Historical simulation improves the accuracy of the VaR calculation, but requires more computational data; it also assumes that "past is prologue". Monte Carlo simulation is complex, but has the advantage of allowing users to tailor ideas about future patterns that depart from historical patterns.

  13. Using Quality Tools and Methodologies to Improve a Hospital's Quality Position.

    Science.gov (United States)

    Branco, Daniel; Wicks, Angela M; Visich, John K

    2017-01-01

    The authors identify the quality tools and methodologies most frequently used by quality-positioned hospitals versus nonquality hospitals. Northeastern U.S. hospitals in both groups received a brief, 12-question survey. The authors found that 93.75% of the quality hospitals and 81.25% of the nonquality hospitals used some form of process improvement methodologies. However, there were significant differences between the groups regarding the impact of quality improvement initiatives on patients. The findings indicate that in quality hospitals the use of quality improvement initiatives had a significantly greater positive impact on patient satisfaction and patient outcomes when compared to nonquality hospitals.

  14. Operational amplifier speed and accuracy improvement analog circuit design with structural methodology

    CERN Document Server

    Ivanov, Vadim V

    2004-01-01

    Operational Amplifier Speed and Accuracy Improvement proposes a new methodology for the design of analog integrated circuits. The usefulness of this methodology is demonstrated through the design of an operational amplifier. This methodology consists of the following iterative steps: description of the circuit functionality at a high level of abstraction using signal flow graphs; equivalent transformations and modifications of the graph to the form where all important parameters are controlled by dedicated feedback loops; and implementation of the structure using a library of elementary cells. Operational Amplifier Speed and Accuracy Improvement shows how to choose structures and design circuits which improve an operational amplifier's important parameters such as speed to power ratio, open loop gain, common-mode voltage rejection ratio, and power supply rejection ratio. The same approach is used to design clamps and limiting circuits which improve the performance of the amplifier outside of its linear operat...

  15. Measurement-based auralization methodology for the assessment of noise mitigation measures

    Science.gov (United States)

    Thomas, Pieter; Wei, Weigang; Van Renterghem, Timothy; Botteldooren, Dick

    2016-09-01

    The effect of noise mitigation measures is generally expressed by noise levels only, neglecting the listener's perception. In this study, an auralization methodology is proposed that enables an auditory preview of noise abatement measures for road traffic noise, based on the direction-dependent attenuation of a priori recordings made with a dedicated 32-channel spherical microphone array. This measurement-based auralization has the advantage that all non-road-traffic sounds that create the listening context are present. The potential of this auralization methodology is evaluated through the assessment of the effect of an L-shaped mound. The angular insertion loss of the mound is estimated by using the ISO 9613-2 propagation model, the Pierce barrier diffraction model, and the Harmonoise point-to-point model. The realism of the auralization technique is evaluated by listening tests, which indicate that listeners had great difficulty differentiating between a posteriori recordings and auralized samples, showing the validity of the approaches followed.

  16. Geisinger's ProvenCare methodology: driving performance improvement within a shared governance structure.

    Science.gov (United States)

    Nolan, Ruth; Wary, Andrea; King, Megan; Laam, Leslie A; Hallick, Susan

    2011-05-01

    Many performance improvement projects fail because they occur in parallel to the organization's shared governance structure. Leveraging its nursing shared governance structure, Geisinger Health System's ProvenCare methodology harnessed the full potential of its staff nurses to create truly reliable workflows that benefit patients and that the team finds professionally satisfying. Using ProvenCare Perinatal and its smoking cessation education intervention and outcomes as an example, the authors describe the ProvenCare methodology.

  17. Development and Evaluation of an Improved Methodology for Assessing Adherence to Evidence-Based Drug Therapy Guidelines Using Claims Data

    Science.gov (United States)

    Kawamoto, Kensaku; Allen LaPointe, Nancy M.; Silvey, Garry M.; Anstrom, Kevin J.; Eisenstein, Eric L.; Lobach, David F.

    2007-01-01

    Non-adherence to evidence-based pharmacotherapy is associated with increased morbidity and mortality. Claims data can be used to detect and intervene on such non-adherence, but existing claims-based approaches for measuring adherence to pharmacotherapy guidelines have significant limitations. In this manuscript, we describe a methodology for assessing adherence to pharmacotherapy guidelines that overcomes many of these limitations. To develop this methodology, we first reviewed the literature to identify prior work on potential strategies for overcoming these limitations. We then assembled a team of relevant domain experts to iteratively develop an improved methodology. This development process was informed by the use of the proposed methodology to assess adherence levels for 14 pharmacotherapy guidelines related to seven common diseases among approximately 36,000 Medicaid beneficiaries. Finally, we evaluated the ability of the methodology to overcome the targeted limitations. Based on this evaluation, we conclude that the proposed methodology overcomes many of the limitations associated with existing approaches. PMID:18693865

  18. A conceptual and methodological framework for psychometric isomorphism: Validation of multilevel construct measures

    NARCIS (Netherlands)

    Tay, L.; Woo, S.E.; Vermunt, J.K.

    2014-01-01

    The conceptual and methodological framework for measurement equivalence procedures has been well established and widely used. Although multilevel theories and methods have been widely used in organizational research, there is no comparable framework for the measurement equivalence of multilevel…

  19. Measurement of quality of life I. A methodological framework

    DEFF Research Database (Denmark)

    Ventegodt, Søren; Hilden, Jørgen; Merrick, Joav

    2003-01-01

    Despite the widespread acceptance of quality of life (QOL) as the ideal guideline in healthcare and clinical research, serious conceptual and methodological problems continue to plague this area. In an attempt to remedy this situation, we propose seven criteria that a quality-of-life concept must meet to provide a sound basis for investigation by questionnaire. The seven criteria or desiderata are: (1) an explicit definition of quality of life; (2) a coherent philosophy of human life from which the definition is derived; (3) a theory that operationalizes the philosophy by specifying unambiguous… These criteria guided the design of a validated 5-item generic, global quality-of-life questionnaire (QOL5) and a validated 317-item generic, global quality-of-life questionnaire (SEQOL), administered to a well-documented birth cohort of 7,400 Danes born in 1959-1961, as well as to a reference sample of 2,500 Danes…

  20. A Methodology to Measure the Environmental Impact of ICT Operating Systems across Different Device Platforms

    Institute of Scientific and Technical Information of China (English)

    Daniel R. Williams; Yinshan Tang

    2015-01-01

    A new methodology was created to measure the energy consumption and related greenhouse gas (GHG) emissions of a computer operating system (OS) across different device platforms. The methodology involved the direct power measurement of devices under different activity states. In order to cover all aspects of an OS, the methodology included measurements in various OS modes while, uniquely, also incorporating measurements taken when running an array of defined software activities, so as to include the OS's application management features. The methodology was demonstrated on a laptop and a phone that could each run multiple OSs; the results confirmed that the OS can significantly impact the energy consumption of devices. In particular, new versions of the Microsoft Windows OS were tested, highlighting significant differences between OS versions on the same hardware. The developed methodology could enable greater awareness of energy consumption during both the software development and software marketing processes.
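
    The core of such a methodology is integrating sampled device power over each defined OS activity state to obtain energy, which an emission factor then converts to GHG terms. A sketch with hypothetical power traces and an illustrative grid emission factor (the paper's measured values are not reproduced):

        import numpy as np

        def energy_wh(power_w, dt_s):
            # Trapezoidal integration of a power trace (W) sampled every dt_s seconds.
            p = np.asarray(power_w)
            return float((0.5 * (p[1:] + p[:-1]) * dt_s).sum() / 3600.0)

        states = {  # 10-minute traces per OS activity state, 1 Hz sampling (made up)
            "idle": np.full(600, 4.2),
            "video_play": np.full(600, 9.8),
            "web_browse": np.full(600, 7.1),
        }
        factor_kg_per_kwh = 0.4   # illustrative grid emission factor

        for name, trace in states.items():
            e = energy_wh(trace, 1.0)
            # e [Wh] x factor [kg/kWh] gives grams of CO2e directly
            print(f"{name}: {e:.2f} Wh, {e * factor_kg_per_kwh:.2f} g CO2e")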

  1. A Measurement Approach for Process Improvement | Woherem ...

    African Journals Online (AJOL)

    LBS Management Review ... Many organisations today have embarked on a process improvement or re-engineering project. ... are used in the measurement of software processes, REFINE is used for the measurement of business processes.

  2. Clinical audit, a valuable tool to improve quality of care: General methodology and applications in nephrology.

    Science.gov (United States)

    Esposito, Pasquale; Dal Canton, Antonio

    2014-11-06

    Evaluation and improvement of the quality of care provided to patients are of crucial importance, both in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment, and clinical audit. The clinical audit consists of measuring a clinical outcome or a process against well-defined standards, set on the principles of evidence-based medicine, in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia, and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, overall the present evidence is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principles of clinical audit, focusing on experiences performed in nephrology settings.

  3. Improving hospital discharge time: a successful implementation of Six Sigma methodology.

    Science.gov (United States)

    El-Eid, Ghada R; Kaddoum, Roland; Tamim, Hani; Hitti, Eveline A

    2015-03-01

    Delays in discharging patients can impact hospital and emergency department (ED) throughput. The discharge process is complex and involves setting-specific challenges that limit the generalizability of solutions. The aim of this study was to assess the effectiveness of using Six Sigma methods to improve the patient discharge process. This is a quantitative pre- and post-intervention study, set in a 386-bed tertiary care hospital, with a series of Six Sigma driven interventions over a 10-month period. The primary outcome was discharge time (time from discharge order to patient leaving the room). Secondary outcome measures included the percentage of patients whose discharge order was written before noon, the percentage of patients leaving the room by noon, hospital length of stay (LOS), and LOS of admitted ED patients. Discharge time decreased by 22.7%, from 2.2 hours during the preintervention period to 1.7 hours post-intervention (P < …). Six Sigma methodology can be an effective change-management tool to improve discharge time. The focus of institutions aspiring to tackle delays in the discharge process should be on adopting the core principles of Six Sigma rather than specific interventions that may be institution-specific.

  4. Measurement and verification of low income energy efficiency programs in Brazil: Methodological challenges

    Energy Technology Data Exchange (ETDEWEB)

    Martino Jannuzzi, Gilberto De; Rodrigues da Silva, Ana Lucia; Melo, Conrado Augustus de; Paccola, Jose Angelo; Dourado Maia Gomes, Rodolfo (State Univ. of Campinas, International Energy Initiative (Brazil))

    2009-07-01

    Electric utilities in Brazil are investing about 80 million dollars annually in low-income energy efficiency programs, about half of their total compulsory investments in end-use efficiency programs under current regulation. Since 2007 the regulator has enforced the need to provide evaluation plans for the programs delivered. This paper presents the measurement and verification (M&V) methodology that has been developed to accommodate the characteristics of the lighting and refrigerator programs that have been introduced in Brazilian urban and peri-urban slums. A combination of household surveys, end-use measurements, and metering at the transformer and grid levels was performed before and after program implementation. The methodology has to accommodate the dynamics, housing, electrical wiring, and connections of the population, as well as their ability to pay for electricity and their program participation. Results obtained in slums in Rio de Janeiro are presented. Impacts of the programs were evaluated in energy terms for households and utilities. Feedback from the evaluations performed also permitted improvements in the design of new programs for low-income households.

  5. Does updating improve the methodological and reporting quality of systematic reviews?

    Directory of Open Access Journals (Sweden)

    Hamel Candyce

    2006-06-01

    Full Text Available Abstract. Background: Systematic reviews (SRs) must be of high quality. The purpose of our research was to compare the methodological and reporting quality of original versus updated Cochrane SRs to determine whether updating had improved these two quality dimensions. Methods: We identified updated Cochrane SRs published in issue 4, 2002 of the Cochrane Library. We assessed the updated and original versions of the SRs using two instruments: the 10-item enhanced Overview Quality Assessment Questionnaire (OQAQ), and an 18-item reporting quality checklist and flow chart based upon the Quality of Reporting of Meta-analyses (QUOROM) statement. At least two reviewers extracted data and assessed quality. We calculated the percentage (with a 95% confidence interval) of 'yes' answers to each question, and the mean differences in percentage, 95% confidence intervals, and p-values for each of the individual items and for the overall methodological quality score of the updated and pre-updated versions using the OQAQ. Results: We assessed 53 SRs. There was no significant improvement in the global quality score of the OQAQ (mean difference 0.11 (-0.28; 0.70), p = 0.52). Updated reviews showed a significant improvement of 18.9 (7.2; 30.6; p < …). Conclusion: The overall quality of Cochrane SRs is fair-to-good. Although reporting quality improved on certain individual items, there was no overall improvement seen with updating, and methodological quality remained unchanged. Further improvement of reporting quality is possible, and there is room for improvement of methodological quality as well. Authors updating reviews should address identified methodological or reporting weaknesses. We recommend giving full attention to both quality domains when updating SRs.

  6. MEASURING IMPROVEMENT IN WELL-BEING

    OpenAIRE

    Satya R. Chakravarty; Mukherjee, Diganta

    1999-01-01

    A measure of improvement in well-being aggregates increments in the attainment levels of different quality-of-life attributes. This paper first characterizes the entire family of additive improvement indices, where additivity requires that the overall index can be expressed as the arithmetic average of attribute-wise indices. Then we suggest a general family of nonadditive improvement indices, of which the Tsui (1996) index becomes a particular case. Both additive and nonadditive measures are...

  7. Sensible organizations: technology and methodology for automatically measuring organizational behavior.

    Science.gov (United States)

    Olguin Olguin, Daniel; Waber, Benjamin N; Kim, Taemie; Mohan, Akshay; Ara, Koji; Pentland, Alex

    2009-02-01

    We present the design, implementation, and deployment of a wearable computing platform for measuring and analyzing human behavior in organizational settings. We propose the use of wearable electronic badges capable of automatically measuring the amount of face-to-face interaction, conversational time, physical proximity to other people, and physical activity levels in order to capture individual and collective patterns of behavior. Our goal is to be able to understand how patterns of behavior shape individuals and organizations. By using on-body sensors in large groups of people for extended periods of time in naturalistic settings, we have been able to identify, measure, and quantify social interactions, group behavior, and organizational dynamics. We deployed this wearable computing platform in a group of 22 employees working in a real organization over a period of one month. Using these automatic measurements, we were able to predict employees' self-assessments of job satisfaction and their own perceptions of group interaction quality by combining data collected with our platform and e-mail communication data. In particular, the total amount of communication was predictive of both of these assessments, and betweenness in the social network exhibited a high negative correlation with group interaction satisfaction. We also found that physical proximity and e-mail exchange had a negative correlation of r = -0.55 (p < 0.01), which has far-reaching implications for past and future research on social networks.
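
    One of the network measures reported — betweenness centrality — is straightforward to compute from interaction data; the badge dataset itself is not public, so the graph below is a toy example:

        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([("ana", "ben"), ("ben", "carl"), ("carl", "dee"),
                          ("ben", "dee"), ("dee", "eli"), ("eli", "ana")])

        betweenness = nx.betweenness_centrality(G)
        # Per the study, higher betweenness correlated negatively with
        # group interaction satisfaction.
        print(sorted(betweenness.items(), key=lambda kv: -kv[1]))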

  8. Methodological issues in measuring innovation performance of spatial units

    NARCIS (Netherlands)

    Brenner, T.; Broekel, T.

    2011-01-01

    The innovation performance of regions or nations has been repeatedly measured in the literature. What is missing, however, is a discussion of what innovation performance of a region or nation means. How do regions or nations exactly contribute to the innovation output of firms? And how can this cont…

  9. Practical remarks on the heart rate and saturation measurement methodology

    Science.gov (United States)

    Kowal, M.; Kubal, S.; Piotrowski, P.; Staniec, K.

    2017-05-01

    A surface reflection-based method for measuring heart rate and saturation is introduced as one having a significant advantage over legacy methods, in that it lends itself to use in special applications where a person's mobility is of prime importance (e.g., during a miner's work), excluding the use of traditional clips. A complete ATmega1281-based microcontroller platform is then described for performing the computational tasks of signal processing and wireless transmission. The next section provides remarks on the basic signal processing rules, beginning with raw voltage samples of the converted optical signals, their acquisition, storage, and smoothing. This part ends with practical remarks demonstrating an exponential dependence between the minimum measurable heart rate and the readout resolution at different sampling frequencies, for different cases of averaging depth (in bits). The following section is devoted strictly to heart rate and hemoglobin oxygenation (saturation) measurement with the presented platform, referenced against measurements obtained with a stationary certified pulse oximeter.
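
    The saturation computation behind such reflective sensors is commonly the ratio-of-ratios estimate shown below; the linear calibration constants are textbook placeholders, not values from this platform:

        def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
            # Estimate SpO2 (%) from red/IR photoplethysmogram components.
            r = (ac_red / dc_red) / (ac_ir / dc_ir)   # ratio of ratios
            return 110.0 - 25.0 * r                   # empirical linear calibration

        print(f"SpO2 ~ {spo2_estimate(0.012, 1.30, 0.020, 1.25):.1f} %")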

  10. Measure of Landscape Heterogeneity by Agent-Based Methodology

    Science.gov (United States)

    Wirth, E.; Szabó, Gy.; Czinkóczky, A.

    2016-06-01

    With the rapid increase of the world's population, the efficient food production is one of the key factors of the human survival. Since biodiversity and heterogeneity is the basis of the sustainable agriculture, the authors tried to measure the heterogeneity of a chosen landscape. The EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity, nevertheless exact measurements and calculations apart from statistical parameters (standard deviation, mean), do not really exist. In the present paper the authors' goal is to find an objective, dynamic method that measures landscape heterogeneity. It is achieved with the so called agent-based modelling, where randomly dispatched dynamic scouts record the observed land cover parameters and sum up the features of a new type of land. During the simulation the agents collect a Monte Carlo integral as a diversity landscape potential which can be considered as the unit of the `greening' measure. As a final product of the ABM method, a landscape potential map is obtained that can serve as a tool for objective decision making to support agricultural diversity.

  12. Flux Measurements in Trees: Methodological Approach and Application to Vineyards

    Directory of Open Access Journals (Sweden)

    Francesca De Lorenzi

    2008-03-01

    Full Text Available In this paper a review of two sap flow methods for measuring transpiration in vineyards is presented. The objective of this work is to examine the potential for detecting transpiration in trees in response to environmental stresses, particularly high concentrations of ozone (O3) in the troposphere. The methods described are the stem heat balance and the thermal dissipation probe; advantages and disadvantages of each method are detailed. Applications of both techniques are shown in two large commercial vineyards in Southern Italy (Apulia and Sicily) under a semi-arid climate. Sap flow techniques measure transpiration at the plant scale, and an upscaling procedure is necessary to calculate transpiration at the whole-stand level. Here a general technique for linking the value of transpiration at the plant level to the canopy value is presented, based on experimental relationships between transpiration and biometric characteristics of the trees. In both vineyards, transpiration measured by sap flow methods compares well with evapotranspiration measured by micrometeorological techniques at the canopy scale. Moreover, the soil evaporation component has been quantified. In conclusion, comments are given about the suitability of sap flow methods for studying the interactions between trees and ozone.
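
    The plant-to-stand upscaling step can be as simple as scaling measured per-plant sap flow by a biometric proxy and planting density; leaf area is used here purely as an assumed example of such a biometric relationship, with invented numbers:

        sap_flow_l_per_day = 2.4     # measured transpiration of a sample vine (assumed)
        leaf_area_sample_m2 = 3.0    # biometric characteristic of that vine (assumed)
        mean_leaf_area_m2 = 2.6      # stand-average leaf area per vine (assumed)
        vines_per_ha = 4000          # planting density (assumed)

        stand_t_l_per_ha = (sap_flow_l_per_day / leaf_area_sample_m2
                            * mean_leaf_area_m2 * vines_per_ha)
        # 1 mm of water over 1 ha equals 10,000 L
        print(f"stand transpiration ~ {stand_t_l_per_ha / 10000:.2f} mm/day")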

  13. METHODOLOGY FOR EVALUATING AND IMPROVING THE EFFECTIVENESS OF THE STATE DEFENSE CAPABILITIES

    Directory of Open Access Journals (Sweden)

    Vikulov S. F.

    2015-04-01

    Full Text Available In the article we develop a methodology for assessing the effectiveness of various types of military-economic activity, the effectiveness of which determines the defense capability of the state; we also reveal the economic substance of the defense capability and combat readiness of troops, justify approaches to determining the significance of ongoing military-oriented activities, and identify the specific conditions of defense industries that leave their mark on cost-effectiveness in the production of material resources for the Russian military service. The analysis of the flow of monetary resources through the estimated units indicates that the current practice of planning and cost accounting is not fully adapted to military-economic analysis of measures to establish the defense capability of the state, and this makes it difficult to study and to make recommendations for improving the efficiency of the use of military-economic resources. The authors propose a unified system of methodological support for planning and for estimating the amount of consumed resources (regardless of funding source) and the attainable defense results. The methodical resolution of this problem is based on the use of a program-target approach to modeling the processes of the defense-industrial complex and the structural elements of the military organization. The article substantiates that the evaluation of the effectiveness of defense capabilities should be carried out not only by the criterion of "cost - effect" but with all manifestations of the time factor taken into account; in its most general form, the criterion of effectiveness should be the triad "cost - effect - time." An important result of this study is also the development of an integral index for evaluating the effectiveness of the use of military and economic…

  14. Measurement of backscatter factor for diagnostic radiology: methodology and uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Rosado, P.H.G.; Nogueira, M.D.S.; Squair, P.L.; Da Silva, T.A. [Centro de Desenvolvimento da Tecnoogia Nuclear (CDTN/CNEN) 30123-970, Minas Gerais (Brazil)]. e-mail: phgr@cdtn.br

    2007-07-01

    Full text: Backscatter factors were experimentally determined for the diagnostic X-ray qualities recommended by the International Electrotechnical Commission (IEC) for primary beams (RQR). The Harshaw LiF-100H thermoluminescent dosemeters used for determining the backscatter were calibrated against an ionization chamber traceable to the National Metrology Laboratory. A 300 mm x 300 mm x 150 mm PMMA slab phantom was used for deep-dose measurements. To perform the in-phantom measurements, the dosemeters were placed on the central axis of the X-ray beam at five different depths d in the phantom (5, 10, 15, 25 and 35 mm) along the beam direction. The typical combined standard uncertainty of the backscatter factor value was 6%. The main sources of uncertainty were the calibration procedure, the TLD dosimetry, and the use of deep-dose curves. (Author)

  15. Powerplant productivity improvement study: demonstration of the DOE/MRI methodology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1978-11-17

    A methodology, distributed by DOE, may be used as a potential tool for the utility industry to assess and evaluate productivity and reliability problems of existing and future units. The model can be used to assess the impact of an improvement project or corrective action on unit productivity parameters such as heat rate, capacity factor, availability, or equivalent availability. The methodology analyzes: the effect of component redundancy on unit equivalent availability; the effect of component reliability on unit equivalent availability; and a component's outage trend to project the component's future outage rate. The first purpose of this study is to illustrate for the utilities how to apply the methodology. Thus, this report contains the analysis performed by applying the methodology at three units of Illinois power plants. The units evaluated were Wood River 5, Quad Cities 1, and Quad Cities 2. A total of 8 improvement projects were evaluated and all outage data and equations used are included in order to facilitate utility reproduction of the given results. The second purpose is to evaluate the methodology as a useful tool for individual utilities; a number of modifications are suggested.

  16. Measure to Succeed: How to Improve Employee Participation in Continuous Improvement

    OpenAIRE

    Daniel Jurburg; Elisabeth Viles; Martin Tanco; Ricardo Mateo; Alvaro Lleó

    2016-01-01

    Purpose: Achieving employee participation in continuous improvement (CI) systems is considered as one of the success factors for the sustainability of those systems. Yet, it is also very difficult to obtain because of the interaction of many critical factors that affect employee participation. Therefore, finding ways of measuring all these critical factors can help practitioners manage the employee participation process accordingly. Design/methodology/approach: Based upon the existing lit...

  17. Validation of SEACAB Methodology with Frascati (FNG) Photon Dose Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tore, C.; Ortego, P.; Rodriguez Rivada, A.

    2014-07-01

    In the operation of the International Thermonuclear Experimental Reactor (ITER), correct estimation of the gamma dose rate produced by the structural materials after shutdown is one of the important safety parameters for hands-on maintenance. SEACAB, a rigorous 2-step (R2S) computational method, has been developed for the calculation of residual dose in 3-D geometry with the use of MCNP5 and of the ACAB (ACtivation ABacus) inventory code. Being essentially modular, the method is very efficient in its hardware requirements. Starting from a single MCNP5 run, it permits a progressive improvement in the spatial detail of the material layers for the activation calculation and obtains separate source distributions for the isotopes contributing to the photon dose. (Author)

  18. Analysis of bandwidth measurement methodologies over WLAN systems

    CERN Document Server

    Portoles-Comeras, Marc; Mangues-Bafalluy, Josep; Domingo-Pascual, Jordi

    2009-01-01

    WLAN devices have become a fundamental component of today's network deployments. However, even though traditional networking applications run mostly unchanged over wireless links, the actual interaction between these applications and the dynamics of wireless transmissions is not yet fully understood. An important example of such applications is bandwidth estimation tools. This area has become a mature research topic with well-developed results. Unfortunately, recent studies have shown that applying these results to WLAN links is not straightforward. The main reason for this is that the assumptions made in developing bandwidth measurement tools no longer hold in the presence of wireless links (e.g., non-FIFO scheduling). This paper builds on these observations, and its main goal is to analyze the interaction between probe packets and WLAN transmissions in bandwidth estimation processes. The paper proposes an analytical model that better accounts for the particularities of WLAN links. The mod...

  19. An efficient and improved methodology for the screening of industrially valuable xylano-pectino-cellulolytic microbes.

    Science.gov (United States)

    Singh, Avtar; Kaur, Amanjot; Dua, Anita; Mahajan, Ritu

    2015-01-01

    Xylano-pectino-cellulolytic enzymes are valuable enzymes of the industrial sector. In an earlier study, we reported a novel and cost-effective methodology for the qualitative screening of cellulase-free xylano-pectinolytic microorganisms, replacing the commercial, highly expensive substrates with agricultural residues; however, that methodology yielded microorganisms with xylanolytic, pectinolytic, cellulolytic, xylano-pectinolytic, xylano-cellulolytic, pectino-cellulolytic, and xylano-pectino-cellulolytic potential. The probability of obtaining the desired combination was low, so efforts were made to further improve this cost-effective methodology to obtain a high yield of microbes producing the desired combination of enzymes. By including multiple enrichment steps in sequence, using only low-cost substrates and no nutrient media until the primary screening stage, this improved protocol yielded only the desired microorganisms, those with xylano-pectino-cellulolytic activity. Using this rapid, efficient, cost-effective, and improved methodology, microbes with the required combination of enzymes can be obtained, and the probability of isolating the desired microorganisms is 100 percent. This is the first report presenting a methodology for the isolation of xylano-pectino-cellulolytic microorganisms at low cost and in less time.

  20. Calf venous compliance measured by venous occlusion plethysmography: methodological aspects.

    Science.gov (United States)

    Skoog, Johan; Zachrisson, Helene; Lindenberger, Marcus; Ekman, Mikael; Ewerman, Lea; Länne, Toste

    2015-02-01

    Calf venous compliance (C_calf) is commonly evaluated with venous occlusion plethysmography (VOP) during a standard cuff deflation protocol. However, the technique relies on two not previously validated assumptions concerning thigh cuff pressure (P_cuff) transmission and the impact of net fluid filtration (F_filt) on C_calf. The aim was to validate VOP in the lower limb and to develop a model to correct for F_filt during VOP. Strain-gauge technique was used to study calf volume changes in 15 women and 10 age-matched men. A thigh cuff was inflated to 60 mmHg for 4 and 8 min with a subsequent decrease of 1 mmHg/s. Intravenous pressure (P_iv) was measured simultaneously. C_calf was determined with the commonly used equation [Compliance = β1 + 2β2 × P_cuff] describing the pressure-compliance relationship. A model was developed to identify and correct for F_filt. Transmission of P_cuff to P_iv was 100%. The decrease in P_cuff correlated well with the P_iv reduction (r = 0.99, P < 0.001). Overall, our model showed that C_calf was underestimated when F_filt was not accounted for (all P < 0.01). F_filt was higher in women (P < 0.01) and showed a more pronounced effect on C_calf compared to men (P < 0.05). The impact of F_filt was similar during 4- and 8-min VOP. P_cuff is an adequate substitute for P_iv in the lower limb. F_filt is associated with an underestimation of C_calf, and differences in the effect of F_filt during VOP can be accounted for with the correction model. Thus, our model seems to be a valuable tool in future studies of venous wall function.
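
    The bracketed pressure-compliance equation is the derivative of the quadratic fit of the cuff pressure-volume curve that is conventional in VOP studies; writing the fit out explicitly (notation assumed here) shows where the coefficients come from:

      V(P_cuff) = β0 + β1 × P_cuff + β2 × P_cuff^2
      C_calf = dV/dP_cuff = β1 + 2β2 × P_cuff

    so β1 and β2 are simply the linear and quadratic coefficients of the fitted volume curve, and compliance varies linearly with cuff pressure.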

  1. Methodologies for Measuring Judicial Performance: The Problem of Bias

    Directory of Open Access Journals (Sweden)

    Jennifer Elek

    2014-12-01

    Full Text Available Concerns about gender and racial bias in the survey-based evaluations of judicial performance common in the United States have persisted for decades. Consistent with a large body of basic research in the psychological sciences, recent studies confirm that the results from these JPE surveys are systematically biased against women and minority judges. In this paper, we explain the insidious manner in which performance evaluations may be biased, describe some techniques that may help to reduce expressions of bias in judicial performance evaluation surveys, and discuss the potential problem such biases may pose in other common methods of performance evaluation used in the United States and elsewhere. We conclude by highlighting the potential adverse consequences of judicial performance evaluation programs that rely on biased measurements.

  2. A review of fetal volumetry: the need for standardization and definitions in measurement methodology.

    Science.gov (United States)

    Ioannou, C; Sarris, I; Salomon, L J; Papageorghiou, A T

    2011-12-01

    Volume charts of fetal organs and structures vary considerably among studies. This review identified 42 studies reporting normal volumes, namely for fetal brain (n = 3), cerebellum (n = 4), liver (n = 6), femur (n = 2), lungs (n = 15), kidneys (n = 3) and first-trimester embryo (n = 9). The differences among median volumes were expressed both in percentage form and as standard deviation scores. Wide discrepancies in reported normal volumes make it extremely difficult to diagnose pathological organ growth reliably. Given its magnitude, this variation is likely to be due to inconsistencies in volumetric methodology, rather than population differences. Complicating factors include the absence of clearly defined anatomical landmarks for measurement; inadequate assessment and reporting of method repeatability; the inherent difficulty in validating fetal measurements in vivo against a reference standard; and a multitude of mutually incompatible three-dimensional (3D) imaging formats and software measuring tools. An attempt to standardize these factors would improve intra- and inter-researcher agreement concerning reported volumetric measures, would allow generalization of reference data across different populations and different ultrasound systems, and would allow quality assurance in 3D fetal biometry. Failure to ensure a quality control process may hamper the wide use of 3D ultrasound. Copyright © 2011 ISUOG. Published by John Wiley & Sons, Ltd.
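
    For reference, a standard deviation score of the kind used to express those differences is computed as (notation assumed here):

      z = (V_study − V_reference) / SD_reference

    i.e., the deviation of a study's median volume from the reference median, in units of the reference population's standard deviation at the same gestational age.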

  3. THE ASSESSMENT METHODOLOGIES PTELR, ADRI AND CAE – THREE METHODOLOGIES FOR COORDINATING THE EFFORTS TO IMPROVE THE ORGANIZATIONAL PROCESSES TO ACHIEVE EXCELLENCE

    Directory of Open Access Journals (Sweden)

    Cristina Raluca POPESCU

    2015-07-01

    Full Text Available In the paper “The Assessment Methodologies PTELR, ADRI and CAE – Three Methodologies for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence”, the authors present the basic features of the assessment methodologies PTELR, ADRI and CAE, which are designed to coordinate efforts to improve organizational processes in order to achieve excellence. In the introduction, the authors present the general background concerning the performance of business management processes and the importance of achieving excellence and of correctly assessing and evaluating it. Aspects such as quality, quality control, quality assurance, performance and excellence are discussed in the context generated by globalization, new technologies and new business models. Methods employed to ensure and maintain quality, continuous improvement, and total quality management are further pillars of the research. In the body of the paper, the authors describe the characteristics of the assessment methodologies PTELR, ADRI and CAE from a theoretical point of view.

  4. Improving Students' Understanding of Quantum Measurement

    CERN Document Server

    Zhu, Guangtian

    2016-01-01

    We describe the difficulties advanced undergraduate and graduate students have with quantum measurement. To reduce these difficulties, we have developed research-based learning tools such as the Quantum Interactive Learning Tutorial (QuILT) and peer instruction tools. A preliminary evaluation shows that these learning tools are effective in improving students' understanding of concepts related to quantum measurement.

  5. Improved extraction of information in bioimpedance measurements

    Science.gov (United States)

    Min, Mart; Paavle, Toivo

    2013-04-01

    A wideband bioimpedance measurement method is proposed that can enhance the interpretation of measurement results thanks to the improved resolution of monitoring. At the same time, the corresponding measurement system uses a binary chirp waveform as the excitation signal, which simplifies the signal-processing hardware and does not require sophisticated software. It is shown that binary chirp excitation has some essential advantages over its counterpart, maximum length sequence (MLS) excitation.
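
    A binary chirp of the kind described is simply a linear frequency sweep hard-limited to two levels, which is why it can be generated with very simple digital hardware. A minimal sketch follows; the sweep band, duration and sample rate are illustrative assumptions, not values from the paper.

      import numpy as np
      from scipy.signal import chirp

      fs = 1e6                        # sample rate, Hz (assumed)
      t = np.arange(0, 10e-3, 1/fs)   # 10 ms excitation window (assumed)
      analog = chirp(t, f0=1e3, t1=t[-1], f1=100e3, method='linear')
      binary = np.sign(analog)        # hard limiting gives the two-level waveform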

  6. A call to improve sampling methodology and reporting in young novice driver research.

    Science.gov (United States)

    Scott-Parker, B; Senserrick, T

    2017-02-01

    Young drivers continue to be over-represented in road crash fatalities despite a multitude of research, communication and intervention efforts. Evidence-based improvement depends to a great extent upon the quality of research methodology and its reporting, given the known limitations of the peer-review process. The aim of the current research was to review the scope of research methodologies applied in 'young driver' and 'teen driver' research, and their reporting, in four peer-review journals in the field between January 2006 and December 2013. In total, 806 articles were identified and assessed. Reporting omissions included participant gender (11% of papers), response rates (49%), retention rates (39%) and information regarding incentives (44%). Greater breadth and specific improvements in study designs and reporting are thereby identified as a means to further advance the field.

  7. Systematic review of the application of quality improvement methodologies from the manufacturing industry to surgical healthcare.

    Science.gov (United States)

    Nicolay, C R; Purkayastha, S; Greenhalgh, A; Benn, J; Chaturvedi, S; Phillips, N; Darzi, A

    2012-03-01

    The demand for the highest-quality patient care coupled with pressure on funding has led to the increasing use of quality improvement (QI) methodologies from the manufacturing industry. The aim of this systematic review was to identify and evaluate the application and effectiveness of these QI methodologies to the field of surgery. MEDLINE, the Cochrane Database, Allied and Complementary Medicine Database, British Nursing Index, Cumulative Index to Nursing and Allied Health Literature, Embase, Health Business(™) Elite, the Health Management Information Consortium and PsycINFO(®) were searched according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement. Empirical studies were included that implemented a described QI methodology to surgical care and analysed a named outcome statistically. Some 34 of 1595 articles identified met the inclusion criteria after consensus from two independent investigators. Nine studies described continuous quality improvement (CQI), five Six Sigma, five total quality management (TQM), five plan-do-study-act (PDSA) or plan-do-check-act (PDCA) cycles, five statistical process control (SPC) or statistical quality control (SQC), four Lean and one Lean Six Sigma; 20 of the studies were undertaken in the USA. The most common aims were to reduce complications or improve outcomes (11), to reduce infection (7), and to reduce theatre delays (7). There was one randomized controlled trial. QI methodologies from industry can have significant effects on improving surgical care, from reducing infection rates to increasing operating room efficiency. The evidence is generally of suboptimal quality, and rigorous randomized multicentre studies are needed to bring evidence-based management into the same league as evidence-based medicine. Copyright © 2011 British Journal of Surgery Society Ltd. Published by John Wiley & Sons, Ltd.

  8. Rapid Process Optimization: A Novel Process Improvement Methodology to Innovate Health Care Delivery.

    Science.gov (United States)

    Wiler, Jennifer L; Bookman, Kelly; Birznieks, Derek B; Leeret, Robert; Koehler, April; Planck, Shauna; Zane, Richard

    2016-03-26

    Health care systems have utilized various process redesign methodologies to improve care delivery. This article describes the creation of a novel process improvement methodology, Rapid Process Optimization (RPO). This system was used to redesign emergency care delivery within a large academic health care system, which resulted in decreases in: (1) door-to-physician time (Department A: 54 minutes pre vs 12 minutes 1 year post; Department B: 20 minutes pre vs 8 minutes 3 months post), (2) overall length of stay (Department A: 228 vs 184; Department B: 202 vs 192), (3) discharge length of stay (Department A: 216 vs 140; Department B: 179 vs 169), and (4) left-without-being-seen rates (Department A: 5.5% vs 0.0%; Department B: 4.1% vs 0.5%), despite a 47% increase in census at Department A (34 391 vs 50 691) and a 4% increase at Department B (8404 vs 8753). The novel RPO process improvement methodology can inform and guide successful care redesign.

  9. Quick Green Scan: A Methodology for Improving Green Performance in Terms of Manufacturing Processes

    Directory of Open Access Journals (Sweden)

    Aldona Kluczek

    2017-01-01

    Full Text Available The heating sector has begun implementing technologies and practices to tackle the environmental and socio-economic problems caused by its production processes. The purpose of this paper is to develop a methodology, the “Quick-Green-Scan”, that caters to decision-makers' need for a quick assessment when improving green manufacturing performance in companies that produce heating devices. The study uses a structured approach that integrates Life Cycle Assessment-based indicators, a framework and linguistic scales (fuzzy numbers) to evaluate the extent of greening of the enterprise. The evaluation criteria and indicators are closely related to the current state of technology, which can be improved. The proposed methodology is designed to answer the question of whether a company acts on the opportunity to be green and whether these actions contribute towards greening, maintain the status quo or move away from a green outcome. Results show that applying the proposed process improvements helps move the facility towards being a green enterprise. Moreover, the methodology, being particularly quick and simple, is a practical tool for benchmarking, not only in the heating industry, but also in providing comparisons for facility performance in other manufacturing sectors.

  10. A methodology for evaluating air pollution strategies to improve the air quality in Mexico City

    Energy Technology Data Exchange (ETDEWEB)

    Barrera-Roldan, A.S.; Guzman, F. [Instituto Mexicano de Petroleo, Mexico City (Mexico); Hardie, R.W.; Thayer, G.R. [Los Alamos National Lab., NM (United States)

    1995-05-01

    The Mexico City Air Quality Research Initiative has developed a methodology to assist decision makers in determining optimum pollution control strategies for atmospheric pollutants. The methodology introduces both objective and subjective factors into the comparison of strategies for improving air quality. Strategies, or groups of options, are first selected using linear programming; these strategies are then compared using Multi-Attribute Decision Analysis. The decision tree for the Multi-Attribute Decision Analysis was generated by a panel of experts representing the organizations in Mexico responsible for formulating air quality policy. Three sample strategies were analyzed using the methodology: one to reduce ozone by 33% using the most cost-effective group of options, a second to reduce ozone by 43% using the most cost-effective group of options, and a third to reduce ozone by 43% while emphasizing the reduction of emissions from industrial sources. Of the three, the analysis indicated that strategy 2 would be the preferred strategy for improving air quality in Mexico City.
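
    The option-selection step can be pictured as a small linear program: choose the adoption level of each control option so that total cost is minimized while the summed emission reduction meets the target. The sketch below uses invented option data and continuous adoption levels purely for illustration; the actual option list, costs and constraints of the initiative are not reproduced here.

      from scipy.optimize import linprog

      cost = [120.0, 80.0, 200.0]    # cost of each control option (arbitrary units)
      reduction = [10.0, 6.0, 18.0]  # ozone-precursor reduction per option, %
      target = 33.0                  # required total reduction, %

      # minimize cost.x subject to reduction.x >= target, 0 <= x <= 1
      res = linprog(c=cost,
                    A_ub=[[-r for r in reduction]], b_ub=[-target],
                    bounds=[(0, 1)] * 3)
      print(res.x)                   # adoption level of each option

    A production version would add integer (adopt/do-not-adopt) constraints; the continuous relaxation is used here only to keep the sketch short.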

  11. Radionuclide measurements, via different methodologies, as tool for geophysical studies on Mt. Etna

    Energy Technology Data Exchange (ETDEWEB)

    Morelli, D., E-mail: daniela.morelli@ct.infn.it [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Istituto Nazionale di Fisica Nucleare- Sezione di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Imme, G. [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Istituto Nazionale di Fisica Nucleare- Sezione di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Altamore, I.; Cammisa, S. [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Giammanco, S. [Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Catania, Piazza Roma, 2, I-95123 Catania (Italy); La Delfa, S. [Dipartimento di Scienze Geologiche, Universita di Catania, Corso Italia,57 I-95127 Catania (Italy); Mangano, G. [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Neri, M. [Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Catania, Piazza Roma, 2, I-95123 Catania (Italy); Patane, G. [Dipartimento di Scienze Geologiche, Universita di Catania, Corso Italia,57 I-95127 Catania (Italy)

    2011-10-01

    Natural radioactivity measurements represent an interesting tool for studying geodynamic events and soil geophysical characteristics. To this end, in recent years we carried out several radionuclide monitoring campaigns in both the volcanic and tectonic areas of eastern Sicily. In particular, we report in-soil radon investigations in a tectonic area, including both laboratory and in-situ measurements, applying three different methodologies based on both active and passive detection systems. The active detection devices consisted of solid-state silicon detectors equipped in portable systems for short-time measurements and for long-time monitoring. The passive technique consisted of solid-state nuclear track detectors (SSNTD), CR-39 type, and allowed integrated measurements. The performances of the three methodologies were compared across the different kinds of monitoring. In general, the results obtained with the three methodologies agree with each other and reflect the tectonic settings of the investigated area.

  12. Performance Evaluation and Measurement of the Organization in Strategic Analysis and Control: Methodological Aspects

    Directory of Open Access Journals (Sweden)

    Živan Ristić

    2006-12-01

    Full Text Available Information acquired by measuring and evaluation is a necessary condition for good decision-making in strategic management. This work deals with: (a) methodological aspects of evaluation (kinds of evaluation, meta-evaluation) and measurement (the supposition of isomorphism in measurement, kinds and levels of measurement, errors in measurement, and the basic characteristics of measurement); (b) evaluation and measurement of the potential and accomplishments of the organization in the Kaplan-Norton perspectives (the learning and development, internal processes, consumer/user, and financial perspectives); (c) systems and IT solutions for evaluating and measuring the performance of the organization in strategic analysis and control.

  13. Theoretical and Methodological Challenges in Measuring Instructional Quality in Mathematics Education Using Classroom Observations

    Science.gov (United States)

    Schlesinger, Lena; Jentsch, Armin

    2016-01-01

    In this article, we analyze theoretical as well as methodological challenges in measuring instructional quality in mathematics classrooms by examining standardized observational instruments. At the beginning, we describe the results of a systematic literature review for determining subject-specific aspects measured in recent lesson studies in…

  14. Using chaos to improve measurement precision

    Institute of Scientific and Technical Information of China (English)

    何斌; 杨灿军; 周银生; 陈鹰

    2002-01-01

    If the measured signals were input to the chaotic dynamic system as initial parameters, the system outputs might be in a steady state, a periodic state or a chaotic state. If the outputs of the chaotic dynamic system are controlled into periodic states, the periodic numbers change most sensitively with the signals. Our novel method is to add chaotic dynamic vibration to the measurement or sensor system. The sensitivity and precision of a measurement system can be improved with this method. Chaotic dynamics measurement algorithms are given and their sensitivity to parameters is analyzed in this paper. The effects of noise on the system are also discussed.

  16. Methodology for the collaboration in supply chains with a focus on continuous improvement

    Directory of Open Access Journals (Sweden)

    José Anselmo Mayer

    2016-08-01

    Full Text Available A collaborative relationship between companies in a supply chain makes it possible to improve both the performance and the results of the companies and of the supply chain. Several studies have analyzed supply chains, but few have proposed the application of continuous improvement tools in a collaborative manner within the supply chain. The objective of this work is to present a methodology for collaboration in a supply chain with a focus on continuous improvement. Three case studies were conducted with Brazilian multinational focal companies that manufacture technology-based products. It was seen that relationships, trust, the exchange of information, and the sharing of gains and risks sustain collaborative practices focused on continuous improvement. The proposed methodology considers the need for supplier development, for the monitoring of supplies, and for the development of a partnership for problem solving through the application of continuous improvement tools.

  17. Improvement measures of urban thermal environment

    CERN Document Server

    Takebayashi, Hideki

    2014-01-01

    Maximizing readers' insights into urban and architectural environmental planning with consideration for the thermal environment, this work highlights how various urban heat-island strategies have been developed and their effectiveness in urban areas. Specific measures to combat the urban heat-island phenomenon, including improvement of surface cover, reduction of exhaust heat, and improvement of ventilation, are summarized, and the various heat-island measurement technologies proposed in recent years are organized systematically based on the surface heat budget and the surface boundary layer.

  18. Evaluation and measurement for improvement in service-level quality improvement initiatives.

    Science.gov (United States)

    Russell, Nicholas C C; Wallace, Louise M; Ketley, Diane

    2011-11-01

    The National Health Service (NHS) in England, as with other health services worldwide, currently faces the need to reduce costs and to improve the quality of patient care. Evidence gathered through effective and appropriate measurement and evaluation is essential to achieving this. Through interviews with service improvement managers and analysis of comments in a seminar of NHS staff involved in health service improvement, we found a lack of understanding regarding the definition and methodology of both measurement and evaluation, which decreases the likelihood that NHS staff will be competent to commission or provide these skills. In addition, we highlight the importance of managers assessing their organizations' 'readiness' to undergo change before embarking on a quality improvement (QI) initiative, to ensure that the initiative's impact can be adequately judged. We provide definitions of measurement for improvement and of evaluation, and propose a comparative framework from which to gauge an appropriate approach. Examples of two large-scale QI initiatives are also given, along with descriptions of some of their problems and solutions, to illustrate the use of the framework. We recommend that health service managers use the framework to determine the most appropriate approach to evaluation and measurement for improvement in their context, to ensure that their decisions are evidence based.

  19. Quality Measures for Improving Technology Trees

    Directory of Open Access Journals (Sweden)

    Teemu J. Heinimäki

    2015-01-01

    Full Text Available The quality of technology trees in digital games can be improved by adjusting their structural and quantitative properties. Therefore, there is a demand for recognizing and measuring such properties. Part of the process can be automated: there are properties measurable by computers, and analyses based on the results (and visualizations of them) may help to produce significantly better technology trees, practically without extra workload for humans. In this paper, we introduce useful technology tree properties and novel measuring features implemented in our software tool for manipulating technology trees.

  20. Design and Measurement Methodology for a Sub-picoampere Current Digitiser

    CERN Document Server

    Voulgari, Evgenia; Anghinolfi, Francis; Krummenacher, François; Kayal, Maher

    2015-01-01

    This paper introduces design and measurement techniques that were used in the design and testing of an ASIC for ultra-low current sensing. The aim is to present the limitations in sub-picoampere current measurement and to demonstrate an ASIC that can accurately measure the different sources of leakage current, together with the measurement methodology. The leakage current can then be subtracted or compensated for in order to accurately measure the ultra-low current generated by a sensor/detector. The proposed ASIC can measure currents as low as -50 fA, a value well below similar ASIC implementations.

  1. Quality improvement in neurology: AAN Parkinson disease quality measures

    Science.gov (United States)

    Cheng, E.M.; Tonn, S.; Swain-Eng, R.; Factor, S.A.; Weiner, W.J.; Bever, C.T.

    2010-01-01

    Background: Measuring the quality of health care is a fundamental step toward improving health care and is increasingly used in pay-for-performance initiatives and maintenance of certification requirements. Measure development to date has focused on primary care and common conditions such as diabetes; thus, the number of measures that apply to neurologic care is limited. The American Academy of Neurology (AAN) identified the need for neurologists to develop measures of neurologic care and to establish a process to accomplish this. Objective: To adapt and test the feasibility of a process for independent development by the AAN of measures for neurologic conditions for national measurement programs. Methods: A process that has been used nationally for measure development was adapted for use by the AAN. Topics for measure development are chosen based upon national priorities, available evidence base from a systematic literature search, gaps in care, and the potential impact for quality improvement. A panel composed of subject matter and measure development methodology experts oversees the development of the measures. Recommendation statements and their corresponding level of evidence are reviewed and considered for development into draft candidate measures. The candidate measures are refined by the expert panel during a 30-day public comment period and by review by the American Medical Association for Current Procedural Terminology (CPT) II codes. All final AAN measures are approved by the AAN Board of Directors. Results: Parkinson disease (PD) was chosen for measure development. A review of the medical literature identified 258 relevant recommendation statements. A 28-member panel approved 10 quality measures for PD that included full specifications and CPT II codes. Conclusion: The AAN has adapted a measure development process that is suitable for national measurement programs and has demonstrated its capability to independently develop quality measures.

  2. Oxygen measurements to improve singlet oxygen dosimetry

    Science.gov (United States)

    Kim, Michele M.; Penjweini, Rozhin; Ong, Yi Hong; Finlay, Jarod C.; Zhu, Timothy C.

    2017-02-01

    Photodynamic therapy (PDT) involves interactions between three main components: light fluence, photosensitizer concentration, and oxygenation. To date, singlet oxygen explicit dosimetry (SOED) has focused on the first two of these components. The macroscopic model used to calculate reacted singlet oxygen has previously assumed a fixed initial ground-state oxygen concentration. A phosphorescence-based oxygen probe was used to measure ground-state oxygen concentration throughout treatment in mice bearing radiation-induced fibrosarcoma tumors. Photofrin-, BPD-, and HPPH-mediated PDT was performed on the mice. Model-calculated and measured oxygen were compared to evaluate the macroscopic model as well as the photochemical parameters involved, and oxygen measurements at various depths were compared to calculated values. Furthermore, we explored the use of noninvasive diffuse correlation spectroscopy (DCS) to measure tumor blood flow changes in response to PDT, to improve the model calculation of reacted singlet oxygen. Mice were monitored after treatment to assess the effect of oxygenation on long-term recurrence-free survival, as well as the efficacy of using reacted singlet oxygen as a predictive measure of outcome. Measurement of oxygenation during treatment helps to improve SOED and to confirm the photochemical parameters involved in the macroscopic model. The use of DCS to predict oxygenation changes was also investigated.

  3. Innovative Methodologies for thermal Energy Release Measurement: case of La Solfatara volcano (Italy)

    Science.gov (United States)

    Marfe`, Barbara; Avino, Rosario; Belviso, Pasquale; Caliro, Stefano; Carandente, Antonio; Marotta, Enrica; Peluso, Rosario

    2015-04-01

    This work is devoted to improving knowledge of the parameters that control the heat flux anomalies associated with the diffuse degassing processes of volcanic and hydrothermal areas. The methodologies currently used to measure heat flux (i.e. CO2 flux or temperature gradient) are either inefficient or ineffective, and are unable to detect short- to medium-term (days to months) variation trends in the heat flux. A new method, based on the use of thermal imaging cameras, has been applied to estimate the heat flux and its time variations. This approach allows faster heat flux measurement than already accredited methods, improving the definition of the activity state of a volcano and allowing a better assessment of the related hazard and risk mitigation. The idea is to extrapolate the heat flux from the ground surface temperature which, in a purely conductive regime, is directly correlated to the shallow temperature gradient. We use thermal imaging cameras, at short distances (meters to hundreds of meters), to quickly obtain a map of areas with thermal anomalies and a measure of their temperature. Preliminary studies were carried out throughout the whole of the La Solfatara crater in order to investigate a possible correlation between the surface temperature and the shallow thermal gradient. We used a FLIR SC640 thermal camera and K-type thermocouples to make the two measurements at the same time. Results suggest a good correlation between the shallow temperature gradient ΔTs and the background-corrected surface temperature Ts, and although the campaigns took place over a period of a few years, this correlation seems to be stable over time. This is an extremely motivating result for further development of a measurement method based only on the use of a short-range thermal imaging camera. Surveys with thermal cameras may be manually done using a tripod to take thermal images of small contiguous areas and then joining

  4. Integral methodology simulation support for the improvement of production systems job shop. Metalworking applications in SMES

    Directory of Open Access Journals (Sweden)

    Jaime Alberto Giraldo García

    2010-05-01

    Full Text Available Metalworking companies represent one of the strategic sectors of the regional economy of the Caldas department in Colombia; the sector accounts for 31% of the department's industrial establishments and 29% of its industrial employment, according to DANE (Colombian State Statistical Department) data from 2005, and also exports to Andean countries. However, preliminary studies conducted with 57% of the entrepreneurs in this sector (excluding micro companies and family businesses) have revealed serious structural weaknesses (technology, processing, installations and infrastructure) and organizational weaknesses (production planning, quality systems) in these organizations' production systems. This paper disseminates to the academic community the results of implementing a comprehensive methodology for improving the production system of a pilot company from this sector. Following universally accepted methodology for discrete simulation studies, an experimental framework is proposed for improving the levels reached by the system regarding its competing priorities; it uses sequential bifurcation, factorial design and response surface experimentation, based on defining and weighting the competing priorities the company should achieve. The improvements in the pilot company's production system priorities are presented in terms of an effectiveness index (EI), which rose from 1.84 to 2.46 by the end of the study.

  5. Quantification of the effects on greenhouse gas emissions of policies and measures. Methodologies report

    Energy Technology Data Exchange (ETDEWEB)

    Forster, D.; Falconer, A. [AEA Technology, Didcot (United Kingdom); Buttazoni, M.; Greenleaf, J. [Ecofys, Utrecht (Netherlands); Eichhammer, W. [Fraunhofer Institut fuer System- und Innovationsforschung ISI, Karlsruhe (DE)] (and others)

    2009-12-15

    The primary aim of the report is to describe the methodologies that have been developed during the project to evaluate, ex-post, the impact of selected EU Climate Change Policies and Measures (PAMS) on greenhouse gas (GHG) emissions. The secondary aim of the document is to provide guidance to Member State (MS) representatives on ex-post evaluation, and to provide references and tools that facilitate the implementation of a consistent approach across the EU27 countries. The focus of the guidance is on approaches to evaluating the effectiveness of the policies and measures. Evaluating the efficiency of policies is another important component of policy evaluation, but is considered only to a limited extent within these guidelines. Section 2 discusses the broad methodological issues associated with ex-post evaluation, illustrating the main approaches available and their strengths and weaknesses. Section 3 describes the methodological framework proposed for the evaluation of EU Climate Change Policies, providing an explanation of the key decisions informing the approach and the actual guidelines for the policy evaluation of individual directives. Section 4 includes policy evaluation guidelines for a large number of different EU climate change policies. Section 5 includes concluding remarks on the role the guidelines could play in EU and MS climate change policy and on their possible future evolution. The Appendices comprise (1) a working paper on methodological issues related to the calculation of emission factors, and (2) case study applications of a Tier 3 methodology.

  6. An improved methodology for dynamic modelling and simulation of electromechanically coupled drive systems: An experimental validation

    Indian Academy of Sciences (India)

    Nuh Erdogan; Humberto Henao; Richard Grisel

    2015-10-01

    The complexity of electromechanically coupled drive systems (ECDSs), specifically electrical drive systems, makes studying them in their entirety challenging, since they consist of elements of diverse nature: electrical, electronic and mechanical. This is a real struggle for the engineers who want to design and implement such systems with high performance, efficiency and reliability. For this purpose, engineers need a tool capable of modelling and/or simulating components of this diverse nature within the ECDS. However, a majority of the available tools are limited in their capacity to describe the characteristics of such components sufficiently. To overcome this difficulty, this paper first proposes an improved methodology for the modelling and simulation of ECDSs. The approach is based on using domain-based simulators individually, namely electrical- and mechanical-part simulators, and integrating them in a co-simulation. As for the modelling of the drive machine, a finely tuned dynamic model is developed by taking the saturation effect into account. In order to validate the developed model as well as the proposed methodology, an industrial ECDS is tested experimentally. The experimental and simulation results are then compared to demonstrate the accuracy of the developed model and the relevance of the proposed methodology.

  7. A methodology for the analysis and improvement of a firm´s competitiveness

    Directory of Open Access Journals (Sweden)

    Jose Celso Contador

    2006-01-01

    Full Text Available This paper presents a new methodology for the analysis of a group of companies, aiming at explaining and increasing a firm's competitiveness. Based on the model of the fields and weapons of the competition, the methodology distinguishes between business and operational competitive strategies. The former consists of some of the 15 fields of the competition, and the latter consists of the weapons of the competition. Competitiveness is explained through the application of several mathematical variables. The influence of the competitive strategies is statistically evaluated using the Wilcoxon-Mann-Whitney non-parametric test, the t-test, and Pearson's correlation. The methodology was applied to companies belonging to the textile pole of Americana; one of the conclusions reached is that competitiveness is explained by operational strategy rather than by business strategy. Therefore, to improve competitiveness, a company must intensify its focus on the weapons that are relevant to the fields where it has decided to compete.

  8. Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?

    Science.gov (United States)

    Brondani, Mario; He, Sarah

    2013-01-01

    Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…

  9. Eureka: A methodology for measuring bandwidth usage of networked games, environments and applications

    NARCIS (Netherlands)

    I. Vaishnavi (Ishan); A. Arefin; D.C.A. Bulterman (Dick); K. Nahrstedt; R. Rivas

    2010-01-01

    This paper presents Eureka: a generic methodology for measuring the instantaneous (per-second) bandwidth usage of networked games and applications at run time. Eureka starts by constructing a priority queue and sending low-priority traffic through it. Then, the application under study i

  10. Computer Science and Technology: Measurement of Interative Computing: Methodology and Application.

    Science.gov (United States)

    Cotton, Ira W.

    This dissertation reports the development and application of a new methodology for the measurement and evaluation of interactive computing, applied to either the users of an interactive computing system or to the system itself, including the service computer and any communications network through which the service is delivered. The focus is on the…

  11. Measuring hand hygiene compliance rates in different special care settings: a comparative study of methodologies

    Directory of Open Access Journals (Sweden)

    Thyago Pereira Magnus

    2015-04-01

    Conclusions: Hand hygiene compliance was reasonably high in these units, as measured by direct observation. However, a lack of correlation with results obtained by other methodologies brings into question the validity of direct observation results, and suggests that periodic audits using other methods may be needed.

  13. Measuring Discrimination in Education: Are Methodologies from Labor and Markets Useful?

    Science.gov (United States)

    Holzer, Harry J.; Ludwig, Jens

    2003-01-01

    Reviews the methodologies most frequently used by social scientists when measuring discrimination in housing and labor markets, assessing their usefulness for analyzing discrimination in education. The paper focuses on standard statistical methods, methods using more complete data, experimental/audit methods, and natural experiments based on…

  14. Extended Axiomatic Conjoint Measurement: A Solution to a Methodological Problem in Studying Fertility-Related Behaviors.

    Science.gov (United States)

    Nickerson, Carol A.; McClelland, Gary H.

    1988-01-01

    A methodology is developed based on axiomatic conjoint measurement to accompany a fertility decision-making model. The usefulness of the model is then demonstrated via an application to a study of contraceptive choice (N=100 male and female family-planning clinic clients). Finally, the validity of the model is evaluated. (TJH)

  15. Improved wavefront reconstruction algorithm from slope measurements

    Science.gov (United States)

    Phuc, Phan Huy; Manh, Nguyen The; Rhee, Hyug-Gyo; Ghim, Young-Sik; Yang, Ho-Soon; Lee, Yun-Woo

    2017-03-01

    In this paper, we propose a wavefront reconstruction algorithm from slope measurements based on a zonal method. In this algorithm, the slope measurement sampling geometry used is the Southwell geometry, in which the phase values and the slope data are measured at the same nodes. The proposed algorithm estimates the phase value at a node point using the slope measurements of eight points around the node, as doing so is believed to result in better accuracy with regard to the wavefront. For optimization of the processing time, a successive over-relaxation method is applied to iteration loops. We use a trial-and-error method to determine the best relaxation factor for each type of wavefront in order to optimize the iteration time and, thus, the processing time of the algorithm. Specifically, for a circularly symmetric wavefront, the convergence rate of the algorithm can be improved by using the result of a Fourier Transform as an initial value for the iteration. Various simulations are presented to demonstrate the improvements realized when using the proposed algorithm. Several experimental measurements of deflectometry are also processed by using the proposed algorithm.
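
    For context, the classic four-neighbour Southwell update that such algorithms build on can be written compactly. The sketch below implements plain successive over-relaxation on that four-point estimate; it is not the authors' eight-point variant, whose weighting is not reproduced here.

      import numpy as np

      def southwell_sor(sx, sy, h=1.0, omega=1.5, iters=500):
          """Zonal reconstruction: each node's phase is re-estimated from its
          neighbours' phases plus the trapezoid-integrated slopes, then relaxed
          with factor omega. sx, sy are slope maps on the Southwell grid."""
          ny, nx = sx.shape
          phi = np.zeros((ny, nx))
          for _ in range(iters):
              for i in range(ny):
                  for j in range(nx):
                      est, n = 0.0, 0
                      if j + 1 < nx:   # east neighbour
                          est += phi[i, j+1] - h * (sx[i, j] + sx[i, j+1]) / 2; n += 1
                      if j - 1 >= 0:   # west neighbour
                          est += phi[i, j-1] + h * (sx[i, j-1] + sx[i, j]) / 2; n += 1
                      if i + 1 < ny:   # north neighbour
                          est += phi[i+1, j] - h * (sy[i, j] + sy[i+1, j]) / 2; n += 1
                      if i - 1 >= 0:   # south neighbour
                          est += phi[i-1, j] + h * (sy[i-1, j] + sy[i, j]) / 2; n += 1
                      phi[i, j] = (1 - omega) * phi[i, j] + omega * est / n
              phi -= phi.mean()        # remove the unobservable piston term
          return phi

    The relaxation factor omega (between 1 and 2) plays exactly the role the abstract describes: it is tuned per wavefront type to trade iteration count against stability.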

  16. Measuring populations to improve vaccination coverage

    Science.gov (United States)

    Bharti, Nita; Djibo, Ali; Tatem, Andrew J.; Grenfell, Bryan T.; Ferrari, Matthew J.

    2016-10-01

    In low-income settings, vaccination campaigns supplement routine immunization but often fail to achieve coverage goals due to uncertainty about target population size and distribution. Accurate, updated estimates of target populations are rare but critical; short-term fluctuations can greatly impact population size and susceptibility. We use satellite imagery to quantify population fluctuations and the coverage achieved by a measles outbreak response vaccination campaign in urban Niger and compare campaign estimates to measurements from a post-campaign survey. Vaccine coverage was overestimated because the campaign underestimated resident numbers and seasonal migration further increased the target population. We combine satellite-derived measurements of fluctuations in population distribution with high-resolution measles case reports to develop a dynamic model that illustrates the potential improvement in vaccination campaign coverage if planners account for predictable population fluctuations. Satellite imagery can improve retrospective estimates of vaccination campaign impact and future campaign planning by synchronizing interventions with predictable population fluxes.

  18. Improved control of delayed measured systems

    Science.gov (United States)

    Claussen, Jens Christian; Schuster, Heinz Georg

    2004-11-01

    In this paper, we address the question of how the control of delayed measured chaotic systems can be improved. Both unmodified Ott-Grebogi-Yorke control and difference control can be successfully applied only for a certain range of Lyapunov numbers depending on the delay time. We show that this limitation can be overcome by at least two classes of methods, namely, by rhythmic control and by the memory methods of linear predictive logging control and memory difference control.

  19. An improved glyoxal retrieval from OMI measurements

    OpenAIRE

    Alvarado, L. M. A.; Richter, A; M. Vrekoussis; Wittrock, F; Hilboll, A.; S. F. Schreier; Burrows, J.P.

    2014-01-01

    Satellite observations from the SCIAMACHY, GOME-2, and OMI spectrometers have been used to retrieve atmospheric columns of glyoxal (CHOCHO) with the DOAS method. High CHOCHO levels are found over regions with large biogenic and pyrogenic emissions, and hot-spots have been identified over areas of anthropogenic activities. This study focuses on the development of an improved retrieval for CHOCHO from measurements by the OMI instrument. From sensitivi...

  20. Improving operating room efficiency in academic children's hospital using Lean Six Sigma methodology.

    Science.gov (United States)

    Tagge, Edward P; Thirumoorthi, Arul S; Lenart, John; Garberoglio, Carlos; Mitchell, Kenneth W

    2017-06-01

    Lean Six Sigma (LSS) is a process improvement methodology that utilizes a collaborative team effort to improve performance by systematically identifying the root causes of problems. Our objective was to determine whether application of LSS could improve efficiency when applied simultaneously to all services of an academic children's hospital. In our tertiary academic medical center, a multidisciplinary committee was formed, and the entire perioperative process was mapped using fishbone diagrams, Pareto analysis, and other process improvement tools. Results were analyzed for scheduled main operating room (OR) cases at the Children's Hospital in which the surgical attending followed themselves (consecutive cases by the same attending). Six hundred twelve cases were included in the seven Children's Hospital ORs over a 6-month period. Turnover time (the interval between one patient's departure from the OR and the arrival of the subsequent patient) decreased from a median of 41 minutes in the baseline period to 32 minutes in the intervention period (p<0.0001). Turnaround time (the interval between surgical dressing application and the subsequent surgical incision) decreased from a median of 81.5 minutes in the baseline period to 71 minutes in the intervention period (p<0.0001). These results demonstrate that a coordinated multidisciplinary process improvement redesign can significantly improve efficiency in an academic children's hospital without preselecting specific services, removing surgical residents, or incorporating new personnel or technology. Prospective comparative study, Level II. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Methodology for the use of proportional counters in pulsed fast neutron yield measurements

    OpenAIRE

    Tarifeño-Saldivia, Ariel; Mayer, Roberto E.; Pavez, Cristian; Soto, Leopoldo

    2011-01-01

    This paper introduces in full detail a methodology for the measurement of neutron yield, together with the necessary efficiency calibration, to be applied to intensity measurements of neutron bursts in which individual neutrons are not resolved in time, for any given moderated neutron proportional counter array. The method allows efficiency calibration employing the detection of neutrons arising from an isotopic neutron source. A full statistical study of the procedure is described, taking into account cont...
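
    The core of any such measurement is the ratio of registered counts to calibrated efficiency, with a counting-statistics uncertainty attached. The sketch below is deliberately simplified; the paper's full statistical treatment of unresolved bursts and of the calibration itself is not reproduced.

      def neutron_yield(counts, efficiency):
          """Estimate total emitted neutrons from registered counts, assuming
          Poisson counting statistics and a perfectly known efficiency."""
          yield_est = counts / efficiency
          sigma = counts ** 0.5 / efficiency
          return yield_est, sigma

      print(neutron_yield(2500, 1.2e-4))   # e.g. 2500 counts at 0.012% efficiency

    In practice the calibration efficiency carries its own uncertainty, which must be propagated into the yield; formalizing that propagation is part of what the methodology provides.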

  2. Using lean methodology to teach quality improvement to internal medicine residents at a safety net hospital.

    Science.gov (United States)

    Weigel, Charlene; Suen, Winnie; Gupte, Gouri

    2013-01-01

    The overall objective of this initiative was to develop a quality improvement (QI) curriculum using Lean methodology for internal medicine residents at Boston Medical Center, a safety net academic hospital. A total of 90 residents and 8 School of Public Health students participated in a series of four, 60- to 90-minute interactive and hands-on QI sessions. Seventeen QI project plans were created and conducted over a 4-month period. The curriculum facilitated internal medicine residents' learning about QI and development of positive attitudes toward QI (assessed using pre- and post-attitude surveys) and exposed them to an interprofessional team structure that duplicates future working relationships. This QI curriculum can be an educational model of how health care trainees can work collaboratively to improve health care quality.

  3. [Improvement in the efficiency of a rehabilitation service using Lean Healthcare methodology].

    Science.gov (United States)

    Pineda Dávila, S; Tinoco González, J

    2015-01-01

    The aim of this study was to evaluate the reduction in costs and the increase in time devoted to the patient achieved by applying the Lean Healthcare methodology. A multidisciplinary team was formed and, through a diagnostic process, identified three potential areas for improvement (the storage and standardization of materials, and professional tasks in the therapeutic areas), which were addressed by implementing three Lean tools: kanban, 5S and 2P. Stored material costs decreased by 43%, the cost of consumables per patient treated decreased by 19%, and the time dedicated to patient treatment increased by 7%. The processes were standardized and "muda" (waste) was eliminated, thus reducing costs and increasing value to the patient. All this demonstrates that it is possible to apply tools of industrial origin to the health sector, with the aim of improving the quality of care and achieving maximum efficiency. Copyright © 2014 SECA. Published by Elsevier España. All rights reserved.

  4. Innovative methodology for electrical conductivity measurements and metal partition in biosolid pellets

    Science.gov (United States)

    Jordan, Manuel Miguel; Rincón-Mora, Beatriz; Belén Almendro-Candel, María; Navarro-Pedreño, José; Gómez-Lucas, Ignacio; Bech, Jaume

    2017-04-01

    The use of biosolids to improve the nutrient content of a soil is a common practice. The obligation to restore abandoned mines and the correct application of biosolids are guaranteed by the legislation on waste management, biosolids and soil conservation (Jordán et al. 2008). The present research was conducted to determine electrical conductivity in dry wastes (pellets) using an innovative methodology (Camilla and Jordán, 2009). The study was also designed to examine the distribution of selected heavy metals in biosolid pellets and to relate the distribution patterns of these metals. In this context, heavy metal concentrations were studied in biosolid pellets under different pressures. Electrical conductivity measurements were taken in biosolid pellets under pressures of the order of 50 to 150 MPa and with currents of 10-15 A. Measurements of electrical conductivity and heavy metal content were taken for different areas (H1, H2, and H3). The total content of metals was determined following microwave digestion and analysed by ICP/MS. Triplicate portions were weighed in polycarbonate centrifuge tubes and sequentially extracted. The distribution of the chemical forms of Cd, Ni, Cr, and Pb in the biosolids was studied using a sequential extraction procedure that fractionates the metal into soluble-exchangeable, specifically sorbed-carbonate bound, oxidizable, reducible, and residual forms. The residual, reducible, and carbonate-sorbed forms were dominant. Higher Cr and Ni contents were detected in pellets made with biosolids from H3, while the highest Cd and Ni values were detected in H2. The trends of the conductivity curves were similar for the sludge from the isolation surface (H1) and for the mesophilous area (H2). In the case of the thermophilous area (H3), the electrical conductivity showed extremely high values; this behaviour was similar for the Cr and Ni content. However, in the case of Cd and Pb, the highest values were detected in

  5. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews

    Directory of Open Access Journals (Sweden)

    Hamel Candyce

    2007-02-01

    Full Text Available Abstract Background Our objective was to develop an instrument to assess the methodological quality of systematic reviews, building upon previous tools, empirical evidence and expert consensus. Methods A 37-item assessment tool was formed by combining (1) the enhanced Overview Quality Assessment Questionnaire (OQAQ), (2) a checklist created by Sacks, and (3) three additional items recently judged to be of methodological importance. This tool was applied to 99 paper-based and 52 electronic systematic reviews. Exploratory factor analysis was used to identify underlying components. The results were considered by methodological experts using a nominal group technique aimed at item reduction and design of an assessment tool with face and content validity. Results The factor analysis identified 11 components. From each component, one item was selected by the nominal group. The resulting instrument was judged to have face and content validity. Conclusion A measurement tool for the 'assessment of multiple systematic reviews' (AMSTAR) was developed. The tool consists of 11 items and has good face and content validity for measuring the methodological quality of systematic reviews. Additional studies are needed with a focus on the reproducibility and construct validity of AMSTAR, before strong recommendations can be made on its use.
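    As a rough illustration of the item-reduction idea described above (one representative item retained per underlying component), here is a minimal Python sketch using PCA as a stand-in for the study's exploratory factor analysis. The response data are simulated, and the "largest absolute loading" selection rule is an invented simplification (the study used a nominal group to pick items):

```python
# Minimal sketch: component analysis on item-level ratings, keeping the
# highest-loading item per retained component. Simulated data, not AMSTAR's.
import numpy as np

rng = np.random.default_rng(0)
n_reviews, n_items = 151, 37          # 99 paper + 52 electronic reviews
X = rng.integers(0, 2, size=(n_reviews, n_items)).astype(float)

# Centre items and compute loadings from covariance eigenvectors.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]
eigvecs = eigvecs[:, order]

n_components = 11                     # number of components reported in the study
selected = []
for k in range(n_components):
    # One representative item per component (the paper used expert consensus,
    # not this automatic rule).
    for item in np.argsort(np.abs(eigvecs[:, k]))[::-1]:
        if item not in selected:
            selected.append(int(item))
            break
print("retained items:", sorted(selected))
```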

  6. Improving Efficiency Using Time-Driven Activity-Based Costing Methodology.

    Science.gov (United States)

    Tibor, Laura C; Schultz, Stacy R; Menaker, Ronald; Weber, Bradley D; Ness, Jay; Smith, Paula; Young, Phillip M

    2017-03-01

    The aim of this study was to increase efficiency in MR enterography using a time-driven activity-based costing methodology. In February 2015, a multidisciplinary team was formed to identify the personnel, equipment, space, and supply costs of providing outpatient MR enterography. The team mapped the current state, completed observations, performed timings, and calculated costs associated with each element of the process. The team used Pareto charts to understand the highest cost and most time-consuming activities, brainstormed opportunities, and assessed impact. Plan-do-study-act cycles were developed to test the changes, and run charts were used to monitor progress. The process changes consisted of revising the workflow associated with the preparation and administration of glucagon, with completed implementation in November 2015. The time-driven activity-based costing methodology allowed the radiology department to develop a process to more accurately identify the costs of providing MR enterography. The primary process modification was reassigning responsibility for the administration of glucagon from nurses to technologists. After implementation, the improvements demonstrated success by reducing non-value-added steps and cost by 13%, staff time by 16%, and patient process time by 17%. The saved process time was used to augment existing examination time slots to more accurately accommodate the entire enterographic examination. Anecdotal comments were captured to validate improved staff satisfaction within the multidisciplinary team. This process provided a successful outcome to address daily workflow frustrations that could not previously be improved. A multidisciplinary team was necessary to achieve success, in addition to the use of a structured problem-solving approach. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
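    The core arithmetic of time-driven activity-based costing is a capacity cost rate (cost per minute of supplying a resource) multiplied by the time each mapped process step consumes. A minimal sketch follows; all resource names, costs and times are invented for illustration, not figures from the study:

```python
# Minimal sketch of the TDABC arithmetic: capacity cost rate x observed time,
# summed over the mapped steps of one examination. Numbers are illustrative.
RESOURCES = {
    # resource: (cost of capacity supplied per month, practical capacity in minutes)
    "nurse":        (8000.0, 7200.0),
    "technologist": (7000.0, 7200.0),
    "mr_scanner":   (50000.0, 12000.0),
}

def capacity_cost_rate(cost: float, minutes: float) -> float:
    """Cost per minute of supplying one unit of the resource."""
    return cost / minutes

def process_cost(steps: list[tuple[str, float]]) -> float:
    """Sum rate x time over the mapped steps of one examination."""
    return sum(capacity_cost_rate(*RESOURCES[r]) * t for r, t in steps)

# Hypothetical process map before and after moving glucagon administration
# from nurses to technologists (the change reported in the abstract).
before = [("nurse", 15.0), ("technologist", 30.0), ("mr_scanner", 45.0)]
after = [("technologist", 40.0), ("mr_scanner", 45.0)]
print(f"cost before: {process_cost(before):.2f}, after: {process_cost(after):.2f}")
```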

  7. Can the reliability of three-dimensional running kinematics be improved using functional joint methodology?

    Science.gov (United States)

    Pohl, Michael B; Lloyd, Chandra; Ferber, Reed

    2010-10-01

    Traditional three-dimensional gait analyses require the skilled palpation of anatomical landmarks to identify joint parameters and produce reliable joint kinematics. Functional methods have been developed to help improve the reliability and validity of identifying joint kinematic parameters. The purpose of this study was to investigate whether a functional method could improve the between-day reliability of joint kinematics during running compared to a traditional manual marker placement method. It was hypothesised that the functional technique would result in greater within- and between-tester reliability. An eight-camera motion analysis system was used to evaluate the reliability of 3D lower extremity kinematics during running for both a functional and a manual marker placement technique. Reliability of the waveform shape, amplitude and offset of the kinematic curves was assessed using the coefficient of multiple correlation, range of motion and root mean square error respectively. The functional joint methodology did not improve the within- and between-tester reliability in terms of kinematic curve shape, amplitude or offset compared to the manual placement technique. When experienced examiners are used to place the anatomical markers together with a lean subject sample, functional methods may not improve the day-to-day reliability of three-dimensional gait kinematics over traditional marker placement techniques.
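    A minimal sketch of how the three reported reliability aspects can be computed for a pair of day-to-day kinematic curves. The correlation statistic here is a simplification of the coefficient of multiple correlation, and the curves are synthetic stand-ins for time-normalized joint angles:

```python
# Quantify waveform shape (correlation), amplitude (range of motion), and
# offset (RMSE) between two days' kinematic curves. Synthetic data.
import numpy as np

t = np.linspace(0.0, 1.0, 101)                 # normalized gait cycle
day1 = 20.0 * np.sin(2 * np.pi * t) + 5.0      # joint angle, day 1 (deg)
day2 = 19.0 * np.sin(2 * np.pi * t) + 8.0      # same task, day 2 (deg)

shape = np.corrcoef(day1, day2)[0, 1]          # waveform similarity
rom_diff = np.ptp(day1) - np.ptp(day2)         # range-of-motion difference
rmse = np.sqrt(np.mean((day1 - day2) ** 2))    # offset between curves

print(f"shape r={shape:.3f}, ROM diff={rom_diff:.1f} deg, RMSE={rmse:.1f} deg")
```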

  8. Measure to succeed: How to improve employee participation in continuous improvement

    Directory of Open Access Journals (Sweden)

    Daniel Jurburg

    2016-12-01

    Full Text Available Purpose: Achieving employee participation in continuous improvement (CI) systems is considered one of the success factors for the sustainability of those systems. Yet it is also very difficult to obtain because of the interaction of the many critical factors that affect employee participation. Therefore, finding ways of measuring all these critical factors can help practitioners manage the employee participation process accordingly. Design/methodology/approach: Based upon the existing literature, this paper presents a 4-phase (9-step) diagnostic tool to measure the main determinants associated with the implementation of CI systems affecting employee participation in improvement activities. Findings: The tool proved useful for detecting the main weaknesses and opportunities for improving employee participation in CI through its application in two different cases. Practical implications: This diagnostic tool could be particularly interesting for companies adopting CI and other excellence frameworks, which usually include a pillar related to people development inside the organization but do not include tools to diagnose the state of this pillar. Originality/value: This diagnostic tool takes a user's-perspective approach, ensuring that the weaknesses and improvement opportunities detected during the diagnosis come directly from the users of the CI system, in this case the employees themselves. Given that the final objective is to identify reasons and problems hindering employee participation, adopting this user's perspective seems more relevant than other, more traditional approaches based on gathering information from the CI system itself or from the CI managers.

  9. Improving Localization Accuracy: Successive Measurements Error Modeling

    Directory of Open Access Journals (Sweden)

    Najah Abu Ali

    2015-07-01

    Full Text Available Vehicle self-localization is an essential requirement for many of the safety applications envisioned for vehicular networks. The mathematical models used in current vehicular localization schemes focus on modeling the localization error itself, and overlook the potential correlation between successive localization measurement errors. In this paper, we first investigate the existence of correlation between successive positioning measurements, and then incorporate this correlation into the modeling of the positioning error. We use the Yule–Walker equations to determine the degree of correlation between a vehicle's future position and its past positions, and then propose a p-order Gauss–Markov model to predict the future position of a vehicle from its past p positions. We investigate the existence of correlation for two datasets representing the mobility traces of two vehicles over a period of time. We prove the existence of correlation between successive measurements in the two datasets, and show that the time correlation between measurements can persist for up to four minutes. Through simulations, we validate the robustness of our model and show that it is possible to use the first-order Gauss–Markov model, which has the least complexity, and still maintain an accurate estimation of a vehicle's future location over time using only its current position. Our model can assist in providing better modeling of positioning errors and can be used as a prediction tool to improve the performance of classical localization algorithms such as the Kalman filter.
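    A minimal sketch of the modeling step described above: estimating AR(p) coefficients with the Yule–Walker equations and making a one-step-ahead position prediction. The trace below is a synthetic AR(2) series, not one of the paper's vehicle mobility traces:

```python
# Estimate AR(p) coefficients via the Yule-Walker equations, then predict the
# next position from the past p positions. Synthetic trace for illustration.
import numpy as np

def yule_walker(x: np.ndarray, p: int) -> np.ndarray:
    """Solve the Yule-Walker equations for AR(p) coefficients."""
    x = x - x.mean()
    n = len(x)
    r = np.array([x[: n - k] @ x[k:] / n for k in range(p + 1)])  # autocovariances
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])  # Toeplitz
    return np.linalg.solve(R, r[1:])

rng = np.random.default_rng(1)
phi_true = np.array([0.8, 0.15])
x = np.zeros(2000)
for t in range(2, len(x)):                     # simulate an AR(2) position trace
    x[t] = phi_true[0] * x[t - 1] + phi_true[1] * x[t - 2] + rng.normal(scale=0.1)

phi = yule_walker(x, p=2)
x_next = phi[0] * x[-1] + phi[1] * x[-2]       # one-step-ahead position prediction
print("estimated coefficients:", np.round(phi, 3),
      "predicted next position:", round(x_next, 3))
```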

  10. Improving respiration measurements with gas exchange analyzers.

    Science.gov (United States)

    Montero, R; Ribas-Carbó, M; Del Saz, N F; El Aou-Ouad, H; Berry, J A; Flexas, J; Bota, J

    2016-12-01

    Dark respiration measurements with open-flow gas exchange analyzers are often questioned for their low accuracy, as their low values often reach the precision limit of the instrument. Respiration was measured in five species: two hypostomatous (Vitis vinifera L. and Acanthus mollis) and three amphistomatous, one with a similar amount of stomata on both sides (Eucalyptus citriodora) and two with different stomatal densities (Brassica oleracea and Vicia faba). The CO2 differential (ΔCO2) increased two-fold, with no change in apparent Rd, when the two leaves with higher stomatal density faced outside. These results showed a clear effect of the position of stomata on ΔCO2. It can therefore be concluded that leaf position is important for improving respiration measurements, as it increases ΔCO2 without affecting the respiration results per leaf or mass unit. This method will help to increase the accuracy of leaf respiration measurements using gas exchange analyzers. Copyright © 2016 Elsevier GmbH. All rights reserved.

  11. Linking outcomes management and practice improvement. Structured care methodologies: evolution and use in patient care delivery.

    Science.gov (United States)

    Cole, L; Houston, S

    1999-01-01

    Structured care methodologies (SCMs) are tools that provide a comprehensive approach to patient care delivery. These tools have evolved in their application and purpose over the years. In many situations, multiple tools are needed to obtain the best outcomes for a patient. The presence of an SCM does not preclude clinical judgment. On the contrary, the fundamental purpose of any SCM is to assist practitioners in implementing practice patterns associated with good clinical judgment, research-based interventions, and improved patient outcomes. These tools support smooth operation and appropriate use of resources, establish a means of patient management across the continuum of care, facilitate collaboration among disciplines, reflect patient outcomes, and provide outcomes data. Data from SCMs permit benchmarking, comparison of pre-implementation and post-implementation outcomes, development of action plans for quality enhancement, identification of high-risk patients, identification of issues and problems in the system that require intervention, and the development of research protocols and studies. Structured care methodology development and implementation can be challenging, rewarding, and at times frustrating. When used appropriately, these tools can have a major impact on the standardization of care and the achievement of desired outcomes. However, individual patient needs may supersede adherence to a tool. The challenge then becomes one of balancing the unique needs of each patient with appropriate use of SCMs. Change comes slowly, but persistence pays off.

  12. An improved glyoxal retrieval from OMI measurements

    Directory of Open Access Journals (Sweden)

    L. M. A. Alvarado

    2014-06-01

    Full Text Available Satellite observations from the SCIAMACHY, GOME-2, and OMI spectrometers have been used to retrieve atmospheric columns of glyoxal (CHOCHO) with the DOAS method. High CHOCHO levels are found over regions with large biogenic and pyrogenic emissions, and hot-spots have been identified over areas of anthropogenic activity. This study focuses on the development of an improved retrieval for CHOCHO from measurements by the OMI instrument. From sensitivity tests, an optimal fitting window and polynomial degree are determined. Two different approaches to reducing the interference of liquid water absorption over oceanic regions are evaluated, achieving a significant reduction of negative columns over clear water regions. Moreover, a high-temperature absorption cross-section of nitrogen dioxide (NO2) is introduced in the DOAS retrieval to account for potential interference from NO2 over regions with large anthropogenic emissions, leading to improved fit quality over these areas. A comparison with vertical CHOCHO columns retrieved from measurements of the GOME-2 and SCIAMACHY instruments over continental regions is performed, showing overall good consistency. Using the new OMI CHOCHO data set, the link between fires and glyoxal columns is investigated for two selected regions in Africa. In addition, mapped averages are computed for a fire event east of Moscow between mid-July and mid-August 2010. In both cases, enhanced CHOCHO levels are found in close spatial and temporal proximity to MODIS fire radiative power, demonstrating that pyrogenic emissions can be clearly identified in the OMI CHOCHO product.
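    For readers unfamiliar with DOAS, the fitting step behind such retrievals can be sketched as a linear least-squares problem: the measured optical depth is modeled as trace-gas cross-sections scaled by slant columns plus a low-order closure polynomial for broadband effects. Everything below (cross-section shapes, magnitudes, noise level) is synthetic, not the actual OMI retrieval:

```python
# Minimal DOAS-style fit: two synthetic cross-sections plus a degree-3
# polynomial, solved by linear least squares. Units are arbitrary.
import numpy as np

rng = np.random.default_rng(2)
wl = np.linspace(433.0, 461.0, 200)                 # fit window (nm), as for CHOCHO
sigma_a = np.exp(-0.5 * ((wl - 455.0) / 2.0) ** 2)  # synthetic glyoxal-like cross-section
sigma_b = np.exp(-0.5 * ((wl - 440.0) / 3.0) ** 2)  # synthetic NO2-like cross-section

true_cols = np.array([3e-4, 2e-3])                  # slant columns (arbitrary units)
w = wl - wl.mean()
tau = sigma_a * true_cols[0] + sigma_b * true_cols[1]   # trace-gas absorption
tau += 0.02 - 1e-3 * w + 2e-5 * w**2                    # broadband contribution
tau += rng.normal(scale=2e-5, size=wl.size)             # measurement noise

# Design matrix: cross-sections plus closure polynomial columns.
design = np.column_stack([sigma_a, sigma_b] + [w**k for k in range(4)])
fitted, *_ = np.linalg.lstsq(design, tau, rcond=None)
print("retrieved columns:", fitted[:2], "true:", true_cols)
```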

  13. Juncture flow improvement for wing/pylon configurations by using CFD methodology

    Science.gov (United States)

    Gea, Lie-Mine; Chyu, Wei J.; Stortz, Michael W.; Chow, Chuen-Yen

    1993-01-01

    The transonic flow field around a fighter wing/pylon configuration was simulated using an implicit upwind Navier-Stokes flow solver (F3D) and overset grid technology (Chimera). Flow separation and local shocks near the wing/pylon junction were observed in flight and predicted by numerical calculations. A new pylon/fairing shape was proposed to improve the flow quality. Based on the numerical results, the size of the separation area is significantly reduced and the onset of separation is delayed farther downstream. A smoother pressure gradient is also obtained near the junction area. This paper demonstrates that computational fluid dynamics (CFD) methodology can be used as a practical tool for aircraft design.

  14. Optimization of a novel improver gel formulation for Barbari flat bread using response surface methodology.

    Science.gov (United States)

    Pourfarzad, Amir; Haddad Khodaparast, Mohammad Hossein; Karimi, Mehdi; Mortazavi, Seyed Ali

    2014-10-01

    Nowadays, the use of bread improvers has become an essential part of improving production methods and the quality of bakery products. In the present study, response surface methodology (RSM) was used to determine the optimum improver gel formulation giving the best quality, shelf life, sensory and image properties for Barbari flat bread. Sodium stearoyl-2-lactylate (SSL), diacetyl tartaric acid esters of monoglyceride (DATEM) and propylene glycol (PG) were the constituents of the gel considered in this study. A second-order polynomial model was fitted to each response and the regression coefficients were determined using the least squares method. The optimum gel formulation was found to be 0.49% SSL, 0.36% DATEM and 0.5% PG when the desirability function method was applied. There was good agreement between the experimental data and their predicted counterparts. The results showed that RSM, image processing and texture analysis are useful tools to investigate, approximate and predict a large number of bread properties.
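    A minimal sketch of the response-surface step: fitting a second-order polynomial to a designed experiment by least squares and solving for its stationary point. Two coded factors are used here instead of the study's three gel constituents, and the response values are simulated:

```python
# Fit a quadratic response surface to a small central composite design and
# locate its stationary point. Simulated data, two factors only.
import numpy as np

rng = np.random.default_rng(3)
# Coded factor levels of a two-factor central composite design.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1.41, 0], [1.41, 0], [0, -1.41], [0, 1.41],
              [0, 0], [0, 0], [0, 0]])
# Simulated response with a maximum near (0.5, -0.3).
y = (10 - (X[:, 0] - 0.5) ** 2 - (X[:, 1] + 0.3) ** 2
     + rng.normal(scale=0.05, size=len(X)))

x1, x2 = X[:, 0], X[:, 1]
design = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
b = np.linalg.lstsq(design, y, rcond=None)[0]    # b0, b1, b2, b12, b11, b22

# Stationary point: solve grad = 0 for the fitted quadratic surface.
H = np.array([[2 * b[4], b[3]], [b[3], 2 * b[5]]])
opt = np.linalg.solve(H, -b[1:3])
print("coefficients:", np.round(b, 3), "stationary point:", np.round(opt, 3))
```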

  15. Improved gamma bang time measurements on omega

    Energy Technology Data Exchange (ETDEWEB)

    Herrmann, H W; Caldwell, S E; Evans, S C; Mack, J M; Sanchez, P; Sedillo, T; Wilson, D C; Young, C S [Los Alamos National Laboratory, Los Alamos, New Mexico (United States); Drew, D; Horsfield, C J [Atomic Weapons Establishment, Aldermaston, Reading, Berkshire, RG7 4PR (United Kingdom); Glebov, V Y; Stoeckl, C [Laboratory for Laser Energetics, University of Rochester, Rochester, NY (United States); Macrum, G S; Miller, E K [National Security Technologies/Special Technologies Lab, Santa Barbara, CA (United States)], E-mail: herrmann@lanl.gov

    2008-05-15

    The time of peak fusion reactivity with respect to the impingement of laser light on an Inertial Confinement Fusion (ICF) capsule is known as bang time (BT). For deuterium-tritium fueling, fusion reactivity and BT can be measured using either fusion neutrons or fusion gammas. Initial gamma bang time (GBT) measurements on Omega using a Gas Cherenkov Detector (GCD) have been previously reported. Recent improvements have significantly enhanced the ability to measure GBT precisely. By relating the peak of the GCD gamma signal to laser timing fiducials, and cross-calibrating the resulting raw bang time to the neutron bang time obtained using the absolutely calibrated Neutron Temporal Diagnostic (NTD), we demonstrate a precision of better than 25 ps on Omega. Bang time, along with other aspects of reaction history (RH), is an essential component of diagnosing failed attempts at ICF ignition. For the NIF, gammas are preferred over neutrons for this application because of the unacceptably large neutron temporal spreading that results from detector standoff limitations. The NIF System Design Requirement specifies a gamma bang time accuracy of better than 50 ps.

  16. Impact of volunteer-related and methodology-related factors on the reproducibility of brachial artery flow-mediated vasodilation: analysis of 672 individual repeated measurements.

    NARCIS (Netherlands)

    Mil, A.C.C.M. van; Greyling, A.; Zock, P.L.; Geleijnse, J.M.; Hopman, M.T.E.; Mensink, R.P.; Reesink, K.D.; Green, D.J.; Ghiadoni, L.; Thijssen, D.H.J.

    2016-01-01

    OBJECTIVES: Brachial artery flow-mediated dilation (FMD) is a popular technique to examine endothelial function in humans. Identifying volunteer and methodological factors related to variation in FMD is important to improve measurement accuracy and applicability. METHODS: Volunteer-related and

  17. A Methodology to Measure Synergy Among Energy-Efficiency Programs at the Program Participant Level

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.E.

    2003-11-14

    This paper presents a methodology designed to measure synergy among energy-efficiency programs at the program participant level (e.g., households, firms). Three different definitions of synergy are provided: strong, moderate, and weak. Data to measure synergy can be collected through simple survey questions. Straightforward mathematical techniques can be used to estimate the three types of synergy and explore relative synergistic impacts of different subsets of programs. Empirical research is needed to test the concepts and methods and to establish quantitative expectations about synergistic relationships among programs. The market for new energy-efficient motors is the context used to illustrate all the concepts and methods in this paper.

  18. Performance evaluation of CT measurements made on step gauges using statistical methodologies

    DEFF Research Database (Denmark)

    Angel, J.; De Chiffre, L.; Kruth, J.P.;

    2015-01-01

    In this paper, a study is presented in which statistical methodologies were applied to evaluate the measurement of step gauges on an X-ray computed tomography (CT) system. In particular, the effects of step gauge material density and orientation were investigated. The step gauges consist of uni- and bidirectional lengths. By confirming the repeatability of measurements made on the test system, the number of required scans in the design of experiment (DOE) was reduced. The statistical model was checked using model adequacy principles; model adequacy checking is an important step in validating

  19. Direct sample positioning and alignment methodology for strain measurement by diffraction

    Science.gov (United States)

    Ratel, N.; Hughes, D. J.; King, A.; Malard, B.; Chen, Z.; Busby, P.; Webster, P. J.

    2005-05-01

    An ISO (International Organization for Standardization) TTA (Technology Trends Assessment) was published in 2001 for the determination of residual stress using neutron diffraction, which identifies sample alignment and positioning as a key source of strain measurement error. Although the measurement uncertainty by neutron and synchrotron x-ray diffraction for an individual measurement of lattice strain is typically of the order of 10–100 × 10⁻⁶, specimens commonly exhibit strain gradients of 1000 × 10⁻⁶ mm⁻¹ or more, making sample location a potentially considerable source of error. An integrated approach to sample alignment and positioning is described which incorporates standard base-plates and sample holders, instrument alignment procedures, accurate digitization using a coordinate measuring machine, and automatic generation of instrument control scripts. The methodology that has been developed is illustrated by the measurement of the transverse residual strain field in a welded steel T-joint using neutrons.

  20. An improved glyoxal retrieval from OMI measurements

    Science.gov (United States)

    Alvarado, L. M. A.; Richter, A.; Vrekoussis, M.; Wittrock, F.; Hilboll, A.; Schreier, S. F.; Burrows, J. P.

    2014-12-01

    Satellite observations from the SCIAMACHY, GOME-2 and OMI spectrometers have been used to retrieve atmospheric columns of glyoxal (CHOCHO) with the DOAS method. High CHOCHO levels were found over regions with large biogenic and pyrogenic emissions, and hot-spots have been identified over areas of anthropogenic activities. This study focuses on the development of an improved retrieval for CHOCHO from measurements by the OMI instrument. From sensitivity tests, a fitting window and a polynomial degree are determined. Two different approaches to reduce the interference of liquid water absorption over oceanic regions are evaluated, achieving significant reduction of the number of negative columns over clear water regions. The impact of using different absorption cross-sections for water vapour is evaluated and only small differences are found. Finally, a high-temperature (boundary layer ambient: 294 K) absorption cross-section of nitrogen dioxide (NO2) is introduced in the DOAS retrieval to account for potential interferences of NO2 over regions with large anthropogenic emissions, leading to improved fit quality over these areas. A comparison with vertical CHOCHO columns retrieved from GOME-2 and SCIAMACHY measurements over continental regions is performed, showing overall good consistency. However, SCIAMACHY CHOCHO columns are systematically higher than those obtained from the other instruments. Using the new OMI CHOCHO data set, the link between fires and glyoxal columns is investigated for two selected regions in Africa. In addition, mapped averages are computed for a fire event in Russia between mid-July and mid-August 2010. In both cases, enhanced CHOCHO levels are found in close spatial and temporal proximity to elevated levels of MODIS fire radiative power, demonstrating that pyrogenic emissions can be clearly identified in the new OMI CHOCHO product.

  1. Giada improved calibration of measurement subsystems

    Science.gov (United States)

    Della Corte, V.; Rotundi, A.; Sordini, R.; Accolla, M.; Ferrari, M.; Ivanovski, S.; Lucarelli, F.; Mazzotta Epifani, E.; Palumbo, P.

    2014-12-01

    GIADA (Grain Impact Analyzer and Dust Accumulator) is an in-situ instrument devoted to measuring the dynamical properties of the dust grains emitted by the comet. An extended calibration activity using the GIADA Flight Spare Model has been carried out, taking into account the knowledge gained through the analyses of IDPs and cometary samples returned from comet 81P/Wild 2. GIADA consists of three measurement subsystems: the Grain Detection System, an optical device measuring the optical cross-section of individual dust grains; the Impact Sensor, an aluminum plate connected to 5 piezo-sensors measuring the momentum of impacting single dust grains; and the Micro Balance System, measuring the cumulative deposition in time of dust grains smaller than 10 μm. The results of the analyses of data acquired with the GIADA PFM, and the comparison with calibration data acquired during the pre-launch campaign, allowed us to improve GIADA's performance and capabilities. We will report the results of the following main activities: a) definition of a correlation between the 2 GIADA models (PFM housed in the laboratory and In-Flight Model on board ROSETTA); b) characterization of the sub-systems' performances (signal elaboration, sensitivities, space environment effects); c) new calibration measurements and related curves obtained by means of the PFM model using realistic cometary dust analogues. Acknowledgements: GIADA was built by a consortium led by the Univ. Napoli "Parthenope" & INAF-Oss. Astr. Capodimonte, IT, in collaboration with the Inst. de Astrofisica de Andalucia, ES, Selex-ES s.p.a. and SENER. GIADA is presently managed & operated by Ist. di Astrofisica e Planetologia Spaziali-INAF, IT. GIADA was funded and managed by the Agenzia Spaziale Italiana, IT, with the support of the Spanish Ministry of Education and Science MEC, ES. GIADA was developed from a University of Kent, UK, PI proposal; sci. & tech. contribution given by CISAS, IT, Lab. d'Astr. Spat., FR, and Institutions from UK, IT, FR, DE and USA. We thank

  2. Application of Lean Healthcare methodology in a urology department of a tertiary hospital as a tool for improving efficiency.

    Science.gov (United States)

    Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D

    2017-07-01

    To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The implementation of the Lean Healthcare methodology in a urology department was conducted in 3 phases: 1) team training and improvement of feedback among the practitioners, 2) management by process and superspecialisation, and 3) improvement of indicators (continuous improvement). The indicators were obtained from the hospital's information systems. The main source of information was the Balanced Scorecard for health systems management (CUIDISS). The comparison with other regional and national urology departments was performed through the same platform with the help of the hospital's records department (IASIST). A baseline was established with the indicators obtained in 2011 for the comparative analysis of the results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction and improved quality indicators, reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in 4 years. A value of 0.61 was reached with the efficiency indicator (risk-adjusted length of stay [RALOS] index), with a savings of 2869 stays compared with national benchmarking (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, but with progressive annual improvement. The Lean methodology can be effectively applied to the urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvements in all its indicators, together with practitioner satisfaction. Team training, management by process, continuous improvement and delegation of responsibilities have been shown to be the fundamental pillars of this methodology. Copyright © 2017 AEU. Publicado por Elsevier España, S.L.U. All rights reserved.

  3. Measurement of rill erosion through a new UAV-GIS methodology

    Directory of Open Access Journals (Sweden)

    Paolo Bazzoffi

    2015-11-01

    Full Text Available Photogrammetry from aerial pictures acquired by micro Unmanned Aerial Vehicles (UAVs), integrated with post-processing, is a promising methodology in terms of speed of data acquisition, degree of automation of data processing, and cost-effectiveness. The new UAV-GIS methodology has been developed for three main purposes: (i) quick measurement of rill erosion at the field scale, combining simplicity of field survey with reliability of results at an affordable price; (ii) calibration of the RUSLE model to make it suitable for the purposes of the CAP common indicator; and (iii) provision of an easy evaluation tool to Regions and to non-research professionals who use the very popular ESRI ArcGIS software, for assessing the effectiveness of soil conservation measures adopted under the CAP and for calibrating the common indicator "soil erosion by water". High-resolution stereo photo pairs, acquired close to the soil, are of crucial importance in order to produce high-resolution DEMs to be analysed under GIS. The GIS methodology consists of the measurement of the rill erosion that occurred in a plot from the total volume of the incisions, regardless of internal sediment redeposition, based on Plan Curvature analysis and Focal Statistics analysis, which are described in detail as the essential constituents of the new methodology. To determine the effectiveness and reliability of the new methodology, a comparison was made between the rill depths measured manually in the field at 51 rill points and the depths measured by the UAV-GIS methodology. The best calibration equation was obtained by using a 30 cm radius in the Focal Statistics analysis. The linear regression was highly significant, with R² = 0.87. Two case studies are presented, solved step by step, in order to help the user overcome possible difficulties of interpretation in the application of the GIS procedure. The first solved exercise concerns a heavily eroded plot where only one DEM, derived
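    A minimal sketch of the focal-statistics idea: local rill depth estimated as the difference between a neighborhood mean of the DEM and the DEM itself, then summed into an eroded volume. The DEM is synthetic, and scipy's uniform_filter stands in for the ArcGIS Focal Statistics tool used in the study (which applied a 30 cm radius):

```python
# Estimate rill incision depth and volume from a DEM via a focal (moving
# window) mean. Synthetic DEM; uniform_filter approximates Focal Statistics.
import numpy as np
from scipy.ndimage import uniform_filter

cell = 0.05                                     # DEM resolution (m)
x, y = np.meshgrid(np.arange(200) * cell, np.arange(200) * cell)
dem = 0.02 * x                                  # gently sloping plot surface
dem -= 0.08 * np.exp(-((y - 5.0) ** 2) / 0.02)  # carve a synthetic rill

radius_cells = int(round(0.30 / cell))          # 30 cm focal radius
smoothed = uniform_filter(dem, size=2 * radius_cells + 1)

depth = np.clip(smoothed - dem, 0.0, None)      # incision depth below local mean
volume = depth.sum() * cell**2                  # eroded volume (m^3)
print(f"max rill depth: {depth.max():.3f} m, eroded volume: {volume:.3f} m^3")
```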

  4. A statistical methodology to improve accuracy in differentiating schizophrenia patients from healthy controls.

    Science.gov (United States)

    Peters, Rosalind M; Gjini, Klevest; Templin, Thomas N; Boutros, Nash N

    2014-05-30

    We present a methodology to statistically discriminate among univariate and multivariate indices to improve accuracy in differentiating schizophrenia patients from healthy controls. Electroencephalogram data from 71 subjects (37 controls/34 patients) were analyzed. Data included P300 event-related response amplitudes and latencies as well as amplitudes and sensory gating indices derived from the P50, N100, and P200 auditory-evoked responses, resulting in 20 indices analyzed. Receiver operating characteristic (ROC) curve analyses identified significant univariate indices; these underwent principal component analysis (PCA). Logistic regression of the PCA components created a multivariate composite used in the final ROC. Eleven univariate ROCs were significant with area under the curve (AUC) >0.50. PCA of these indices resulted in a three-factor solution accounting for 76.96% of the variance. The first factor was defined primarily by P200 and P300 amplitudes, the second by P50 ratio and difference scores, and the third by P300 latency. ROC analysis using the logistic regression composite resulted in an AUC of 0.793 (0.06), p<0.001 (CI=0.685-0.901). A composite score of 0.456 had a sensitivity of 0.829 (correctly identifying schizophrenia patients) and a specificity of 0.703 (correctly identifying healthy controls). The results demonstrate the usefulness of combining statistical techniques to create a multivariate composite that improves diagnostic accuracy.
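    A minimal sketch of the three-stage pipeline (univariate ROC screening, PCA, logistic-regression composite) on simulated data; the AUC screening threshold and the effect sizes below are invented for illustration, not the study's values:

```python
# Screen indices by univariate AUC, reduce survivors with PCA, then combine
# components into a logistic-regression composite. Simulated indices.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n_controls, n_patients, n_indices = 37, 34, 20
y = np.r_[np.zeros(n_controls), np.ones(n_patients)]
X = rng.normal(size=(len(y), n_indices))
X[y == 1, :8] += 0.8                     # give 8 indices real group separation

# Step 1: keep indices whose univariate AUC is meaningfully above chance.
keep = [j for j in range(n_indices) if roc_auc_score(y, X[:, j]) > 0.6]

# Step 2: PCA on the surviving indices.
pcs = PCA(n_components=3).fit_transform(X[:, keep])

# Step 3: logistic-regression composite and its overall AUC.
composite = LogisticRegression().fit(pcs, y).predict_proba(pcs)[:, 1]
print(f"indices kept: {len(keep)}, composite AUC: {roc_auc_score(y, composite):.3f}")
```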

  5. Revised Design-Based Research Methodology for College Course Improvement and Application to Education Courses in Japan

    Science.gov (United States)

    Akahori, Kanji

    2011-01-01

    The author describes a research methodology for college course improvement, and applies the results to education courses. In Japan, it is usually difficult to carry out research on college course improvement, because faculty cannot introduce experimental design approaches based on control and treatment groupings of students in actual classroom…

  6. THE ARTHRITIS AND MUSCULOSKELETAL QUALITY IMPROVEMENT PROGRAM (AMQUIP: A BREAKTHROUGH SERIES METHODOLOGY PROJECT

    Directory of Open Access Journals (Sweden)

    MASTURA I

    2008-01-01

    Full Text Available The Australian government funded the National Primary Care Collaborative (NPCC) program with $14.6 million over three years. One of the pilot projects was the Arthritis and Musculoskeletal Quality Improvement Program (AMQuIP). The study aimed to optimise general practitioners' (GPs) management of patients with osteoarthritis (OA) of the hip and knee by identifying gaps between their current practice and best practice. The Breakthrough Series Collaborative methodology with several Plan-Do-Study-Act (PDSA) cycles was employed. Participants comprised 12 GPs/practices from two Victorian Divisions of General Practice (one rural, one metropolitan), with 10 patients per GP/practice. GPs/practices attended an orientation session, three learning workshops and a videoconference. GPs/practices completed PDSA cycles between workshops and reported results at workshops. GPs/practices reported use of guidelines, changes in patient management and changes in practice management/systems. All recruited patients completed the SF-12v2 Health Survey and the WOMAC OA Index Questionnaire twice. Follow-up activities, including focus groups and face-to-face interviews, were held six months after the final workshop. All GPs/practices used the guidelines/key messages, introduced "new" management strategies to patients, and made positive changes to their practice management/systems. Patients reported positive changes and outcomes. By using a structured methodology and evidence-based guidelines/key messages, GPs can introduce new patient management strategies, and by identifying gaps in practice management systems, positive changes can be achieved.

  7. Methodology of heat transfer and flow resistance measurement for matrices of rotating regenerative heat exchangers

    Directory of Open Access Journals (Sweden)

    Butrymowicz Dariusz

    2016-09-01

    Full Text Available The theoretical basis of the indirect approach for measuring the mean heat transfer coefficient of a packed bed, based on the modified single-blow technique, is presented and discussed in the paper. The methodology of this measurement approach, dedicated to the matrix of a rotating regenerative gas heater, is discussed in detail. The testing stand, consisting of a dedicated experimental tunnel with auxiliary equipment and a measurement system, is presented. Selected experimental results are presented and discussed for selected types of matrices of regenerative air preheaters over a wide range of gas Reynolds numbers. The agreement between the theoretically predicted and measured temperature profiles is demonstrated. Exemplary dimensionless relationships between the Colburn heat transfer factor, the Darcy flow resistance factor and the Reynolds number are presented for the investigated matrices of the regenerative gas heater.
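    A minimal sketch of how the reported dimensionless groups are formed from raw matrix measurements: the Reynolds number from velocity and hydraulic diameter, the Colburn factor j = Nu/(Re·Pr^(1/3)) from the heat transfer coefficient, and a Darcy factor from the pressure drop. All property values and the operating point below are illustrative, not data from the paper:

```python
# Form Re, Colburn j, and Darcy f from hypothetical matrix measurements.
AIR = {"rho": 1.10, "mu": 1.9e-5, "k": 0.028, "pr": 0.71}  # gas properties

def reynolds(u: float, d_h: float) -> float:
    return AIR["rho"] * u * d_h / AIR["mu"]

def colburn_j(h: float, u: float, d_h: float) -> float:
    nu = h * d_h / AIR["k"]                      # Nusselt number
    return nu / (reynolds(u, d_h) * AIR["pr"] ** (1.0 / 3.0))

def darcy_f(dp: float, u: float, d_h: float, length: float) -> float:
    return dp * d_h / (length * 0.5 * AIR["rho"] * u**2)

# Hypothetical operating point for one matrix geometry.
u, d_h, L = 3.0, 4e-3, 0.2                       # velocity (m/s), hydraulic dia. (m), length (m)
print(f"Re={reynolds(u, d_h):.0f}, j={colburn_j(45.0, u, d_h):.4f}, "
      f"f={darcy_f(250.0, u, d_h, L):.4f}")
```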

  8. Improvement in the incident reporting and investigation procedures using process excellence (DMAI2C) methodology.

    Science.gov (United States)

    Miles, Elizabeth N

    2006-03-17

    In 1996, Health & Safety introduced to Johnson & Johnson an incident investigation process called Learning to Look. This process provides a systematic way of analyzing work-related injuries and illness, uncovers root causes that lead to system defects, and points to viable solutions. The process involves three steps: investigation and reporting of the incident, determination of root cause, and development and implementation of a corrective action plan. The process requires the investigators to provide an initial communication of work-related serious injuries and illnesses as well as lost-workday cases to corporate headquarters within 72 h of the incident, with a full investigative report to follow within 10 days. A full investigation requires a written report, a cause-result logic diagram (CRLD), a corrective action plan (CAP) and a report of incident costs (SafeCost), all to be filed electronically. It is incumbent on the principal investigator and his or her investigative team to assemble the various parts of the investigation and to follow up with the relevant parties to ensure corrective actions are implemented and a full report is submitted to corporate executives. Initial review of the system revealed that the process was not working as designed. A number of reports were late, not signed by the business leaders, and in some instances not all causes were identified. Process excellence was the approach used to study the issue. The team used the Six Sigma DMAI2C methodology to identify and implement system improvements. The project examined the breakdown of the critical aspects of the reporting and investigation process that led to system errors. This report discusses the study findings, recommended improvements, and the methods used to monitor the new, improved process.

  9. Does influenza vaccination improve pregnancy outcome? Methodological issues and research needs.

    Science.gov (United States)

    Savitz, David A; Fell, Deshayne B; Ortiz, Justin R; Bhat, Niranjan

    2015-11-25

    Evidence that influenza vaccination during pregnancy is safe and effective at preventing influenza disease in women and their children through the first months of life is increasing. Several reports of reduced risk of adverse outcomes associated with influenza vaccination have generated interest in its potential for improving pregnancy outcome. Gavi, the Vaccine Alliance, estimates maternal influenza immunization programs in low-income countries would have a relatively modest impact on mortality compared to other new or under-utilized vaccines, however the impact would be substantially greater if reported vaccine effects on improved pregnancy outcomes were accurate. Here, we examine the available evidence and methodological issues bearing on the relationship between influenza vaccination and pregnancy outcome, particularly preterm birth and fetal growth restriction, and summarize research needs. Evidence for absence of harm associated with vaccination at a point in time is not symmetric with evidence of benefit, given the scenario in which vaccination reduces risk of influenza disease and, in turn, risk of adverse pregnancy outcome. The empirical evidence for vaccination preventing influenza in pregnant women is strong, but the evidence that influenza itself causes adverse pregnancy outcomes is inconsistent and limited in quality. Studies of vaccination and pregnancy outcome have produced mixed evidence of potential benefit but are limited in terms of influenza disease assessment and control of confounding, and their analytic methods often fail to fully address the longitudinal nature of pregnancy and influenza prevalence. We recommend making full use of results of randomized trials, re-analysis of existing observational studies to account for confounding and time-related factors, and quantitative assessment of the potential benefits of vaccination in improving pregnancy outcome, all of which should be informed by the collective engagement of experts in influenza

  10. On Measuring the Criticality of Various Variables and Processes in Organization Information Systems: Proposed Methodological Procedure

    Directory of Open Access Journals (Sweden)

    Jagdish PATHAK

    2010-01-01

    Full Text Available This paper proposes methodological procedures to be used by accounting, organizational and managerial researchers and executives to ascertain the criticality of variables and processes in the measurement of management control systems. We have restricted the validation of the proposed methods to the extraction of critical success factors (CSFs) in this study. We also provide a numerical illustration and test our methodological procedures using a dataset from an empirical study conducted for the purpose of ascertaining CSFs. The proposed methods can be used by researchers in accounting, organizational information systems, economics, and business, as well as in other relevant disciplines of the organizational sciences. The main contribution of this paper is the extension of Rockart's work [33] on critical success factors. We have extended the theory of CSFs beyond the initially suggested domain of information into management control system decision making. The methodological procedures developed here are expected to enrich the literature of analytical and empirical studies in accounting and organizational areas, where they can prove helpful in understanding the criticality of individual variables, processes, methods or success factors.

  11. Aerosol classification using airborne High Spectral Resolution Lidar measurementsmethodology and examples

    Directory of Open Access Journals (Sweden)

    S. P. Burton

    2012-01-01

    Full Text Available The NASA Langley Research Center (LaRC) airborne High Spectral Resolution Lidar (HSRL) on the NASA B200 aircraft has acquired extensive datasets of aerosol extinction (532 nm), aerosol optical depth (AOD, 532 nm), backscatter (532 and 1064 nm), and depolarization (532 and 1064 nm) profiles during 18 field missions that have been conducted over North America since 2006. The lidar measurements of aerosol intensive parameters (lidar ratio, depolarization, backscatter color ratio, and spectral depolarization ratio) are shown to vary with location and aerosol type. A methodology based on observations of known aerosol types is used to qualitatively classify the extensive set of HSRL aerosol measurements into eight separate types. Several examples are presented showing how the aerosol intensive parameters vary with aerosol type and how these aerosols are classified according to this new methodology. The HSRL-based classification reveals vertical variability of aerosol types during the NASA ARCTAS field experiment conducted over Alaska and northwest Canada during 2008. In two examples derived from flights conducted during ARCTAS, the HSRL classification of biomass burning smoke is shown to be consistent with aerosol types derived from coincident airborne in situ measurements of particle size and composition. The HSRL retrievals of AOD and inferences of aerosol types are used to apportion AOD to aerosol type; results of this analysis are shown for several experiments.

  12. Measuring attitudes in the self-employment intention model: methodological considerations

    Directory of Open Access Journals (Sweden)

    Josipa Mijoč

    2016-12-01

    Full Text Available The paper is based on a statistical model whose construction requires determining the independent variables. To determine the predictive ability of different approaches to measuring the independent variables, the paper provides an overview of theoretical and research approaches to the research problem. The purpose of the study is to analyze the predictive power of instruments measuring attitudes toward self-employment, one of the most significant predictors of career choice according to the theory of planned behavior. The paper juxtaposes two different approaches to measuring attitudes toward self-employment. The first approach is based on behavioral beliefs that produce favorable or unfavorable attitudes toward a self-employed career and considers two opposing options: pursuing a self-employed career or accepting a job position (working for an employer). In this context, developing a measurement construct is a multistep process that requires testing the psychometric characteristics of the proposed measures against predefined theoretical and empirical dimensions. The second approach incorporates aggregate measures of attitude toward self-employment, in which the predictor variable is assessed from only one perspective, without taking other career options into account. By means of multiple regression analysis, the paper compares both measurement approaches and their performance in explaining the dependent variable (self-employment intention). The predictive power of the model is used as the criterion for selecting a measurement approach that can serve as a methodological framework for prospective studies of attitudes toward certain behavior.

  13. Dielectric Barrier Discharge (DBD) Plasma Actuators Thrust-Measurement Methodology Incorporating New Anti-Thrust Hypothesis

    Science.gov (United States)

    Ashpis, David E.; Laun, Matthew C.

    2014-01-01

    We discuss thrust measurements of Dielectric Barrier Discharge (DBD) plasma actuator devices used for aerodynamic active flow control. After a review of our experience with conventional thrust measurement and the significant non-repeatability of the results, we devised a suspended actuator test setup, and now present a methodology of thrust measurement with decreased uncertainty. The methodology consists of frequency scans at constant voltages. The procedure consists of increasing the frequency step-wise from several Hz to the maximum frequency of several kHz, followed by a frequency decrease back down to the start frequency of several Hz. This sequence is performed first at the highest voltage of interest, then repeated at lower voltages. The data in the descending frequency direction are more consistent and were selected for reporting. Sample results show a strong dependence of thrust on humidity, which also affects the consistency and fluctuations of the measurements. We also observed negative values of thrust, or "anti-thrust", at low frequencies from 4 Hz up to 64 Hz. The anti-thrust is proportional to the mean-squared voltage and is frequency independent. Departures from the parabolic anti-thrust curve are correlated with the appearance of visible plasma discharges. We propose the anti-thrust hypothesis. It states that the measured thrust is the sum of plasma thrust and anti-thrust, and assumes that the anti-thrust exists at all frequencies and voltages. The anti-thrust depends on actuator geometry and materials and on the test installation. It enables the separation of the plasma thrust from the measured total thrust. This approach enables more meaningful comparisons between actuators at different installations and laboratories. The dependence on test installation was validated by surrounding the actuator with a large-diameter, grounded metal sleeve.
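    Under the stated hypothesis, the measured thrust is the sum of plasma thrust and a frequency-independent anti-thrust proportional to the mean-squared voltage. A minimal sketch of the implied separation step, with invented numbers rather than the paper's data:

```python
# Fit the anti-thrust component T_anti = -k * V_rms**2 from low-frequency
# points (no visible plasma), then subtract it from the total measured thrust.
import numpy as np

# Low-frequency measurements: thrust vs rms voltage (illustrative values).
v_rms_lo = np.array([2.0, 4.0, 6.0, 8.0])        # kV
thrust_lo = np.array([-0.8, -3.1, -7.2, -12.9])  # mN, anti-thrust only

# Least-squares slope of thrust vs V_rms**2 gives -k.
k = -np.linalg.lstsq((v_rms_lo**2)[:, None], thrust_lo, rcond=None)[0][0]

# High-frequency operating point: remove the anti-thrust contribution.
v_rms, total_thrust = 8.0, 10.0                  # kV, mN measured at several kHz
plasma_thrust = total_thrust - (-k * v_rms**2)
print(f"k={k:.3f} mN/kV^2, plasma thrust ~ {plasma_thrust:.1f} mN")
```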

  14. Confirming, Classifying, and Prioritizing Needed Over-the-Bed Table Improvements via Methodological Triangulation.

    Science.gov (United States)

    Manganelli, Joe; Threatt, Anthony; Brooks, Johnell O; Healy, Stan; Merino, Jessica; Yanik, Paul; Walker, Ian; Green, Keith

    2014-01-01

    This article presents the results of a qualitative study that confirmed, classified, and prioritized user needs for the design of a more useful, usable, and actively assistive over-the-bed table. Manganelli et al. (2014) generated a list of 74 needs for use in developing an actively assistive over-the-bed table. The present study assesses the value and importance of those needs. Fourteen healthcare subject matter experts and eight research and design subject matter experts engaged in a participatory and iterative research and design process. A mixed-methods qualitative approach used methodological triangulation to confirm the value of the findings and ratings to establish importance. Open and closed card sorts and a Delphi study were used. Data analysis methods included frequency analysis, content analysis, and a modified Kano analysis. A table was produced demonstrating the needs that are of high importance to both groups of subject matter experts and classifying the design challenges each represents. Through this process, the list of 74 needs was refined to the 37 need statements most important to both groups. Designing a more useful, usable, and actively assistive over-the-bed table is primarily about the ability to position it optimally with respect to the user for any task, as well as improving ease of use and usability. It is also important to make explicit and discuss the differences in priorities and perspectives between research and design teams and their clients. © 2014 Vendome Group, LLC.

  15. Improvement of Bacillus thuringiensis bioinsecticide production by sporeless and sporulating strains using response surface methodology.

    Science.gov (United States)

    Ben Khedher, Saoussen; Kamoun, Amel; Jaoua, Samir; Zouari, Nabil

    2011-10-01

    Statistical experimental designs, involving a Plackett-Burman design followed by a rotatable central composite design, were used to optimize the culture medium constituents for Bacillus thuringiensis bioinsecticide production. This was carried out first using an asporogenic strain and then extrapolated to several sporeless and sporulating strains. Initial screening of production parameters was performed and the variables with statistically significant effects on delta-endotoxin production were identified: glucose, glycerol, yeast extract and MnSO(4). These variables were selected for further optimization by response surface methodology. The results revealed that the optimum culture medium for delta-endotoxin production consists of 22.5 g/l of glucose, 4.8 g/l of glycerol, 5.8 g/l of yeast extract and 0.008 g/l of MnSO(4). Under these conditions, delta-endotoxin production was 2,130 and 2,260 mg/l in 250 and 1,000 ml flasks respectively, representing more than a 38% improvement in toxin production over the basal medium (1,636 mg/l). This medium composition was shown to be suitable for overproducing delta-endotoxins in sporeless and sporulating strains. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. POSSIBILITY OF IMPROVING EXISTING STANDARDS AND METHODOLOGIES FOR AUDITING INFORMATION SYSTEMS TO PROVIDE E-GOVERNMENT SERVICES

    Directory of Open Access Journals (Sweden)

    Евгений Геннадьевич Панкратов

    2014-03-01

    Full Text Available This article analyzes the existing methods for auditing e-government systems and examines their shortcomings. Approaches to improving existing techniques and adapting them to the specific characteristics of e-government systems are suggested. The paper describes a methodology that provides for an integrated assessment of information systems. This methodology uses systems maturity models and can be applied in the construction of e-government rankings, as well as in the audit of their implementation process. The maturity models are based on the COBIT and COSO methodologies and on the e-government models developed by the relevant UN committee. The methodology was tested during the audit of information systems involved in the payment of temporary disability benefits. The audit was carried out as part of the analysis of the outcome of the pilot project for the abolition of the principle of crediting payments for disability benefits. DOI: http://dx.doi.org/10.12731/2218-7405-2014-2-5

  17. Improving Mathematics Performance among Secondary Students with EBD: A Methodological Review

    Science.gov (United States)

    Mulcahy, Candace A.; Krezmien, Michael P.; Travers, Jason

    2016-01-01

    In this methodological review, the authors apply special education research quality indicators and standards for single case design to analyze mathematics intervention studies for secondary students with emotional and behavioral disorders (EBD). A systematic methodological review of literature from 1975 to December 2012 yielded 19 articles that…

  18. Improving the Quality of Experience Journals: Training Educational Psychology Students in Basic Qualitative Methodology

    Science.gov (United States)

    Reynolds-Keefer, Laura

    2010-01-01

    This study evaluates the impact of teaching basic qualitative methodology to preservice teachers enrolled in an educational psychology course on the quality of their observation journals. Preservice teachers enrolled in an educational psychology course requiring 45 hr of field experience were given qualitative methodological training as a part of the…

  19. Deception detection with behavioral, autonomic, and neural measures: Conceptual and methodological considerations that warrant modesty.

    Science.gov (United States)

    Meijer, Ewout H; Verschuere, Bruno; Gamer, Matthias; Merckelbach, Harald; Ben-Shakhar, Gershon

    2016-05-01

    The detection of deception has attracted increased attention among psychological researchers, legal scholars, and ethicists during the last decade. Much of this has been driven by the possibility of using neuroimaging techniques for lie detection. Yet, neuroimaging studies addressing deception detection are clouded by lack of conceptual clarity and a host of methodological problems that are not unique to neuroimaging. We review the various research paradigms and the dependent measures that have been adopted to study deception and its detection. In doing so, we differentiate between basic research designed to shed light on the neurocognitive mechanisms underlying deceptive behavior and applied research aimed at detecting lies. We also stress the distinction between paradigms attempting to detect deception directly and those attempting to establish involvement by detecting crime-related knowledge, and discuss the methodological difficulties and threats to validity associated with each paradigm. Our conclusion is that the main challenge of future research is to find paradigms that can isolate cognitive factors associated with deception, rather than the discovery of a unique (brain) correlate of lying. We argue that the Comparison Question Test currently applied in many countries has weak scientific validity, which cannot be remedied by using neuroimaging measures. Other paradigms are promising, but the absence of data from ecologically valid studies poses a challenge for legal admissibility of their outcomes.

  20. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using an econometric analysis, the Granger causality test, on 45 data points. However, the causality models were found to be insufficiently established, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for situations where historical data are insufficient. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts over 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Both methods revealed the existence of bidirectional causality, demonstrating significant dynamic environmental complexity through interaction among measures. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity, and extreme-condition tests were conducted on the developed SD model to ensure its ability to mimic reality, its robustness, and its validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using SD methodology, where very limited work has been done.
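    A minimal sketch of the Granger-causality step using statsmodels, on two simulated series of 45 points in which one measure leads the other; the column order convention and the lag choice below are illustrative:

```python
# Pairwise Granger causality test: does x Granger-cause y? Simulated series
# of 45 points, matching the number of data points mentioned in the abstract.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(5)
n = 45
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):                     # y depends on lagged x -> x "causes" y
    y[t] = 0.7 * x[t - 1] + rng.normal(scale=0.3)

# statsmodels tests whether the SECOND column Granger-causes the FIRST.
data = np.column_stack([y, x])
res = grangercausalitytests(data, maxlag=2, verbose=False)
for lag, (tests, _) in res.items():
    f_stat, p_value = tests["ssr_ftest"][:2]
    print(f"lag {lag}: F={f_stat:.2f}, p={p_value:.4f}")
```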

  1. An integrated measurement and modeling methodology for estuarine water quality management

    Institute of Scientific and Technical Information of China (English)

    Michael Hartnett; Stephen Nash

    2015-01-01

    This paper describes research undertaken by the authors to develop an integrated measurement and modeling methodology for water quality management of estuaries. The approach developed utilizes modeling and measurement results in a synergistic manner. Modeling results were initially used to inform the field campaign of appropriate sampling locations and times, and field data were used to develop accurate models. Remote sensing techniques were used to capture data for both model development and model validation. Field surveys were undertaken to provide model initial conditions through data assimilation and determine nutrient fluxes into the model domain. From field data, salinity relationships were developed with various water quality parameters, and relationships between chlorophyll a concentrations, transparency, and light attenuation were also developed. These relationships proved to be invaluable in model development, particularly in modeling the growth and decay of chlorophyll a. Cork Harbour, an estuary that regularly experiences summer algal blooms due to anthropogenic sources of nutrients, was used as a case study to develop the methodology. The integration of remote sensing, conventional fieldwork, and modeling is one of the novel aspects of this research and the approach developed has widespread applicability.

  2. Improvements for Optics Measurement and Corrections software

    CERN Document Server

    Bach, T

    2013-01-01

    This note presents the improvements made to the OMC software during a 14-month technical student internship at CERN. The goal of the work was to improve existing software in terms of maintainability, features and performance. Significant improvements in stability, speed and the overall development process were achieved. The main software, a Java GUI at the LHC CCC, ran for months without noteworthy problems. The overall running time of the software chain used for optics corrections was reduced from nearly half an hour to around two minutes. This was the result of analysing and improving several of the programs and algorithms involved.

  3. Intercomparison of magnetic field measurements near MV/LV transformer substations: methodological planning and results.

    Science.gov (United States)

    Violanti, S; Fraschetta, M; Adda, S; Caputo, E

    2009-12-01

    Within the framework of the Environmental Agencies system's activities, coordinated by ISPRA (the superior institute for environmental protection and research), an intercomparison of measurements was designed and carried out in order to examine measurement problems in depth and to evaluate magnetic fields at power frequencies. The measurements were taken near medium voltage/low voltage transformer substations. The project was developed with the contribution of several experts from different Regional Agencies. In three of these regions, substations with specific international standard characteristics were chosen; a measurement and data analysis protocol was then drawn up. Data analysis showed a good level of coherence among the results obtained by the different laboratories. However, a range of problems emerged, both during the preparation of the protocol and the definition of the data analysis procedure, and during the execution of the measurements and data reprocessing, owing to the spatial and temporal variability of the magnetic field. These problems are of particular interest for establishing a correct measurement methodology whose purpose is comparison with exposure limits, attention values and quality targets.

  4. [Methodological issues in the measurement of alcohol consumption: the importance of drinking patterns].

    Science.gov (United States)

    Valencia Martín, José L; González, M José; Galán, Iñaki

    2014-08-01

    Measurement of alcohol consumption is essential for proper investigation of its effects on health. However, its estimation is extremely complex, because of the diversity of forms of alcohol consumption and their highly heterogeneous classification. Moreover, each form may have different effects on health; therefore, not considering the most important drinking patterns when estimating alcohol intake could mask the important role of consumption patterns in these effects. All these issues make it very difficult to compare the results of different studies and to establish consistent associations for understanding the true effects of alcohol consumption, both overall and specific to each drinking pattern. This article reviews the main methods and sources of information available in Spain for estimating the most important aspects of alcohol consumption, as well as the most frequent methodological problems encountered in the measurement and classification of drinking patterns.

  5. [Methodology and Implementation of Forced Oscillation Technique for Respiratory Mechanics Measurement].

    Science.gov (United States)

    Zhang, Zhengbo; Ni, Lu; Liu, Xiaoli; Li, Deyu; Wang, Weidong

    2015-11-01

    The forced oscillation technique (FOT) is a noninvasive method for respiratory mechanics measurement. In the FOT, external signals (e.g. forced oscillations around 4-40 Hz) are used to drive the respiratory system, and the mechanical characteristics of the respiratory system can be determined using linear system identification theory. Respiratory mechanical properties and components at different frequencies and locations in the airway can thus be explored by specifically developed forcing waveforms. In this paper, the theory, methodology and clinical application of the FOT are reviewed, including measurement theory, driving signals, models of the respiratory system, algorithms for impedance identification, and requirements on the apparatus. Finally, the future development of this technique is also discussed.
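
    As a concrete illustration of the impedance identification step, respiratory impedance Z(f) = P(f)/Q(f) is commonly estimated from measured pressure and flow with a cross-spectral (H1) estimator; the following is a minimal sketch under that assumption, where the sampling rate, segment length and placeholder signals are illustrative and not taken from the paper:

```python
# Sketch: estimate respiratory impedance Z(f) from airway pressure p(t)
# and flow q(t) using the H1 cross-spectral estimator, Z = S_qp / S_qq.
import numpy as np
from scipy.signal import csd, welch

fs = 200.0                           # sampling rate in Hz (assumed)
t = np.arange(0, 30, 1 / fs)
q = np.random.randn(t.size)          # placeholder flow signal (L/s)
p = np.random.randn(t.size)          # placeholder pressure signal (cmH2O)

f, S_qp = csd(q, p, fs=fs, nperseg=1024)   # cross-spectrum, flow -> pressure
_, S_qq = welch(q, fs=fs, nperseg=1024)    # flow auto-spectrum

Z = S_qp / S_qq                      # complex impedance (cmH2O.s/L)
band = (f >= 4) & (f <= 40)          # the typical FOT band mentioned above
resistance = Z[band].real            # in-phase component
reactance = Z[band].imag             # out-of-phase component
```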

  6. Improved measurement methods for railway rolling noise

    NARCIS (Netherlands)

    Dittrich, M.G.; Janssens, M.H.A.

    2000-01-01

    Some of the issues related to railway noise type testing are discussed and potential improvements to existing procedures are put forward. New and improved methods that also go beyond the scope of type testing are presented that help to characterize and analyze rolling noise more accurately.

  7. Ultrasonic particle image velocimetry for improved flow gradient imaging: algorithms, methodology and validation.

    Science.gov (United States)

    Niu, Lili; Qian, Ming; Wan, Kun; Yu, Wentao; Jin, Qiaofeng; Ling, Tao; Gao, Shen; Zheng, Hairong

    2010-04-01

    This paper presents a new algorithm for ultrasonic particle image velocimetry (Echo PIV) that improves flow velocity measurement accuracy and efficiency in regions with high velocity gradients. The conventional Echo PIV algorithm has been modified by incorporating a multiple-iteration algorithm, a sub-pixel method, filtering and interpolation, and a spurious vector elimination algorithm. The new algorithm's performance is assessed by analyzing simulated images with known displacements, and ultrasonic B-mode images of in vitro laminar pipe flow, rotational flow and in vivo rat carotid arterial flow. Results for the simulated images show that the new algorithm produces a much smaller bias from the known displacements. For laminar flow, the new algorithm results in a 1.1% deviation from the analytically derived value, versus 8.8% for the conventional algorithm. The vector quality evaluation for the rotational flow imaging shows that the new algorithm produces better velocity vectors. For in vivo rat carotid arterial flow imaging, results from the new algorithm deviate on average by 6.6% from the Doppler-measured peak velocities, compared to 15% for the conventional algorithm. The new Echo PIV algorithm is thus able to effectively improve measurement accuracy when imaging flow fields with high velocity gradients.
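
    The core of any PIV algorithm, including the sub-pixel refinement mentioned above, is cross-correlation of interrogation windows between consecutive frames; the sketch below shows FFT-based correlation with a three-point parabolic sub-pixel peak fit, where the window size and placeholder frames are illustrative assumptions rather than the paper's Echo PIV implementation:

```python
# Sketch: displacement of one interrogation window between two frames
# via FFT cross-correlation, refined with a parabolic sub-pixel fit.
import numpy as np

def window_displacement(win_a, win_b):
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # Circular cross-correlation via FFT; fftshift puts zero lag at the centre.
    corr = np.fft.fftshift(np.real(np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b))))
    iy, ix = np.unravel_index(np.argmax(corr), corr.shape)

    def subpixel(c_m, c_0, c_p):
        # Three-point parabolic interpolation around the correlation peak.
        denom = c_m - 2 * c_0 + c_p
        return 0.0 if denom == 0 else 0.5 * (c_m - c_p) / denom

    dy = iy - corr.shape[0] // 2 + subpixel(corr[iy - 1, ix], corr[iy, ix], corr[iy + 1, ix])
    dx = ix - corr.shape[1] // 2 + subpixel(corr[iy, ix - 1], corr[iy, ix], corr[iy, ix + 1])
    return dx, dy

frame_a = np.random.rand(64, 64)                 # placeholder B-mode window
frame_b = np.roll(frame_a, (3, 5), axis=(0, 1))  # shifted copy: known answer (5, 3)
print(window_displacement(frame_a, frame_b))
```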

  8. An Improved Methodology to Overcome Key Issues in Human Fecal Metagenomic DNA Extraction

    Directory of Open Access Journals (Sweden)

    Jitendra Kumar

    2016-12-01

    Full Text Available Microbes are ubiquitously distributed in nature, and recent culture-independent studies have highlighted the significance of gut microbiota in human health and disease. Fecal DNA is the primary source for the majority of human gut microbiome studies. However, further improvement is needed to obtain fecal metagenomic DNA with sufficient amount and good quality but low host genomic DNA contamination. In the current study, we demonstrate a quick, robust, unbiased, and cost-effective method for the isolation of high molecular weight (>23 kb) metagenomic DNA (260/280 ratio >1.8) with a good yield (55.8 ± 3.8 ng/mg of feces). We also confirm that there is very low human genomic DNA contamination (eubacterial:human genomic DNA marker genes = 2^27.9:1) in the human feces. The newly-developed method robustly performs for fresh as well as stored fecal samples as demonstrated by 16S rRNA gene sequencing using 454 FLX+. Moreover, 16S rRNA gene analysis indicated that compared to other DNA extraction methods tested, the fecal metagenomic DNA isolated with current methodology retains species richness and does not show microbial diversity biases, which is further confirmed by qPCR with a known quantity of spike-in genomes. Overall, our data highlight a protocol with a balance between quality, amount, user-friendliness, and cost effectiveness for its suitability toward usage for culture-independent analysis of the human gut microbiome, which provides a robust solution to overcome key issues associated with fecal metagenomic DNA isolation in human gut microbiome studies.

  9. An Improved Methodology to Overcome Key Issues in Human Fecal Metagenomic DNA Extraction.

    Science.gov (United States)

    Kumar, Jitendra; Kumar, Manoj; Gupta, Shashank; Ahmed, Vasim; Bhambi, Manu; Pandey, Rajesh; Chauhan, Nar Singh

    2016-12-01

    Microbes are ubiquitously distributed in nature, and recent culture-independent studies have highlighted the significance of gut microbiota in human health and disease. Fecal DNA is the primary source for the majority of human gut microbiome studies. However, further improvement is needed to obtain fecal metagenomic DNA with sufficient amount and good quality but low host genomic DNA contamination. In the current study, we demonstrate a quick, robust, unbiased, and cost-effective method for the isolation of high molecular weight (>23 kb) metagenomic DNA (260/280 ratio >1.8) with a good yield (55.8 ± 3.8 ng/mg of feces). We also confirm that there is very low human genomic DNA contamination (eubacterial:human genomic DNA marker genes = 2^27.9:1) in the human feces. The newly-developed method robustly performs for fresh as well as stored fecal samples as demonstrated by 16S rRNA gene sequencing using 454 FLX+. Moreover, 16S rRNA gene analysis indicated that compared to other DNA extraction methods tested, the fecal metagenomic DNA isolated with current methodology retains species richness and does not show microbial diversity biases, which is further confirmed by qPCR with a known quantity of spike-in genomes. Overall, our data highlight a protocol with a balance between quality, amount, user-friendliness, and cost effectiveness for its suitability toward usage for culture-independent analysis of the human gut microbiome, which provides a robust solution to overcome key issues associated with fecal metagenomic DNA isolation in human gut microbiome studies. Copyright © 2016 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.

  10. Community-wide assessment of protein-interface modeling suggests improvements to design methodology.

    Science.gov (United States)

    Fleishman, Sarel J; Whitehead, Timothy A; Strauch, Eva-Maria; Corn, Jacob E; Qin, Sanbo; Zhou, Huan-Xiang; Mitchell, Julie C; Demerdash, Omar N A; Takeda-Shitaka, Mayuko; Terashi, Genki; Moal, Iain H; Li, Xiaofan; Bates, Paul A; Zacharias, Martin; Park, Hahnbeom; Ko, Jun-su; Lee, Hasup; Seok, Chaok; Bourquard, Thomas; Bernauer, Julie; Poupon, Anne; Azé, Jérôme; Soner, Seren; Ovali, Sefik Kerem; Ozbek, Pemra; Tal, Nir Ben; Haliloglu, Türkan; Hwang, Howook; Vreven, Thom; Pierce, Brian G; Weng, Zhiping; Pérez-Cano, Laura; Pons, Carles; Fernández-Recio, Juan; Jiang, Fan; Yang, Feng; Gong, Xinqi; Cao, Libin; Xu, Xianjin; Liu, Bin; Wang, Panwen; Li, Chunhua; Wang, Cunxin; Robert, Charles H; Guharoy, Mainak; Liu, Shiyong; Huang, Yangyu; Li, Lin; Guo, Dachuan; Chen, Ying; Xiao, Yi; London, Nir; Itzhaki, Zohar; Schueler-Furman, Ora; Inbar, Yuval; Potapov, Vladimir; Cohen, Mati; Schreiber, Gideon; Tsuchiya, Yuko; Kanamori, Eiji; Standley, Daron M; Nakamura, Haruki; Kinoshita, Kengo; Driggers, Camden M; Hall, Robert G; Morgan, Jessica L; Hsu, Victor L; Zhan, Jian; Yang, Yuedong; Zhou, Yaoqi; Kastritis, Panagiotis L; Bonvin, Alexandre M J J; Zhang, Weiyi; Camacho, Carlos J; Kilambi, Krishna P; Sircar, Aroop; Gray, Jeffrey J; Ohue, Masahito; Uchikoga, Nobuyuki; Matsuzaki, Yuri; Ishida, Takashi; Akiyama, Yutaka; Khashan, Raed; Bush, Stephen; Fouches, Denis; Tropsha, Alexander; Esquivel-Rodríguez, Juan; Kihara, Daisuke; Stranges, P Benjamin; Jacak, Ron; Kuhlman, Brian; Huang, Sheng-You; Zou, Xiaoqin; Wodak, Shoshana J; Janin, Joel; Baker, David

    2011-11-25

    The CAPRI (Critical Assessment of Predicted Interactions) and CASP (Critical Assessment of protein Structure Prediction) experiments have demonstrated the power of community-wide tests of methodology in assessing the current state of the art and spurring progress in the very challenging areas of protein docking and structure prediction. We sought to bring the power of community-wide experiments to bear on a very challenging protein design problem that provides a complementary but equally fundamental test of current understanding of protein-binding thermodynamics. We have generated a number of designed protein-protein interfaces with very favorable computed binding energies but which do not appear to be formed in experiments, suggesting that there may be important physical chemistry missing in the energy calculations. A total of 28 research groups took up the challenge of determining what is missing: we provided structures of 87 designed complexes and 120 naturally occurring complexes and asked participants to identify energetic contributions and/or structural features that distinguish between the two sets. The community found that electrostatics and solvation terms partially distinguish the designs from the natural complexes, largely due to the nonpolar character of the designed interactions. Beyond this polarity difference, the community found that the designed binding surfaces were, on average, structurally less embedded in the designed monomers, suggesting that backbone conformational rigidity at the designed surface is important for realization of the designed function. These results can be used to improve computational design strategies, but there is still much to be learned; for example, one designed complex, which does form in experiments, was classified by all metrics as a nonbinder.

  11. Using Lean Six Sigma Methodology to Improve a Mass Immunizations Process at the United States Naval Academy.

    Science.gov (United States)

    Ha, Chrysanthy; McCoy, Donald A; Taylor, Christopher B; Kirk, Kayla D; Fry, Robert S; Modi, Jitendrakumar R

    2016-06-01

    Lean Six Sigma (LSS) is a process improvement methodology developed in the manufacturing industry to increase process efficiency while maintaining product quality. The efficacy of LSS application to the health care setting has not been adequately studied. This article presents a quality improvement project at the U.S. Naval Academy that uses LSS to improve the mass immunizations process for Midshipmen during in-processing. The process was standardized to give all vaccinations at one station instead of giving a different vaccination at each station. After project implementation, the average immunization lead time decreased by 79% and staffing decreased by 10%. The process was shown to be in control, with a capability index of 1.18 and a performance index of 1.10, resulting in a defect rate of 0.04%. This project demonstrates that the LSS methodology can be applied successfully to the health care setting to make sustainable process improvements if used correctly and completely.
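
    The capability and performance indices quoted above are conventionally computed as the specification width divided by six process standard deviations, with the capability (short-term) version using a within-subgroup sigma and the performance (long-term) version using the overall sigma; the sketch below follows that convention, with hypothetical specification limits and data rather than the project's own:

```python
# Sketch: process capability (Cp) and performance (Pp) indices for a
# lead-time measure, using within-subgroup vs overall standard deviation.
import numpy as np

lead_times = np.random.normal(loc=12.0, scale=1.5, size=200)  # minutes (placeholder)
lsl, usl = 6.0, 18.0                     # hypothetical spec limits (minutes)

sigma_overall = lead_times.std(ddof=1)   # long-term (performance) sigma
# Short-term sigma from the average moving range; d2 = 1.128 for n = 2.
moving_range = np.abs(np.diff(lead_times))
sigma_within = moving_range.mean() / 1.128

cp = (usl - lsl) / (6 * sigma_within)    # capability index
pp = (usl - lsl) / (6 * sigma_overall)   # performance index
print(f"Cp = {cp:.2f}, Pp = {pp:.2f}")
```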

  12. Reduction of Complications of Local Anaesthesia in Dental Healthcare Setups by Application of the Six Sigma Methodology: A Statistical Quality Improvement Technique.

    Science.gov (United States)

    Akifuddin, Syed; Khatoon, Farheen

    2015-12-01

    Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures, and to reduce the incidence of these complications by introduction of the Six Sigma methodology. The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma, together with failure mode and effect analysis, was used to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures. Pareto analysis was used to identify the most recurrent complications. A paired z-sample test using Minitab Statistical Inference and Fisher's exact test were used to statistically analyse the obtained data; the p-value indicated a statistically significant reduction in complications. The Six Sigma improvement methodology in healthcare tends to deliver consistently better results to patients as well as hospitals, and results in better patient compliance as well as satisfaction.

  13. IMPROVING PSYCHOMOTRICITY COMPONENTS IN PRESCHOOL CHILDREN USING TEACHING METHODOLOGIES BASED ON MIRROR NEURONS ACTIVATION

    National Research Council Canada - National Science Library

    Gáll Zs. Sz; Balint L

    2015-01-01

    .... Using this as a starting point, the study aims to work out and apply a methodology in keeping with the content of the psychomotor expression activities curriculum for preschool education, resorting...

  14. Measurement methodology of natural radioactivity in the thermal establishments; Methodologies de mesure de la radioactivite naturelle dans les etablissements thermaux

    Energy Technology Data Exchange (ETDEWEB)

    Ameon, R.; Robe, M.C

    2004-11-15

    Thermal baths have been identified as an activity liable to expose workers to ionizing radiation from natural sources of radon and radon-220. The new regulation obliges these facilities to carry out radioactivity measurements. The principal exposure pathways are inhalation of radon and its daughters, exposure to gamma radiation, and ingestion of radioelements in thermal waters. IRSN proposes two methods for measuring natural radioactivity in application of the regulation on the protection of the public and workers. Some principles for reducing exposure to radon are recalled. (N.C.)

  15. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    Science.gov (United States)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are …

  16. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  17. A Methodology to Integrate Magnetic Resonance and Acoustic Measurements for Reservoir Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Parra, Jorge O.; Hackert, Chris L.; Collier, Hughbert A.; Bennett, Michael

    2002-01-29

    The objective of this project was to develop an advanced imaging method, including pore scale imaging, to integrate NMR techniques and acoustic measurements to improve predictability of the pay zone in hydrocarbon reservoirs. This is accomplished by extracting the fluid property parameters using NMR laboratory measurements and the elastic parameters of the rock matrix from acoustic measurements to create poroelastic models of different parts of the reservoir. Laboratory measurement techniques and core imaging are being linked with a balanced petrographical analysis of the core and theoretical modeling.

  18. A Methodology to Integrate Magnetic Resonance and Acoustic Measurements for Reservoir Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Parra, Ph.D., Jorge O.

    2002-06-10

    The objective of the project was to develop an advanced imaging method, including pore scale imaging, to integrate nuclear magnetic resonance (NMR) techniques and acoustic measurements to improve predictability of the pay zone in hydrocarbon reservoirs. This will be accomplished by extracting the fluid property parameters using NMR laboratory measurements and the elastic parameters of the rock matrix from acoustic measurements to create poroelastic models of different parts of the reservoir. Laboratory measurement techniques and core imaging were linked with a balanced petrographical analysis of cores and theoretical modeling.

  20. Improving quality in population surveys of headache prevalence, burden and cost: key methodological considerations.

    Science.gov (United States)

    Steiner, Timothy J; Stovner, Lars Jacob; Al Jumah, Mohammed; Birbeck, Gretchen L; Gururaj, Gopalakrishna; Jensen, Rigmor; Katsarava, Zaza; Queiroz, Luiz Paulo; Scher, Ann I; Tekle-Haimanot, Redda; Wang, Shuu-Jiun; Martelletti, Paolo; Dua, Tarun; Chatterji, Somnath

    2013-01-01

    Population-based studies of headache disorders are important. They inform needs assessment and underpin service policy for a set of disorders that are a public-health priority. On the one hand, our knowledge of the global burden of headache is incomplete, with major geographical gaps; on the other, methodological differences and variable quality are notable among published studies of headache prevalence, burden and cost. The purpose here was to start the process of developing standardized and better methodology in these studies. An expert consensus group was assembled to identify the key methodological issues, and areas where studies might fail. Members had competence and practical experience in headache epidemiology or epidemiology in general, and were drawn from all WHO world regions. We reviewed the relevant literature, and supplemented the knowledge gathered from this exercise with experience gained from recent Global Campaign population-based studies, not all yet published. We extracted methodological themes and identified issues within them that were of key importance. We found wide variations in methodology. The themes within which methodological shortcomings had adverse impact on quality were the following: study design; selection and/or definition of population of interest; sampling and bias avoidance; sample size estimation; access to selected subjects (managing and reporting non-participation); case definition (including diagnosis and timeframe); case ascertainment (including diagnostic validation of questionnaires); burden estimation; reporting (methods and results). These are discussed.

  1. A comparison methodology for measured and predicted displacement fields in modal analysis

    Science.gov (United States)

    Sebastian, C. M.; López-Alba, E.; Patterson, E. A.

    2017-07-01

    Recent advances in experimental mechanics have enabled full-field measurements of deformation fields and - particularly in the field of solid mechanics - methodologies have been proposed for utilizing these fields in the validation of computational models. However, the comparison of modal shapes and of the path from the undeformed shape to the deformed shape at the extreme of a vibration cycle is not straightforward. Therefore a new method for comparing experimental vibration data to simulations is presented, which uses full-field experimental data from the entire cycle of vibration. Here, the first three modes of vibration of an aerospace panel were compared, covering a frequency range of 14-59 Hz and maximum out-of-plane displacements of 2 mm. Two different comparison methodologies are considered: the first is the use of confidence bands, previously explored for quasi-static loading; the second is the use of a concordance correlation coefficient, which provides quantifiable information about the validity of the simulation. In addition, three different simulation conditions were considered, representing a systematic refinement of the model. It was found that meaningful conclusions can be drawn about the simulation by comparing individual components of deformation from the image decomposition process, such as the relative phase and magnitude. It was ultimately found that the best performing model did not entirely fall within the confidence bounds for all conditions, but returned a concordance correlation coefficient of nearly 70% for all three modes.
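
    Lin's concordance correlation coefficient used in the second methodology is simple to compute from paired measured and predicted quantities; a minimal sketch follows, where the short arrays stand in for decomposed deformation descriptors and are purely illustrative:

```python
# Sketch: Lin's concordance correlation coefficient between measured and
# predicted descriptors, rho_c = 2*s_xy / (s_x^2 + s_y^2 + (mx - my)^2).
import numpy as np

def concordance_cc(x, y):
    mx, my = x.mean(), y.mean()
    sxy = ((x - mx) * (y - my)).mean()   # population covariance
    sx2, sy2 = x.var(), y.var()          # population variances
    return 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)

measured = np.array([0.8, 1.4, 1.9, 1.2, 0.4])   # placeholder shape descriptors
predicted = np.array([0.7, 1.5, 1.8, 1.1, 0.5])
print(round(concordance_cc(measured, predicted), 3))
```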

  2. Improving on daily measures of price discovery

    DEFF Research Database (Denmark)

    Dias, Gustavo Fruet; Fernandes, Marcelo; Scherrer, Cristina

    We formulate a continuous-time price discovery model in which the price discovery measure varies (stochastically) at daily frequency. We estimate daily measures of price discovery using a kernel-based OLS estimator instead of running separate daily VECM regressions as standard in the literature. ...

  3. Health Data Entanglement and artificial intelligence-based analysis: a brand new methodology to improve the effectiveness of healthcare services.

    Science.gov (United States)

    Capone, A; Cicchetti, A; Mennini, F S; Marcellusi, A; Baio, G; Favato, G

    2016-01-01

    Healthcare expenses will be the most relevant policy issue for most governments in the EU and in the USA. This expenditure can be associated with two major categories of drivers: demographic and economic. The factors driving healthcare expenditure have rarely been recognised, measured and comprehended. Improving health data generation and analysis is therefore mandatory, and in order to tackle healthcare spending growth it may be useful to design and implement an effective, advanced system to generate and analyse these data. A methodological approach relying on Health Data Entanglement (HDE) can be a suitable option. By definition, in the HDE a large number of data sets from several sources are functionally interconnected and computed through learning machines that generate patterns of highly probable future health conditions of a population. The entanglement concept is borrowed from quantum physics and means that multiple particles (information) are linked together in such a way that the measurement of one particle's quantum state (individual health conditions and related economic requirements) determines the possible quantum states of the other particles (population health forecasts used to predict their impact). The value created by the HDE is based on the combined evaluation of clinical, economic and social effects generated by health interventions. To predict the future health conditions of a population, data are analysed using self-learning AI, in which sequential decisions are based on Bayesian algorithmic probabilities. HDE and AI-based analysis can be adopted to improve the effectiveness of the health governance system in ways that also lead to better quality of care.

  4. Measuring domestic water use: a systematic review of methodologies that measure unmetered water use in low-income settings.

    Science.gov (United States)

    Tamason, Charlotte C; Bessias, Sophia; Villada, Adriana; Tulsiani, Suhella M; Ensink, Jeroen H J; Gurley, Emily S; Mackie Jensen, Peter Kjaer

    2016-11-01

    To present a systematic review of methods for measuring domestic water use in settings where water meters cannot be used. We systematically searched EMBASE, PubMed, Water Intelligence Online, Water Engineering and Development Center, IEEExplore, Scielo, and Science Direct databases for articles that reported methodologies for measuring water use at the household level where water metering infrastructure was absent or incomplete. A narrative review explored similarities and differences between the included studies and provided recommendations for future research on water use. A total of 21 studies were included in the review. Methods ranged from single-day to 14-consecutive-day visits, and water use recall periods ranged from 12 h to 7 days. Data were collected using questionnaires, observations or both. Many studies only collected information on water that was carried into the household, and some failed to mention whether water was used outside the home. Water use in the selected studies was found to range from 2 to 113 l per capita per day. No standardised methods for measuring unmetered water use were found, which brings into question the validity and comparability of studies that have measured unmetered water use. In future studies, it will be essential to define all components that make up water use and determine how they will be measured. A pre-study that involves observations and direct measurements during water collection periods (these will have to be determined through questioning) should be used to determine optimal methods for obtaining water use information in a survey. Day-to-day and seasonal variation should be included. A study that investigates water use recall is warranted to further develop standardised methods to measure water use; in the meantime, water use recall should be limited to 24 h or less. © 2016 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  5. A methodology for the measure of secondary homes tourist flows at municipal level

    Directory of Open Access Journals (Sweden)

    Andrea Guizzardi

    2007-10-01

    Full Text Available The present public statistical system does not provide information on second-home tourist flows at the sub-regional level. This lack limits local administrations' ability to take decisions about environmental, territorial and productive development, and regional governments' ability to allocate public funding fairly. In this work, this information gap is overcome by proposing an indirect estimation methodology. Municipal electric power consumption is proposed as an indicator of stays in second homes. The indicator is connected to tourism flows by considering both measurement errors and the factors that modify local power demand. The application to the Emilia-Romagna regional case allows verification of the results' coherence with official statistics, as well as assessment of municipalities' tourist vocation.

  6. Development and Attestation of Gamma-Ray Measurement Methodologies for use by Rostekhnadzor Inspectors in the Russian Federation

    Energy Technology Data Exchange (ETDEWEB)

    Jeff Sanders

    2006-09-01

    Development and attestation of gamma-ray non-destructive assay measurement methodologies for use by inspectors of the Russian Federal Service for Environmental, Technological, and Nuclear Oversight (Rostekhnadzor, formerly Gosatomnadzor or GAN), as well as for use by Russian nuclear facilities, has been completed. Specifically, a methodology utilizing the gamma-ray multi-group analysis (MGA) method for determining plutonium isotopic composition has been developed, while existing methodologies for determining uranium enrichment and isotopic composition have been revised to make them more appropriate to the material types and conditions present in nuclear facilities in the Russian Federation. This paper will discuss the development and revision of these methodologies, the metrological characteristics of the final methodologies, as well as the limitations and concerns specific to the utilization of these analysis methods in the Russian Federation.

  7. Forecasting future oil production in Norway and the UK: a general improved methodology

    CERN Document Server

    Fievet, Lucas; Cauwels, Peter; Sornette, Didier

    2014-01-01

    We present a new Monte-Carlo methodology to forecast the crude oil production of Norway and the U.K. based on a two-step process, (i) the nonlinear extrapolation of the current/past performances of individual oil fields and (ii) a stochastic model of the frequency of future oil field discoveries. Compared with the standard methodology that tends to underestimate remaining oil reserves, our method gives a better description of future oil production, as validated by our back-tests starting in 2008. Specifically, we predict remaining reserves extractable until 2030 to be 188 +/- 10 million barrels for Norway and 98 +/- 10 million barrels for the UK, which are respectively 45% and 66% above the predictions using the standard methodology.

  8. Improving Outcome Measures Other Than Achievement

    Directory of Open Access Journals (Sweden)

    Kristin Anderson Moore

    2015-05-01

    Full Text Available Research indicates that educational, economic, and life success reflect children’s nonacademic as well as academic competencies. Therefore, longitudinal surveys that assess educational progress and success need to incorporate nonacademic measures to avoid omitted variable bias, inform development of new intervention strategies, and support mediating and moderating analyses. Based on a life course model and a whole child perspective, this article suggests constructs in the domains of child health, emotional/psychological development, educational achievement/attainment, social behavior, and social relationships. Four critical constructs are highlighted: self-regulation, agency/motivation, persistence/diligence, and executive functioning. Other constructs that are currently measured need to be retained, including social skills, positive relationships, activities, positive behaviors, academic self-efficacy, educational engagement, and internalizing/emotional well-being. Examples of measures that are substantively and psychometrically robust are provided.

  9. Improving competitiveness through performance-measurement systems.

    Science.gov (United States)

    Stewart, L J; Lockamy, A

    2001-12-01

    Parallels exist between the competitive pressures felt by U.S. manufacturers over the past 30 years and those experienced by healthcare providers today. Increasing market deregulation, changing government policies, and growing consumerism have altered the healthcare arena. Responding to similar pressures, manufacturers adopted a strategic orientation driven by customer needs and expectations that led them to achieve high performance levels and surpass their competition. The adoption of integrated performance-measurement systems was instrumental in these firms' success. An integrated performance-measurement model for healthcare organizations can help to blend the organization's strategy with the demands of the contemporary healthcare environment. Performance-measurement systems encourage healthcare organizations to focus on their mission and vision by aligning their strategic objectives and resource-allocation decisions with customer requirements.

  10. Characteristics measurement methodology of the large-size autostereoscopic 3D LED display

    Science.gov (United States)

    An, Pengli; Su, Ping; Zhang, Changjie; Cao, Cong; Ma, Jianshe; Cao, Liangcai; Jin, Guofan

    2014-11-01

    Large-size autostereoscopic 3D LED displays are commonly used outdoors or in large indoor spaces, and are characterized by a long viewing distance and relatively low light intensity at that distance. The instruments used to measure the characteristics of such displays (crosstalk, inconsistency, chromatic dispersion, etc.) should therefore have a long working distance and high sensitivity. In this paper, we propose a measurement methodology based on a distribution photometer with a working distance of 5.76 m and an illumination sensitivity of 0.001 mlx. A display panel holder is fabricated and attached to the turning stage of the distribution photometer. Specific test images are loaded on the display separately, and the luminance data at a distance of 5.76 m from the panel are measured. The data are then transformed into the light intensity at the optimum viewing distance. According to the definitions of the characteristics of 3D displays, the crosstalk, inconsistency and chromatic dispersion can then be calculated. Test results and an analysis of the characteristics of an autostereoscopic 3D LED display are presented.
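
    For reference, stereoscopic crosstalk is conventionally defined as the ratio of leakage luminance to intended-signal luminance, each corrected for the black level; the sketch below computes that ratio, with placeholder luminance values rather than the paper's measurements:

```python
# Sketch: stereoscopic crosstalk from luminance measurements at one viewing
# position, using the common (leakage - black) / (signal - black) definition.
def crosstalk(l_unintended: float, l_intended: float, l_black: float) -> float:
    """All arguments in cd/m^2: luminance seen by one eye when only the
    opposite view is white (l_unintended), when only its own view is white
    (l_intended), and when both views are black (l_black)."""
    return (l_unintended - l_black) / (l_intended - l_black)

# Hypothetical values measured through the distribution photometer:
print(f"crosstalk = {crosstalk(2.3, 41.0, 0.5) * 100:.1f}%")
```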

  11. Moving to Capture Children's Attention: Developing a Methodology for Measuring Visuomotor Attention.

    Directory of Open Access Journals (Sweden)

    Liam J B Hill

    Full Text Available Attention underpins many activities integral to a child's development. However, methodological limitations currently make large-scale assessment of children's attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of 'Visual Motor Attention' (VMA) - a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method's core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults' attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action).

  12. Moving to Capture Children's Attention: Developing a Methodology for Measuring Visuomotor Attention.

    Science.gov (United States)

    Hill, Liam J B; Coats, Rachel O; Mushtaq, Faisal; Williams, Justin H G; Aucott, Lorna S; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child's development. However, methodological limitations currently make large-scale assessment of children's attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of 'Visual Motor Attention' (VMA)-a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method's core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults' attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action).

  13. An improved device to measure cottonseed strength

    Science.gov (United States)

    During processing, seeds of cotton cultivars with fragile seeds often break and produce seed coat fragments that can cause processing problems at textile mills. A cottonseed shear tester, previously developed to measure cottonseed strength, was modified with enhancements to the drive system to provi...

  14. Novel methodology for accurate resolution of fluid signatures from multi-dimensional NMR well-logging measurements

    Science.gov (United States)

    Anand, Vivek

    2017-03-01

    A novel methodology for accurate fluid characterization from multi-dimensional nuclear magnetic resonance (NMR) well-logging measurements is introduced. This methodology overcomes a fundamental challenge of poor resolution of features in multi-dimensional NMR distributions due to low signal-to-noise ratio (SNR) of well-logging measurements. Based on an unsupervised machine-learning concept of blind source separation, the methodology resolves fluid responses from simultaneous analysis of large quantities of well-logging data. The multi-dimensional NMR distributions from a well log are arranged in a database matrix that is expressed as the product of two non-negative matrices. The first matrix contains the unique fluid signatures, and the second matrix contains the relative contributions of the signatures for each measurement sample. No a priori information or subjective assumptions about the underlying features in the data are required. Furthermore, the dimensionality of the data is reduced by several orders of magnitude, which greatly simplifies the visualization and interpretation of the fluid signatures. Compared to traditional methods of NMR fluid characterization which only use the information content of a single measurement, the new methodology uses the orders-of-magnitude higher information content of the entire well log. Simulations show that the methodology can resolve accurate fluid responses in challenging SNR conditions. The application of the methodology to well-logging data from a heavy oil reservoir shows that individual fluid signatures of heavy oil, water associated with clays and water in interstitial pores can be accurately obtained.
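
    The decomposition described above - a non-negative database matrix factored into unique signatures and their relative contributions, with no a priori assumptions - matches non-negative matrix factorization; the following is a minimal sketch with scikit-learn, in which the matrix shape, component count and random data are illustrative assumptions rather than the paper's implementation:

```python
# Sketch: blind source separation of stacked multi-dimensional NMR
# distributions into non-negative fluid signatures (H) and weights (W).
import numpy as np
from sklearn.decomposition import NMF

n_samples, n_bins = 500, 1024          # hypothetical: depth levels x flattened 2-D map
X = np.random.rand(n_samples, n_bins)  # placeholder non-negative well-log distributions

model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)             # relative contribution of each signature per sample
H = model.components_                  # the fluid signatures themselves (e.g. heavy oil,
                                       # clay-bound water, interstitial pore water)

X_hat = W @ H                          # rank-3 reconstruction of the data
print("relative reconstruction error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```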

  15. Improved methodologies for continuous-flow analysis of stable water isotopes in ice cores

    Science.gov (United States)

    Jones, Tyler R.; White, James W. C.; Steig, Eric J.; Vaughn, Bruce H.; Morris, Valerie; Gkinis, Vasileios; Markle, Bradley R.; Schoenemann, Spruce W.

    2017-02-01

    Water isotopes in ice cores are used as a climate proxy for local temperature and regional atmospheric circulation as well as evaporative conditions in moisture source regions. Traditional measurements of water isotopes have been achieved using magnetic sector isotope ratio mass spectrometry (IRMS). However, a number of recent studies have shown that laser absorption spectrometry (LAS) performs as well or better than IRMS. The new LAS technology has been combined with continuous-flow analysis (CFA) to improve data density and sample throughput in numerous prior ice coring projects. Here, we present a comparable semi-automated LAS-CFA system for measuring high-resolution water isotopes of ice cores. We outline new methods for partitioning both system precision and mixing length into liquid and vapor components - useful measures for defining and improving the overall performance of the system. Critically, these methods take into account the uncertainty of depth registration that is not present in IRMS nor fully accounted for in other CFA studies. These analyses are achieved using samples from a South Pole firn core, a Greenland ice core, and the West Antarctic Ice Sheet (WAIS) Divide ice core. The measurement system utilizes a 16-position carousel contained in a freezer to consecutively deliver ~1 m × 1.3 cm² ice sticks to a temperature-controlled melt head, where the ice is converted to a continuous liquid stream and eventually vaporized using a concentric nebulizer for isotopic analysis. An integrated delivery system for water isotope standards is used for calibration to the Vienna Standard Mean Ocean Water (VSMOW) scale, and depth registration is achieved using a precise overhead laser distance device with an uncertainty of ±0.2 mm. As an added check on the system, we perform inter-lab LAS comparisons using WAIS Divide ice samples, a corroboratory step not taken in prior CFA studies. The overall results are important for substantiating data obtained from LAS …
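
    Calibration to the VSMOW scale, as mentioned above, is typically a linear correction derived from working standards of known composition bracketing the samples; the sketch below shows a simple two-point version, where the standard and sample delta values are illustrative placeholders:

```python
# Sketch: two-point linear calibration of raw delta-18O values to the
# VSMOW scale using bracketing working standards of known composition.
import numpy as np

# Hypothetical working standards: (measured permil, accepted permil vs VSMOW)
std_measured = np.array([-10.2, -34.1])
std_accepted = np.array([-9.8, -33.5])

slope, intercept = np.polyfit(std_measured, std_accepted, 1)

raw_samples = np.array([-15.4, -22.7, -28.9])   # raw instrument deltas (permil)
calibrated = slope * raw_samples + intercept    # deltas on the VSMOW scale
print(np.round(calibrated, 2))
```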

  16. An Improved PLL for Tune Measurements

    CERN Document Server

    Berrig, O

    2003-01-01

    The key element determining the dynamic performance of such a PLL is the phase detector between the beam oscillation and the internal oscillation. Most circuits use a quadrature phase detector, in which the high-frequency carrier at twice the excitation frequency is attenuated by a low-pass circuit. The remaining ripple of this component limits the bandwidth/noise performance of the PLL. In this paper we propose an alternative solution for the filter, namely an adaptive notch filter. We explain the design considerations in detail, along with the resulting improvements in PLL bandwidth and/or noise figure.
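
    To suppress that ripple, the carrier at twice the excitation frequency can be removed with a narrow notch instead of a broad low-pass; the sketch below uses a fixed notch from scipy, where the sampling rate, tune frequency and Q are illustrative assumptions - the paper's filter additionally adapts the notch frequency as the tune moves:

```python
# Sketch: notch out the 2*f_exc ripple from a quadrature phase detector
# output before it reaches the PLL loop filter.
import numpy as np
from scipy.signal import iirnotch, lfilter

fs = 10_000.0            # detector output sampling rate in Hz (assumed)
f_exc = 440.0            # current excitation (tune) frequency in Hz (assumed)

b, a = iirnotch(w0=2 * f_exc, Q=30.0, fs=fs)   # notch centred on 2*f_exc

t = np.arange(0, 1, 1 / fs)
phase_error = 0.1 * np.sin(2 * np.pi * 3.0 * t)    # slow, wanted signal
ripple = 0.5 * np.sin(2 * np.pi * 2 * f_exc * t)   # unwanted carrier ripple
cleaned = lfilter(b, a, phase_error + ripple)      # ripple strongly attenuated

# In an adaptive version, w0 would be re-computed from the PLL's own
# frequency estimate on every update, keeping the notch on the moving carrier.
```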

  17. Improving Emergency Department Door to Doctor Time and Process Reliability: A Successful Implementation of Lean Methodology.

    Science.gov (United States)

    El Sayed, Mazen J; El-Eid, Ghada R; Saliba, Miriam; Jabbour, Rima; Hitti, Eveline A

    2015-10-01

    The aim of this study is to determine the effectiveness of lean management methods in improving emergency department door-to-doctor times at a tertiary care hospital. We performed a before-and-after study at an academic urban emergency department with 49,000 annual visits, after implementing a series of lean-driven interventions over a 20-month period. The primary outcome was mean door-to-doctor time and the secondary outcome was length of stay of both admitted and discharged patients. A convenience sample from the preintervention phase (February 2012) was compared to another from the postintervention phase (mid-October to mid-November 2013). Individuals control charts were used to assess process stability. Postintervention there was a statistically significant decrease in the mean door-to-doctor time (40.0 ± 53.44 minutes vs 25.3 ± 15.93 minutes, P < 0.001). The postintervention process was more statistically in control, with a drop in the upper control limit from 148.8 to 72.9 minutes. Length of stay of admitted and discharged patients dropped from 2.6 to 2.0 hours and from 9.0 to 5.5 hours, respectively. All other variables, including daily emergency department visit volumes, hospital occupancy, and left-without-being-seen rates, were comparable. Lean change management techniques can thus be effective in reducing door-to-doctor time in the emergency department and improving process reliability.
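
    The individuals (XmR) control charts used above derive their limits from the average moving range; a minimal sketch of that computation follows, with placeholder data rather than the study's times:

```python
# Sketch: individuals (XmR) control chart limits for door-to-doctor times,
# using the standard moving-range estimate of sigma (d2 = 1.128 for n = 2).
import numpy as np

times = np.random.normal(loc=25.3, scale=8.0, size=120)  # minutes (placeholder)

center = times.mean()
mr_bar = np.abs(np.diff(times)).mean()   # average moving range
sigma_hat = mr_bar / 1.128

ucl = center + 3 * sigma_hat             # upper control limit
lcl = max(center - 3 * sigma_hat, 0.0)   # lower limit, floored at zero

out_of_control = np.flatnonzero((times > ucl) | (times < lcl))
print(f"UCL = {ucl:.1f} min, LCL = {lcl:.1f} min, signals at {out_of_control}")
```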

  18. Development of methodology for measurements of residual stresses in welded joint based on displacement of points in a coordinated table

    Directory of Open Access Journals (Sweden)

    Aníbal Veras Siqueira Filho

    2013-04-01

    Full Text Available Residual stresses in a welded joint of ASTM A131 grade AH32 steel were measured both by X-ray diffraction and by the displacements of reference points measured on a coordinate measuring machine before and after heat treatment. For all tests, the welding was performed with Shielded Metal Arc Welding, in the vertical-up position, by a certified welder. After welding, some specimens were marked with small, evenly spaced holes that were mapped on a coordinate measuring machine. After marking, the samples were subjected to heat treatment at temperatures near recrystallization. After heat treatment, the samples were measured again on the coordinate measuring machine to evaluate the displacements of the points produced by recrystallization. In parallel, residual stress measurements were made by X-ray diffraction for validation of this new methodology. The results obtained by X-ray diffraction and by the coordinate measuring machine showed a good correlation between the two measurement methodologies employed.

  19. Evaluation Methodology for Surface Engineering Techniques to Improve Powertrain Efficiency in Military Vehicles

    Science.gov (United States)

    2012-06-01

    efficiency within military vehicle drivetrains. This report details the experimental methodology developed by the U.S. Army Research Laboratory to... experiments are conducted on a subsystem component of a vehicle drivetrain. A parallel basic research thrust includes computational modeling of... of research efforts at the basic and applied research level to advance theoretical and practical understanding of drivetrain component efficiencies

  20. METHODOLOGICAL UNCERTAINTIES OF A NEW PROFESSIONAL TEACHER’S STANDARD AND PROPOSALS FOR ITS IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Victor G. Gorb

    2015-01-01

    Full Text Available The aim of the investigation is to determine ways of overcoming the methodological uncertainties contained in the teacher's (tutor's) professional standard. Methods. The system-activity approach to the development of the teacher's activity was used by the author. Results. On the basis of the system-activity approach, the author has developed the structure and content of a teacher's personnel administration plan that makes it possible to meet the demands of the teacher's (tutor's) professional standard and to resolve its main methodological uncertainties. Scientific novelty. The author presents an original systematic and activity-based methodology for developing personnel administration plans for educational-sphere personnel, in order to enhance the pedagogical potential of educational activity and to create organizational arrangements for its effectiveness, quality and social efficiency. Practical significance. The proposed system-activity methodology can be used in the modernization of personnel administration plans for teachers within the context of implementing the teacher's professional standard.

  1. Improvement an enterprises marketing performance measurement system

    Directory of Open Access Journals (Sweden)

    Stanković Ljiljana

    2013-01-01

    Full Text Available The conditions in which modern enterprises do business are more and more complex. The complexity of the business environment is caused by the activities of external and internal factors, which imposes the need for a shift in management focus. One key shift relates to the need to adapt and develop new business-performance evaluation systems. Evaluating the contribution of marketing to business performance is very important, but a complex task as well. Marketing theory and practice indicate the need to develop adequate standards and systems for evaluating the efficiency of marketing decisions. A better understanding of the marketing standards and methods that managers use is an important factor affecting the efficiency of strategic decision-making. The paper presents the results of research into the ways in which managers perceive and apply marketing performance measures. Data received through the field research sample enabled consideration of managers' attitudes to practical ways of implementing marketing performance measurement, and identification of the measures that managers report as most used in business practice.

  2. [Impact of Lean methodology to improve care processes and levels of satisfaction in patient care in a clinical laboratory].

    Science.gov (United States)

    Morón-Castañeda, L H; Useche-Bernal, A; Morales-Reyes, O L; Mojica-Figueroa, I L; Palacios-Carlos, A; Ardila-Gómez, C E; Parra-Ardila, M V; Martínez-Nieto, O; Sarmiento-Echeverri, N; Rodríguez, C A; Alvarado-Heine, C; Isaza-Ruget, M A

    2015-01-01

    The application of the Lean methodology in health institutions is an effective tool for improving capacity and workflow, as well as for increasing the satisfaction of patients and employees. The aim was to optimise outpatient care times in a clinical laboratory by implementing a methodology based on the organisation of operational procedures, in order to improve user satisfaction and reduce the number of complaints about delays in care. A quasi-experimental before-and-after study was conducted between October 2011 and September 2012. XBar and S charts were used to monitor mean service times and their standard deviation. User satisfaction was assessed using service questionnaires. A reduction of 17 minutes was observed in the time from a patient's arrival to leaving the laboratory, and a 60% decrease in complaints about delays in care. Despite high staff turnover and a 38% increase in the number of patients seen, a culture of empowerment and continuous improvement was acquired, as well as greater efficiency and productivity in the care process, reflected in standards being maintained 12 months after implementation. Lean is a viable methodology for clinical laboratory procedures, improving their efficiency and effectiveness. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.

  3. P-band Radar Retrieval of Root-Zone Soil Moisture: AirMOSS Methodology, Progress, and Improvements

    Science.gov (United States)

    Moghaddam, M.; Tabatabaeenejad, A.; Chen, R.

    2015-12-01

    The AirMOSS mission seeks to improve the estimates of the North American Net Ecosystem Exchange (NEE) by providing high-resolution observations of the root-zone soil moisture (RZSM) over regions representative of the major North American biomes. The radar snapshots are used to generate estimates of RZSM. To retrieve RZSM, we use a discrete scattering model integrated with layered-soil scattering models. The soil moisture profile is represented as a quadratic function in the form of az² + bz + c, where z is the depth and a, b, and c are the coefficients to be retrieved. The ancillary data necessary to characterize a pixel are available from various databases. We apply the retrieval method to the radar data acquired over AirMOSS sites including Canada's BERMS, Walnut Gulch in Arizona, MOISST in Oklahoma, Tonzi Ranch in California, and Metolius in Oregon, USA. The estimated soil moisture profile is validated against in-situ soil moisture measurements. We have continued to improve the accuracy of retrievals as the delivery of the RZSM products has progressed since 2012. For example, the 'threshold depth' (the depth up to which the retrieval is mathematically valid) has been reduced from 100 cm to 50 cm after the retrieval accuracy was assessed both mathematically and physically. Moreover, we progressively change the implementation of the inversion code and its subroutines as we find more accurate and efficient ways of performing the mathematical operations. The latest AirMOSS results (including soil moisture maps, validation plots, and scatter plots) as well as all improvements applied to the retrieval algorithm, including the one mentioned above, will be reported in the talk, following a brief description of the retrieval methodology. Fig. 1 shows a validation plot for a flight over Tonzi Ranch from September 2014 (a) and a scatter plot for various threshold depths using 2012 and 2013 data.
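
    The quadratic profile representation is simple to work with; the sketch below (coefficient values invented, not AirMOSS results) evaluates m(z) = az² + bz + c down to the 50 cm threshold depth mentioned in the abstract:

        import numpy as np

        # Hypothetical retrieved coefficients of the quadratic profile
        # m(z) = a*z**2 + b*z + c, with z the depth in meters (values invented).
        a, b, c = -0.08, 0.12, 0.18

        z = np.linspace(0.0, 0.5, 6)     # depths down to the 50 cm threshold depth
        moisture = a * z**2 + b * z + c  # volumetric soil moisture profile
        for depth, m in zip(z, moisture):
            print(f"z = {depth:4.2f} m : {m:5.3f} m3/m3")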

  4. Site-conditions map for Portugal based on VS measurements: methodology and final model

    Science.gov (United States)

    Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, carlos

    2017-04-01

    In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (VS) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies involved in characterizing the Vs structure at the sites in the database include seismic refraction, multichannel analysis of seismic waves and refraction microtremor. Invasive measurements were performed in selected locations in order to compare the Vs profiles obtained from both invasive and non-invasive techniques. In general there was good agreement in the subsurface Vs30 structure obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50.000 and 1:500.000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is based on a three-step process that includes defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether or not the differences in the distributions of Vs30 are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling, and therefore a declustering algorithm was applied. The final model includes three geological units: 1) Igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations, and 3) Holocene formations. The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and

  5. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory; Williams, Brian [Los Alamos National Laboratory; Mc Clure, Patrick [Los Alamos National Laboratory; Nelson, Ralph A [IDAHO NATIONAL LAB

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for design and for understanding nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models leading to predictive simulations. The cost-saving goals of these programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale multi-physics models introduces complexities in the validation of predictive tools. Traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow for extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost

  6. Extremely low frequency electromagnetic field measurements at the Hylaty station and methodology of signal analysis

    Science.gov (United States)

    Kulak, Andrzej; Kubisz, Jerzy; Klucjasz, Slawomir; Michalec, Adam; Mlynarczyk, Janusz; Nieckarz, Zenon; Ostrowski, Michal; Zieba, Stanislaw

    2014-06-01

    We present the Hylaty geophysical station, a high-sensitivity and low-noise facility for extremely low frequency (ELF, 0.03-300 Hz) electromagnetic field measurements, which enables a variety of geophysical and climatological research related to atmospheric, ionospheric, magnetospheric, and space weather physics. The first systematic observations of ELF electromagnetic fields at the Jagiellonian University were undertaken in 1994. At the beginning the measurements were carried out sporadically, during expeditions to sparsely populated areas of the Bieszczady Mountains in the southeast of Poland. In 2004, an automatic Hylaty ELF station was built there, in a very low electromagnetic noise environment, which enabled continuous recording of the magnetic field components of the ELF electromagnetic field in the frequency range below 60 Hz. In 2013, after 8 years of successful operation, the station was upgraded by extending its frequency range up to 300 Hz. In this paper we show the station's technical setup, and how it has changed over the years. We discuss the design of ELF equipment, including antennas, receivers, the time control circuit, and power supply, as well as antenna and receiver calibration. We also discuss the methodology we developed for observations of the Schumann resonance and wideband observations of ELF field pulses. We provide examples of various kinds of signals recorded at the station.

  7. Enabling Mobile Communications for the Needy: Affordability Methodology, and Approaches to Requalify Universal Service Measures

    Directory of Open Access Journals (Sweden)

    Louis-Francois PAU

    2009-01-01

    Full Text Available This paper links communications and media usage to social and household economics boundaries. It highlights that in present-day society, communications and media are a necessity, but not always affordable, and that they furthermore open up addictive behaviors which raise additional financial and social risks. A simple and efficient methodology compatible with state-of-the-art social and communications business statistics is developed, which produces the residual communications and media affordability budget and ultimately the value-at-risk in terms of usage and tariffs. Sensitivity analysis provides valuable information on bottom-up communications and media adoption on the basis of affordability. This approach differs from the regulated but often ineffective Universal service obligation, which instead of catering for individual needs mostly addresses macro-measures helping geographical access coverage (e.g., in rural areas). It is proposed to requalify the Universal service obligations on operators into concrete measures, allowing, with unchanged funding, the needy to adopt mobile services based on their affordability constraints by bridging the gap to a standard tariff. Case data are surveyed from various countries. ICT policy recommendations are made to support widespread and socially responsible communications access.

  8. Measuring Effectiveness in Digital Game-Based Learning: A Methodological Review.

    Directory of Open Access Journals (Sweden)

    Anissa All

    2014-06-01

    Full Text Available In recent years, a growing number of studies have been conducted into the effectiveness of digital game-based learning (DGBL). Despite this growing interest, there is a lack of sound empirical evidence on the effectiveness of DGBL due to different outcome measures for assessing effectiveness, varying methods of data collection and inconclusive or difficult-to-interpret results. This has resulted in a need for an overarching methodology for assessing the effectiveness of DGBL. The present study took a first step in this direction by mapping current methods used for assessing the effectiveness of DGBL. Results showed that, at present, comparing results across studies, and thus looking at the effectiveness of DGBL on a more general level, is problematic due to diversity in, and suboptimal, study designs. Variety in study design relates to three issues, namely the different activities implemented in the control groups, the different measures for assessing the effectiveness of DGBL, and the use of different statistical techniques for analyzing learning outcomes. Suboptimal study designs are the result of variables confounding study results. Possible confounds brought forward in this review are elements that are added to the game as part of the educational intervention (e.g., required reading, debriefing session), instructor influences, and practice effects when using the same test pre- and post-intervention. Lastly, incomplete information on the study design impedes replication of studies and thus falsification of study results.

  9. Analysis and methodology for measuring oxygen concentration in liquid sodium with a plugging meter

    Energy Technology Data Exchange (ETDEWEB)

    Nollet, B. K.; Hvasta, M.; Anderson, M. [Univ. of Wisconsin-Madison, 1500 Engineering Dr., Madison, WI 53706 (United States)

    2012-07-01

    Oxygen concentration in liquid sodium is a critical measurement in assessing the potential for corrosion damage in sodium-cooled fast reactors (SFRs). There has been little recent work on sodium reactors and oxygen detection; thus, the technical expertise dealing with oxygen measurements within sodium is no longer readily available in the U.S. Two methods of oxygen detection that have been investigated are the plugging meter and the galvanic cell. One of the overall goals of the Univ. of Wisconsin's sodium research program is to develop an affordable, reliable galvanic cell oxygen sensor. Accordingly, attention must first be dedicated to a well-known standard known as a plugging meter. Therefore, a sodium loop has been constructed on campus in an effort to develop the plugging meter technique and gain experience working with liquid metal. The loop contains both a galvanic cell test section and a plugging meter test section. Consistent plugging results have been achieved below 20 [wppm], and a detailed process for achieving effective plugging has been developed. This paper will focus both on an accurate methodology for obtaining oxygen concentrations from a plugging meter, and on how to easily control the oxygen concentration of sodium in a test loop. Details of the design, materials, manufacturing, and operation will be presented. Data interpretation will also be discussed, since a modern discussion of plugging data interpretation does not currently exist. (authors)

  10. An AFM-based methodology for measuring axial and radial error motions of spindles

    Science.gov (United States)

    Geng, Yanquan; Zhao, Xuesen; Yan, Yongda; Hu, Zhenjiang

    2014-05-01

    This paper presents a novel atomic force microscopy (AFM)-based methodology for the measurement of the axial and radial error motions of a high-precision spindle. Based on a modified commercial AFM system, the AFM tip is employed as a cutting tool by which nano-grooves are scratched on a flat surface with the rotation of the spindle. By extracting the radial motion data of the spindle from the scratched nano-grooves, the radial error motion of the spindle can be calculated after subtracting the tilting errors from the original measurement data. By recording the variation of the PZT displacement in the Z direction in AFM tapping mode during the spindle rotation, the axial error motion of the spindle can be obtained. Moreover, the effects of the nano-scratching parameters on the scratched grooves, the tilting-error removal method for both conditions, and the method of data extraction from the scratched groove depth are studied in detail. An axial error motion of 124 nm and a radial error motion of 279 nm of a commercial high-precision air-bearing spindle were obtained with this novel method, values comparable with those provided by the manufacturer, verifying the method. This approach does not need an expensive standard part as in most conventional measurement approaches. Moreover, the axial and radial error motions of the spindle can both be obtained, indicating that this is a potential means of measuring the error motions of the high-precision moving parts of ultra-precision machine tools in the future.
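
    The tilt-removal step can be pictured with a minimal detrending sketch; the sinusoidal "groove" signal and noise level below are invented stand-ins, and a simple least-squares line stands in for the paper's tilting-error removal:

        import numpy as np

        # Hypothetical radial motion samples (nm) extracted from a scratched nano-groove;
        # the linear trend stands in for the tilting error described in the abstract.
        rng = np.random.default_rng(42)
        angle = np.linspace(0, 2 * np.pi, 200)
        raw = 140 * np.sin(3 * angle) + 50 * angle + rng.normal(0, 5, angle.size)

        slope, intercept = np.polyfit(angle, raw, 1)   # least-squares linear tilt
        detrended = raw - (slope * angle + intercept)  # radial error after tilt removal
        print("peak-to-valley radial error motion: %.0f nm"
              % (detrended.max() - detrended.min()))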

  11. Characterization methodology for Difficult To Measure nuclides in the Type B rad waste from the ITER

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Youngyong; Hong, Kwonpyo; Oh, Wanho; Kang, Munja; Na, Byungchan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-10-15

    In general, it is not possible to directly detect beta rays from rad waste in field measurements due to their extremely low penetration through materials. Only lab-scale measurements with a proper shield and detecting system are available for nondestructive assay. However, the disposal sites in many countries require the determination of inventories of the difficult-to-measure (DTM) nuclides in the waste before their acceptance for disposal. Many sites that generate rad waste are thus adopting indirect methods to characterize the DTM nuclides in the rad waste to be disposed. The rad waste from the operation of the international thermonuclear experimental reactor (ITER) will be sent to the hot cell building (HCB) after being packed into baskets; there it will be treated into the disposal form as well as characterized through nondestructive assay. The rad waste from the ITER is characterized by high-density materials, such as steel, copper, and tungsten, accounting for the main substance, and by many nuclides produced by neutron irradiation, including the DTM nuclides. Therefore, the ITER also faces the problem of characterizing DTM nuclides. The scaling factor, which expresses the radiological relationship between the gamma and the beta nuclides, is one of the indirect measurements used to characterize the DTM nuclides in the waste. The methodology for applying the scaling factor to the characterization of the Type B rad waste from the ITER is presented in this paper. There are several types of in-vessel components (IVCs) in a Tokamak which will be activated by neutrons, and they will be divided into different types of rad waste such as the divertor cassette, blanket modules, and port plugs. In this paper, the characterization of DTM nuclides focuses on the rad waste from a blanket module out of the IVCs.
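
    A minimal sketch of the scaling-factor idea (all activities are invented numbers; Co-60 and Ni-63 are merely typical examples of a key gamma nuclide and a DTM nuclide, not necessarily those used for ITER waste):

        # A scaling factor is derived from paired lab measurements of a key gamma
        # nuclide (e.g. Co-60) and a difficult-to-measure nuclide (e.g. Ni-63),
        # then applied to routine nondestructive gamma assays of waste packages.
        paired_samples = [(1.0e4, 2.1e3), (4.0e4, 8.8e3), (2.5e4, 5.0e3)]  # (A_key, A_DTM) in Bq

        sf = sum(a_dtm / a_key for a_key, a_dtm in paired_samples) / len(paired_samples)

        measured_key_activity = 3.2e4  # Bq, from a nondestructive gamma assay (invented)
        estimated_dtm_activity = sf * measured_key_activity
        print(f"scaling factor = {sf:.3f}, "
              f"estimated DTM activity = {estimated_dtm_activity:.2e} Bq")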

  12. Double Chooz Improved Multi-Detector Measurements

    CERN Document Server

    CERN. Geneva

    2016-01-01

    The Double Chooz experiment (DC) is a reactor neutrino oscillation experiment running at the Chooz nuclear power plant (2 reactors) in France. In 2011, DC first reported indication of non-zero θ13 with the far detector (FD) located at the maximum of the oscillation effects (i.e. disappearance), thus challenging the CHOOZ non-observation limit. A robust observation of θ13 followed in 2012 by the Daya Bay experiment with a multiple-detector configuration. Since 2015, DC has run in a multi-detector configuration, thus strongly reducing the impact of several otherwise dominating systematics. DC's unique, almost "iso-flux" site allows the near detector (ND) to become a direct, accurate non-oscillation reference for the FD. Our first multi-detector results at MORIOND-2016 showed an intriguing deviation of θ13 with respect to the world average. We will address this issue in this seminar. The combined "reactor-θ13" measurement is expected to ...

  13. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist

    OpenAIRE

    Terwee, C. B.; Mokkink, L. B.; Knol, D. L.; Ostelo, R. W. J. G.; Bouter, L. M.; de Vet, E.

    2011-01-01

    Background The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5–18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist to calculate quality scores per measurement property when using the checklist in systematic reviews of measurement properties. Methods The scoring system was devel...

  14. Use of Balanced Scorecard Methodology for Performance Measurement of the Health Extension Program in Ethiopia.

    Science.gov (United States)

    Teklehaimanot, Hailay D; Teklehaimanot, Awash; Tedella, Aregawi A; Abdella, Mustofa

    2016-05-04

    In 2004, Ethiopia introduced a community-based Health Extension Program to deliver basic and essential health services. We developed a comprehensive performance scoring methodology to assess the performance of the program. A balanced scorecard with six domains and 32 indicators was developed. Data collected from 1,014 service providers, 433 health facilities, and 10,068 community members sampled from 298 villages were used to generate weighted national, regional, and agroecological zone scores for each indicator. The national median indicator scores ranged from 37% to 98% with poor performance in commodity availability, workforce motivation, referral linkage, infection prevention, and quality of care. Indicator scores showed significant difference by region (P < 0.001). Regional performance varied across indicators suggesting that each region had specific areas of strength and deficiency, with Tigray and the Southern Nations, Nationalities and Peoples Region being the best performers while the mainly pastoral regions of Gambela, Afar, and Benishangul-Gumuz were the worst. The findings of this study suggest the need for strategies aimed at improving specific elements of the program and its performance in specific regions to achieve quality and equitable health services.
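
    The weighted-score aggregation behind such a scorecard can be illustrated with a toy calculation (the indicator names are taken from the abstract, but the scores and weights are invented; the study's actual weighting scheme may differ):

        # Illustrative weighted composite of balanced-scorecard indicators.
        indicators = {
            "commodity availability": (0.37, 0.25),  # (score, weight) - invented
            "workforce motivation":   (0.55, 0.25),
            "referral linkage":       (0.62, 0.20),
            "quality of care":        (0.70, 0.30),
        }
        composite = (sum(score * weight for score, weight in indicators.values())
                     / sum(weight for _, weight in indicators.values()))
        print(f"composite domain score: {composite:.2%}")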

  15. An Efficient Identification Methodology for Improved Access to Music Heritage Collections

    Directory of Open Access Journals (Sweden)

    Nicola Montecchio

    2012-04-01

    Full Text Available A comprehensive methodology for automatic music identification is presented. The main application of the proposed approach is to provide tools to enrich and validate the descriptors of recordings digitized by a sound archive institution. Experimentation has been carried out on three different datasets, including a collection of digitized vinyl discs, although the methodology is not linked to a particular recording carrier. Automatic identification allows a music digital library to retrieve metadata about music works even if the information was incomplete or missing at the time of the acquisition. Automatic segmentation of digitized material is obtained as a byproduct of identification, allowing the music digital library to grant access to individual tracks, even if discs are digitized using a single file for a complete disc side. Results show that the approach is both efficient and effective.

  16. Improving the uncertainty of photomask linewidth measurements

    Science.gov (United States)

    Pedulla, J. M.; Potzick, James; Silver, Richard M.

    2004-05-01

    The National Institute of Standards and Technology (NIST) is currently developing a photomask linewidth standard (SRM 2059) with a lower expected uncertainty of calibration than the previous NIST standards (SRMs 473, 475, 476). In calibrating these standards, optical simulation modeling has been used to predict the microscope image intensity profiles, which are then compared to the experimental profiles to determine the certified linewidths. Consequently, the total uncertainty in the linewidth calibration is a result of uncertainty components from the optical simulation modeling and uncertainty due to experimental errors or approximations (e.g., tool imaging errors and material characterization errors). Errors of approximation in the simulation model and uncertainty in the parameters used in the model can contribute a large component to the total linewidth uncertainty. We have studied the effects of model parameter variation on measurement uncertainty using several different optical simulation programs that utilize different mathematical techniques. We have also evaluated the effects of chrome edge runout and varying indices of refraction on the linewidth images. There are several experimental parameters that are not ordinarily included in the modeling simulation. For example, the modeling programs assume a uniform illuminating field (e.g., Koehler illumination), ideal optics and perfect optical alignment. In practice, determining whether Koehler illumination has been achieved is difficult, and the optical components and their alignments are never ideal. We will present some techniques for evaluating Koehler illumination and methods to compensate for scattered (flare) light. Any such experimental elements, that are assumed accurate in the modeling, may actually present significant components to the uncertainty and need to be quantitatively estimated. The present state of metrology does not permit the absolute calibration of linewidth standards to the level of

  17. An Efficient and Improved Methodology for the Screening of Industrially Valuable Xylano-Pectino-Cellulolytic Microbes

    OpenAIRE

    2015-01-01

    Xylano-pectino-cellulolytic enzymes are valuable enzymes of the industrial sector. In our earlier study, we reported a novel and cost-effective methodology for the qualitative screening of cellulase-free xylano-pectinolytic microorganisms by replacing the commercial, highly expensive substrates with agricultural residues, but the microorganisms with xylanolytic, pectinolytic, cellulolytic, xylano-pectinolytic, xylano-cellulolytic, pectino-cellulolytic, and xylano-pectino-cellulolytic pot...

  18. Fine coal measurement needs for improved control

    Energy Technology Data Exchange (ETDEWEB)

    Firth, B.; O' Brien, M. [CSIRO, Brisbane, Qld. (Australia). Division of Energy Technology

    2010-07-01

    The monitoring and management of fine coal circuits in coal preparation plants is limited in current practice. As part of the Australian Coal Association Research Program (ACARP) Intelligent Plant Project (C11069), the relationships between the main operational and control factors for the unit operations and the circuit and the performance indicators have been identified. The unit operations examined included desliming (hydrocyclones and sieve bends), small coal cleaning (spirals and hydraulic separators), flotation, and dewatering (vacuum filters, centrifuges, and thickeners). These relationships were then used to assist in the identification of the important parameters to be measured and the preferred level of accuracy required to be useful. An important issue was the interconnection between the various unit operations and the potential impact of an upstream problem on the subsequent performance of downstream units. Analysis with the relationships showed that the flow rate of respective feed slurries and the solids content were found to be significant variables. This article will discuss this analysis and provide some case studies.

  19. Current psychometric and methodological issues in the measurement of overgeneral autobiographical memory.

    Science.gov (United States)

    Griffith, James W; Sumner, Jennifer A; Raes, Filip; Barnhofer, Thorsten; Debeer, Elise; Hermans, Dirk

    2012-12-01

    Autobiographical memory is a multifaceted construct that is related to psychopathology and other difficulties in functioning. Across many studies, a variety of methods have been used to study autobiographical memory. The relationship between overgeneral autobiographical memory (OGM) and psychopathology has been of particular interest, and many studies of this cognitive phenomenon rely on the Autobiographical Memory Test (AMT) to assess it. In this paper, we examine several methodological approaches to studying autobiographical memory, and focus primarily on methodological and psychometric considerations in OGM research. We pay particular attention to what is known about the reliability, validity, and methodological variations of the AMT. The AMT has adequate psychometric properties, but there is great variability in methodology across studies that use it. Methodological recommendations and suggestions for future studies are presented.

  20. Improved sapflow methodology reveals considerable night-time ozone uptake by Mediterranean species

    Directory of Open Access Journals (Sweden)

    S. Mereu

    2009-12-01

    Full Text Available Due to the evident impact of tropospheric ozone on plant productivity, an accurate ozone risk assessment for vegetation has become an issue. There is growing evidence that ozone stomatal uptake may also take place at night and that night-time uptake may be more damaging than diurnal uptake. Estimation of night-time uptake in the field is complicated because of instrumental difficulties. Eddy covariance technology is not always reliable because of the low turbulence at night. Leaf-level porometry is unreliable at relative humidity above 70%, which often occurs at night. Improved sap-flow technology allows the estimation of the slow flows that usually take place at night and hence may be, at present, the most trustworthy technology for measuring night-time transpiration and thereby deriving canopy stomatal conductance and ozone uptake at night. Based on micrometeorological data and the sap flow of three Mediterranean woody species, the night-time ozone uptake of these species was evaluated during a summer season as drought increased. Night-time ozone uptake was from 10% to 18% of the total daily uptake when plants were exposed to a weak drought, but increased up to 24% as the drought became more pronounced. The percentage increase is due to a stronger reduction of diurnal stomatal conductance than of night-time stomatal conductance.

  1. A methodology to determine the level of automation to improve the production process and reduce the ergonomics index

    Science.gov (United States)

    Chan-Amaya, Alejandro; Anaya-Pérez, María Elena; Benítez-Baltazar, Víctor Hugo

    2017-08-01

    Companies are constantly looking for improvements in productivity to increase their competitiveness. The use of automation technologies is a tool that has been proven effective in achieving this. Some companies are not familiar with the process of acquiring automation technologies; they therefore abstain from investing and thereby miss the opportunity to take advantage of them. The present document proposes a methodology to determine the level of automation appropriate for the production process, avoiding unnecessary automation and improving production while taking the ergonomics factor into consideration.

  2. Improving timeliness and efficiency in the referral process for safety net providers: application of the Lean Six Sigma methodology.

    Science.gov (United States)

    Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A

    2010-01-01

    Designated primary care clinics largely serve low-income and uninsured patients who present a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly the access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.

  3. THE UNCERTAINTIES OF ENVIRONMENTAL PARAMETER MEASUREMENTS AS TOOLS OF MEASUREMENT QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Miroslav Badida

    2008-06-01

    Full Text Available Identification of the uncertainties of noise measurements alongside the declared measured values is unconditionally necessary and is required by legislation. The uncertainty of a measurement expresses all errors that accrue during the measuring. By indicating uncertainties, the measurer documents that the objective value lies, with a certain probability, in the interval bounded by the measurement uncertainty. The paper deals with the methodology of uncertainty calculation for noise measurements in living and working environments, the metal processing industry, and the building materials industry.
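
    The quadrature combination of uncertainty components that underlies such calculations can be sketched as follows (GUM-style; the component values, the measured level, and the coverage factor k = 2 are all invented for illustration, not the paper's budget):

        import math

        # Combine independent standard uncertainties of a noise measurement (dB)
        # in quadrature, then expand with coverage factor k = 2 (~95 % level).
        components = {"instrument": 0.5, "calibration": 0.3,
                      "position": 0.4, "weather": 0.2}

        u_c = math.sqrt(sum(u**2 for u in components.values()))  # combined std. uncertainty
        U = 2.0 * u_c                                            # expanded uncertainty, k = 2
        print(f"L_Aeq = 68.4 dB +/- {U:.1f} dB (k = 2)")  # 68.4 dB is a made-up value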

  4. An Improved Method of Measuring the Helicity Value of the Neutrino

    Institute of Scientific and Technical Information of China (English)

    Ye Zipiao

    2000-01-01

    Goldhaber's experimental result is analyzed in this paper. The improved method of measuring the helicity value of the neutrino put forward here can greatly enhance the accuracy of the measurement result and evidently reduce the experimental error.

  5. IMPROVED ACCURACY AND ROUGHNESS MEASURES FOR ROUGH SETS

    Institute of Scientific and Technical Information of China (English)

    Zhou Yuming; Xu Baowen

    2002-01-01

    Accuracy and roughness, proposed by Pawlak (1982), might draw conclusions inconsistent with our intuition in some cases. This letter analyzes the limitations of these measures and proposes improved accuracy and roughness measures based on information theory.
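
    For context, Pawlak's original measures are easy to state; the toy sketch below computes them for an invented partition (the letter's improved information-theoretic measures are not reproduced here):

        # Pawlak's accuracy and roughness for a rough set on a toy universe.
        classes = [{1, 2}, {3, 4}, {5, 6}]  # equivalence classes (invented partition)
        target = {1, 2, 3}                  # the set to be approximated

        lower = set().union(*[c for c in classes if c <= target])  # certainly inside
        upper = set().union(*[c for c in classes if c & target])   # possibly inside
        accuracy = len(lower) / len(upper)  # Pawlak (1982): |lower| / |upper|
        roughness = 1 - accuracy
        print(f"accuracy = {accuracy:.2f}, roughness = {roughness:.2f}")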

  6. The quality infrastructure measuring, analyzing, and improving library services

    CERN Document Server

    Murphy, Sarah Anne

    2013-01-01

    Summarizing specific tools for measuring service quality alongside tips for using these tools most effectively, this book helps libraries of all kinds take a programmatic approach to measuring, analyzing, and improving library services.

  7. Thermal-Diffusivity Measurements of Mexican Citrus Essential Oils Using Photoacoustic Methodology in the Transmission Configuration

    Science.gov (United States)

    Muñoz, G. A. López; González, R. F. López; López, J. A. Balderas; Martínez-Pérez, L.

    2011-05-01

    Photoacoustic methodology in the transmission configuration (PMTC) was used to study the thermophysical properties of Mexican citrus essential oils and their relation to composition, demonstrating the viability of using photothermal techniques for quality control and for the authentication of oils and detection of their adulteration. Linear relations for the amplitude (on a semi-log scale) and phase, as functions of the sample's thickness, were obtained through a theoretical model fit to the experimental data for thermal-diffusivity measurements in Mexican orange, pink grapefruit, mandarin, and lime type A centrifuged essential oils, and in Mexican distilled lime essential oil. Gas chromatography for distilled lime essential oil and centrifuged lime essential oil type A is reported to complement the study. Experimental results showed close thermal-diffusivity values among the Mexican citrus essential oils obtained by centrifugation, but a significant difference between this physical property for distilled lime oil and the corresponding value obtained by centrifugation, which is due to the different chemical compositions resulting from the extraction processes.
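
    A minimal sketch of how thermal diffusivity follows from such semi-log amplitude-versus-thickness data, assuming the standard thermally-thick relation in which the log-amplitude slope equals -sqrt(pi*f/alpha); the frequency and diffusivity values below are invented, not the paper's:

        import numpy as np

        f = 1.0                     # modulation frequency, Hz (assumed)
        alpha_true = 9.0e-8         # m^2/s, a typical order of magnitude for oils
        l = np.linspace(50e-6, 300e-6, 12)                  # sample thicknesses, m
        log_amp = -np.sqrt(np.pi * f / alpha_true) * l + 2  # ideal semi-log amplitudes

        slope = np.polyfit(l, log_amp, 1)[0]  # fitted slope of the semi-log line
        alpha_est = np.pi * f / slope**2      # invert slope = -sqrt(pi*f/alpha)
        print(f"retrieved thermal diffusivity: {alpha_est:.2e} m^2/s")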

  8. Encouraging Environmentally Friendlier Cars via Fiscal Measures: General Methodology and Application to Belgium

    Directory of Open Access Journals (Sweden)

    Joeri Van Mierlo

    2013-01-01

    Full Text Available In this paper, a Belgian tax reform plan is elaborated to respond to the EU proposal that requires member states to restructure passenger car taxation systems, preferentially based on the CO2 emissions of the car. A tax orientation on CO2 emissions alone might, however, favour diesel vehicles, characterised by a higher fuel efficiency, whereas they release more polluting emissions (PM and NOx) than comparable gasoline vehicles. This paper introduces a methodology, the Ecoscore, as a potential tax assessment basis. The Ecoscore is based on a well-to-wheel framework and enables a comparison of the environmental burden caused by vehicles with different drive trains and using different fuels. A new proposal for a fixed vehicle taxation system, based on the Ecoscore, is launched. In addition, its impact on the life cycle cost of conventional as well as alternative-fuelled cars is measured in order to examine its steering effect towards a cleaner vehicle choice. The overall result is that current tax distortions can be corrected by restructuring the vehicle registration tax and annual circulation tax, based on the Ecoscore. To stimulate behavioural changes, such a fiscal policy should however be paired with additional policies that act on the other important aspects that determine the car purchase decision.

  9. A METHODOLOGY TO INTEGRATE MAGNETIC RESONANCE AND ACOUSTIC MEASUREMENTS FOR RESERVOIR CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Jorge O. Parra; Chris L. Hackert; Lorna L. Wilson

    2002-09-20

    The work reported herein represents the third year of development efforts on a methodology to interpret magnetic resonance and acoustic measurements for reservoir characterization. In this last phase of the project we characterize a vuggy carbonate aquifer in the Hillsboro Basin, Palm Beach County, South Florida, using two data sets--the first generated by velocity tomography and the second generated by reflection tomography. First, we integrate optical macroscopic (OM), scanning electron microscope (SEM) and x-ray computed tomography (CT) images, as well as petrography, as a first step in characterizing the aquifer pore system. This pore scale integration provides information with which to evaluate nuclear magnetic resonance (NMR) well log signatures for NMR well log calibration, interpret ultrasonic data, and characterize flow units at the field scale between two wells in the aquifer. Saturated and desaturated NMR core measurements estimate the irreducible water in the rock and the variable T2 cut-offs for the NMR well log calibration. These measurements establish empirical equations to extract permeability from NMR well logs. Velocity and NMR-derived permeability and porosity relationships integrated with velocity tomography (based on crosswell seismic measurements recorded between two wells 100 m apart) capture two flow units that are supported with pore scale integration results. Next, we establish a more detailed picture of the complex aquifer pore structures and the critical role they play in water movement, which aids in our ability to characterize not only carbonate aquifers, but reservoirs in general. We analyze petrography and cores to reveal relationships between the rock physical properties that control the compressional and shear wave velocities of the formation. A digital thin section analysis provides the pore size distributions of the rock matrix, which allows us to relate pore structure to permeability and to characterize flow units at the

  10. Snow mapping from MODIS products: the application of an improved cloud removal methodology to the Po river basin

    Science.gov (United States)

    Da Ronco, Pierfrancesco; De Michele, Carlo

    2014-05-01

    Digital snow maps are a powerful tool for reproducing large-scale snow distribution and extension. The use of such information for hydrological purposes is now considered an outlet of great practical interest: when combined with local assessments or measurements of the snow water equivalent (SWE), it allows estimation of the regional snow resource. In this context, the MODIS (MODerate resolution Imaging Spectroradiometer, on board the Terra and Aqua satellites) daily Snow Covered Area product has been widely tested and proved to be appropriate for hydrologic applications. However, within a daily map the presence of cloudiness can hide the ground, thus preventing any snow detection. On the basis of previous studies, we recently developed a new methodology for cloud removal able to deal with the problem in wide areas characterized by a high topographical and geomorphological heterogeneity, such as northern Italy. Given the Aqua/Terra daily snow map of the basin, the standard condition shows a cloud-free part and a cloud-covered part. The latter is the assessment area, where a stepped procedure for cloud reduction combines temporal and spatial information obtained from neighboring areas to estimate whether there is snow. While conceiving the new method, our first target was to preserve the daily temporal resolution of the product as far as possible. In cases where there was not enough information on the same day within the cloud-free part, or in the nearest days, we adopted an improved method which ensures an acceptable reproduction of the micro-cycles that characterize the transition altitudes (where snow does not persist continually over the entire winter). Daily binary (snow/not snow) maps of ten years (2003-2012) have been analyzed and processed with the support of a Digital Elevation Model (DEM) of the basin with 500 m spatial resolution. We investigated in depth the issue of cloudiness over the study period, highlighting its dependence on altitude and season. Snow maps seem
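
    A toy version of the temporal part of such a cloud-removal step might look like this (the snow/cloud coding and the "fill only when neighbours agree" rule are simplified assumptions, not the authors' exact algorithm):

        import numpy as np

        # For each cloud-obscured observation (-1), look at the nearest preceding and
        # following cloud-free observations (0 = no snow, 1 = snow) and copy their
        # value when they agree. Data are invented pixel time series over five days.
        days = np.array([
            [1, 1, -1, 1, 0],
            [0, -1, -1, 0, 0],
        ])
        filled = days.copy()
        for pixel in filled:
            for t in np.where(pixel == -1)[0]:
                before = pixel[:t][pixel[:t] >= 0]
                after = pixel[t + 1:][pixel[t + 1:] >= 0]
                if before.size and after.size and before[-1] == after[0]:
                    pixel[t] = before[-1]  # temporal gap-fill when neighbours agree
        print(filled)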

  11. Single Case Method in Psychology: How to Improve as a Possible Methodology in Quantitative Research.

    Science.gov (United States)

    Krause-Kjær, Elisa; Nedergaard, Jensine I

    2015-09-01

    Awareness of including the Single-Case Method (SCM) as a possible methodology in quantitative research in the field of psychology has been argued to be useful, e.g., by Hurtado-Parrado and López-López (IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). Their article introduces a historical and conceptual analysis of SCMs and proposes changing the often prevailing tendency of neglecting SCM as an alternative to Null Hypothesis Significance Testing (NHST). This article contributes by putting a new light on SCM as an equally important methodology in psychology. The intention of the present article is to elaborate this point of view further by discussing one of the most fundamental requirements, as well as main characteristics, of SCM regarding temporality: namely, that "…performance is assessed continuously over time and under different conditions…" (Hurtado-Parrado and López-López, IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). When defining principles for particular units of analysis, both synchronic (spatial) and diachronic (temporal) elements should be incorporated. In this article misunderstandings of the SCM are adduced, and temporality is described further in order to propose how the SCM could gain broader usability in psychological research. It is further discussed how to implement SCM in psychological methodology. It is suggested that one solution might be to reconsider the notion of time in psychological research to cover more than a variable of control, and in this respect also to include the notion of time as an irreversible unity within life.

  12. Integration of Value Stream Map and Healthcare Failure Mode and Effect Analysis into Six Sigma Methodology to Improve Process of Surgical Specimen Handling

    Directory of Open Access Journals (Sweden)

    Sheng-Hui Hung

    2015-01-01

    Full Text Available Specimen handling is a critical patient safety issue. Problematic handling process, such as misidentification (of patients, surgical site, and specimen counts), specimen loss, or improper specimen preparation can lead to serious patient harms and lawsuits. Value stream map (VSM) is a tool used to find out non-value-added works, enhance the quality, and reduce the cost of the studied process. On the other hand, healthcare failure mode and effect analysis (HFMEA) is now frequently employed to avoid possible medication errors in healthcare process. Both of them have a goal similar to Six Sigma methodology for process improvement. This study proposes a model that integrates VSM and HFMEA into the framework, which mainly consists of define, measure, analyze, improve, and control (DMAIC), of Six Sigma. A Six Sigma project for improving the process of surgical specimen handling in a hospital was conducted to demonstrate the effectiveness of the proposed model.

  13. Integration of Value Stream Map and Healthcare Failure Mode and Effect Analysis into Six Sigma Methodology to Improve Process of Surgical Specimen Handling.

    Science.gov (United States)

    Hung, Sheng-Hui; Wang, Pa-Chun; Lin, Hung-Chun; Chen, Hung-Ying; Su, Chao-Ton

    2015-01-01

    Specimen handling is a critical patient safety issue. Problematic handling process, such as misidentification (of patients, surgical site, and specimen counts), specimen loss, or improper specimen preparation can lead to serious patient harms and lawsuits. Value stream map (VSM) is a tool used to find out non-value-added works, enhance the quality, and reduce the cost of the studied process. On the other hand, healthcare failure mode and effect analysis (HFMEA) is now frequently employed to avoid possible medication errors in healthcare process. Both of them have a goal similar to Six Sigma methodology for process improvement. This study proposes a model that integrates VSM and HFMEA into the framework, which mainly consists of define, measure, analyze, improve, and control (DMAIC), of Six Sigma. A Six Sigma project for improving the process of surgical specimen handling in a hospital was conducted to demonstrate the effectiveness of the proposed model.

  14. An improved water footprint methodology linking global consumption to local water resources: a case of Spanish tomatoes.

    Science.gov (United States)

    Chapagain, A K; Orr, S

    2009-02-01

    A water footprint (WF) measures the total water consumed by a nation, business or individual by calculating the total water used during the production of goods and services. This paper extends the existing methods for WF to more localised levels for crops grown partly in open systems and partly in plastic-covered houses with multi-seasonal harvesting, such as in the horticulture industry in Spain. This improvement makes it possible to visualise the links of EU tomato consumption to precise production sites in Spain and opens a debate on the usefulness of such findings. This paper also compares existing ecological methodologies with WF and argues that both life cycle analysis (LCA) and ecological footprint (EF) models could benefit from WF methods. Our results show that the EU consumes 957,000 tons of Spanish fresh tomatoes annually, which evaporates 71 Mm³/yr of water and would require 7 Mm³/yr of water to dilute leached nitrates in Spain. In Spain, tomato production alone evaporates 297 Mm³/yr and pollutes 29 Mm³/yr of freshwater. Depending upon the local agro-climatic character, the status of water resources, total tomato production volumes and the production system, the impact of EU consumption of fresh tomatoes on Spanish freshwater is very location-specific. The authors suggest that businesses now seek to report and address negative impacts on the environment. WF opens the door to complex water relationships and provides vital information for policy actors, business leaders, regulators and managers regarding their draw on, dependence on, and responsibilities towards this increasingly scarce resource.
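
    A quick back-of-envelope check of the per-kilogram footprint implied by the quoted figures (using only the numbers stated in the abstract):

        # 957,000 t of exported tomatoes, 71 Mm3/yr evaporated, 7 Mm3/yr dilution.
        exported_tonnes = 957_000
        evaporated_m3 = 71e6   # water evaporated, m3/yr
        dilution_m3 = 7e6      # grey water to dilute leached nitrates, m3/yr

        # m3 -> litres (x1000), tonnes -> kg (x1000)
        wf_per_kg = (evaporated_m3 + dilution_m3) * 1000 / (exported_tonnes * 1000)
        print(f"~{wf_per_kg:.0f} litres of water per kg of exported tomato")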

  15. Improvement in Product Development: Use of back-end data to support upstream efforts of Robust Design Methodology

    Directory of Open Access Journals (Sweden)

    Vanajah Siva

    2012-12-01

    Full Text Available In the area of Robust Design Methodology (RDM), less has been done on how to use and work with data from the back-end of the product development process to support upstream improvement. The purpose of this paper is to suggest RDM practices for the use of customer claims data in early design phases as a basis for improvements. The back-end data, when systematically analyzed and fed back into the product development process, aid in closing the product development loop from claims to improvement in the design phase. This is proposed through a flow of claims data analysis tied to an existing tool, namely Failure Mode and Effects Analysis (FMEA). The systematic and integrated analysis of back-end data is suggested as an upstream effort of RDM to increase understanding of noise factors during product usage, based on the feedback of claims data to FMEA, and to address continuous improvement in product development.

  16. Disaggregating Fossil Fuel Emissions from Biospheric Fluxes: Methodological Improvements for Inverse Methods

    Science.gov (United States)

    Yadav, V.; Shiga, Y. P.; Michalak, A. M.

    2012-12-01

    The accurate spatio-temporal quantification of fossil fuel emissions is a scientific challenge. Atmospheric inverse models have the capability to overcome this challenge and provide estimates of fossil fuel emissions. Observational and computational limitations restrict current analyses to the estimation of a combined "biospheric flux and fossil-fuel emissions" carbon dioxide (CO2) signal, at coarse spatial and temporal resolution. Even in these coarse-resolution inverse models, the disaggregation of a strong biospheric signal from a weaker fossil-fuel signal has proven difficult. The use of multiple tracers (delta 14C, CO, CH4, etc.) has provided a potential path forward, but challenges remain. In this study, we attempt to disaggregate biospheric fluxes and fossil-fuel emissions on the basis of error covariance models rather than through tracer-based CO2 inversions. The goal is to more accurately define the underlying structure of the two processes by using a stationary exponential covariance model for the biospheric fluxes, in conjunction with a semi-stationary covariance model derived from nightlights for fossil fuel emissions. A non-negativity constraint on fossil fuel emissions is imposed using a data transformation approach embedded in an iterative quasi-linear inverse modeling algorithm. The study is performed for January and June 2008, using the ground-based CO2 measurement network over North America. The quality of disaggregation is examined by comparing the inferred spatial distribution of biospheric fluxes and fossil-fuel emissions in a synthetic-data inversion. In addition to the disaggregation of fluxes, the ability of the covariance models derived from nightlights to explain the fossil-fuel emissions over North America is also examined. The simple covariance model proposed in this study is found to improve the estimation and disaggregation of fossil-fuel emissions from biospheric fluxes in the tracer-based inverse models.
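
    The stationary exponential covariance mentioned for the biospheric part has a standard form, Q_ij = sigma² exp(-d_ij/L); the sketch below builds such a matrix for three invented locations (the variance and correlation-length values are assumptions, not the study's):

        import numpy as np

        coords = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 200.0]])  # locations, km
        sigma2, L = 1.5, 150.0  # variance and correlation length (km), invented

        # Pairwise separation distances d_ij, then Q_ij = sigma^2 * exp(-d_ij / L).
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        Q_bio = sigma2 * np.exp(-d / L)
        print(np.round(Q_bio, 3))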

  17. Measuring the impact of methodological research: a framework and methods to identify evidence of impact.

    Science.gov (United States)

    Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F

    2014-11-27

    Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally. We have used the information

  18. Novel methodology for accurate resolution of fluid signatures from multi-dimensional NMR well-logging measurements.

    Science.gov (United States)

    Anand, Vivek

    2017-03-01

    A novel methodology for accurate fluid characterization from multi-dimensional nuclear magnetic resonance (NMR) well-logging measurements is introduced. This methodology overcomes a fundamental challenge of poor resolution of features in multi-dimensional NMR distributions due to low signal-to-noise ratio (SNR) of well-logging measurements. Based on an unsupervised machine-learning concept of blind source separation, the methodology resolves fluid responses from simultaneous analysis of large quantities of well-logging data. The multi-dimensional NMR distributions from a well log are arranged in a database matrix that is expressed as the product of two non-negative matrices. The first matrix contains the unique fluid signatures, and the second matrix contains the relative contributions of the signatures for each measurement sample. No a priori information or subjective assumptions about the underlying features in the data are required. Furthermore, the dimensionality of the data is reduced by several orders of magnitude, which greatly simplifies the visualization and interpretation of the fluid signatures. Compared to traditional methods of NMR fluid characterization which only use the information content of a single measurement, the new methodology uses the orders-of-magnitude higher information content of the entire well log. Simulations show that the methodology can resolve accurate fluid responses in challenging SNR conditions. The application of the methodology to well-logging data from a heavy oil reservoir shows that individual fluid signatures of heavy oil, water associated with clays and water in interstitial pores can be accurately obtained. Copyright © 2017 Elsevier Inc. All rights reserved.
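
    Blind source separation with non-negativity constraints is commonly realized as non-negative matrix factorization; the sketch below uses plain multiplicative updates (Lee and Seung) on random stand-in data, and is illustrative of the general technique rather than the paper's specific algorithm:

        import numpy as np

        # Factor a matrix of NMR distributions X (samples x bins) into non-negative
        # signatures H and per-sample contributions W, minimising ||X - WH||_F.
        rng = np.random.default_rng(0)
        X = rng.random((50, 64))  # stand-in for 50 logged measurement samples
        k = 3                     # assumed number of fluid signatures

        W = rng.random((50, k)) + 1e-3
        H = rng.random((k, 64)) + 1e-3
        for _ in range(200):  # multiplicative updates preserve non-negativity
            H *= (W.T @ X) / (W.T @ W @ H + 1e-9)
            W *= (X @ H.T) / (W @ H @ H.T + 1e-9)
        print("reconstruction error:", np.linalg.norm(X - W @ H))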

  19. Scalar mixing and strain dynamics methodologies for PIV/LIF measurements of vortex ring flows

    Science.gov (United States)

    Bouremel, Yann; Ducci, Andrea

    2017-01-01

    Fluid mixing operations are central to possibly all chemical, petrochemical, and pharmaceutical industries either being related to biphasic blending in polymerisation processes, cell suspension for biopharmaceuticals production, and fractionation of complex oil mixtures. This work aims at providing a fundamental understanding of the mixing and stretching dynamics occurring in a reactor in the presence of a vortical structure, and the vortex ring was selected as a flow paradigm of vortices commonly encountered in stirred and shaken reactors in laminar flow conditions. High resolution laser induced fluorescence and particle imaging velocimetry measurements were carried out to fully resolve the flow dissipative scales and provide a complete data set to fully assess macro- and micro-mixing characteristics. The analysis builds upon the Lamb-Oseen vortex work of Meunier and Villermaux ["How vortices mix," J. Fluid Mech. 476, 213-222 (2003)] and the engulfment model of Baldyga and Bourne ["Simplification of micromixing calculations. I. Derivation and application of new model," Chem. Eng. J. 42, 83-92 (1989); "Simplification of micromixing calculations. II. New applications," ibid. 42, 93-101 (1989)] which are valid for diffusion-free conditions, and a comparison is made between three methodologies to assess mixing characteristics. The first method is commonly used in macro-mixing studies and is based on a control area analysis by estimating the variation in time of the concentration standard deviation, while the other two are formulated to provide an insight into local segregation dynamics, by either using an iso-concentration approach or an iso-concentration gradient approach to take into account diffusion.
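
    The control-area approach based on the concentration standard deviation can be illustrated on synthetic fields (the data below merely mimic a scalar field homogenising in time; normalising by the mean is one common convention, assumed here):

        import numpy as np

        # Intensity of segregation as the coefficient of variation of the scalar
        # concentration field over time; frames stand in for LIF concentration maps.
        rng = np.random.default_rng(1)
        frames = [rng.normal(0.5, 0.25 / (1 + 0.2 * t), (128, 128)).clip(0, 1)
                  for t in range(5)]  # fields that homogenise as t grows

        for t, c in enumerate(frames):
            cov = c.std() / c.mean()  # 0 would indicate a fully mixed field
            print(f"frame {t}: CoV = {cov:.3f}")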

  20. Methodological factors influencing measurement and processing of plasma reelin in humans

    Directory of Open Access Journals (Sweden)

    Keller Flavio

    2003-09-01

    Full Text Available Abstract Background: Reelin, intensively studied as an extracellular protein that regulates brain development, is also expressed in a variety of tissues, and a circulating pool of reelin exists in adult mammals. Here we describe the methodological and biological foundation for carrying out and interpreting clinical studies of plasma reelin. Results: Reelin in human plasma was sensitive to proteolysis, freeze-thawing and heating during long-term storage, sample preparation and electrophoresis. Reelin in plasma was a dimer under denaturing conditions. Boiling of samples resulted in laddering, suggesting that each of the 8 repeats expressed in reelin contains a heat-labile covalent bond susceptible to breakage. Urinary-type and tissue-type plasminogen activator converted reelin to a discrete 310 kDa fragment co-migrating with the major immunoreactive reelin fragment seen in plasma and also detected in brain. (In contrast, plasmin produced a spectrum of smaller unstable reelin fragments.) We examined archival plasma of 10 pairs of age-matched male individuals differing in repeat length of a CGG repeat polymorphism of the 5'-untranslated region of the reelin gene (both alleles 11 repeats. Reelin 310 kDa band content was lower in subjects having the long repeats in all 10 pairs, by 25% on average (p ... Conclusions: Our studies indicate the need for caution in measuring reelin in archival blood samples, and suggest that assays of plasma reelin should take into account three dimensions that might vary independently: (a) the total amount of reelin protein; (b) the relative amounts of reelin vs. its proteolytic processing products; and (c) the aggregation state of the native protein. Reelin-plasminogen activator interactions may affect their roles in synaptic plasticity. Our results also suggest that the human CGG repeat polymorphism affects reelin gene expression and may affect susceptibility to human disease.

  1. USING A NEW SUPPLY CHAIN PLANNING METHODOLOGY TO IMPROVE SUPPLY CHAIN EFFICIENCY

    Directory of Open Access Journals (Sweden)

    A.L.V. Raubenheimer

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Effective supply chain planning and management has emerged as one of the most challenging opportunities for companies in the global economy during the last decade or two. This article reviews the evolution of Supply Chain Management and the traditional Supply Chain Solutions. It then introduces a new Supply Chain Planning methodology in which simulation modelling plays an important value-adding role to help organisations understand the dynamics of their Supply Chains.

    AFRIKAANSE OPSOMMING (translated): Effective supply chain planning and management has developed over the last two decades into one of the most challenging opportunities for enterprises in the world economy. This article briefly reviews the development of supply chain management and the traditional solutions. A new supply chain planning methodology is then proposed and discussed, in which simulation modelling plays an important role in helping enterprises understand the dynamics of their supply chains.

  2. Improved Methodology of Weather Window Prediction for Offshore Operations Based on Probabilities of Operation Failure

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    2017-01-01

    Costs of offshore operations already contribute significantly to the cost of produced electricity and will continue to increase, due to moving further offshore, if the current techniques of predicting offshore wind farm accessibility are to stay the same. The majority of offshore operations are carried out by specialized ships that must be hired for the duration of the operation. Therefore, offshore wind farm accessibility and costs of offshore activities are primarily driven by the expected number of operational hours offshore and waiting times for weather windows suitable for offshore operations. Having more reliable weather window estimates would result in better wind farm accessibility predictions and, as a consequence, potentially reduce the cost of offshore wind energy. This paper presents an updated methodology of weather window prediction that uses physical offshore vessel and equipment responses to establish weather windows.

  3. Measurement in Learning Games Evolution: Review of Methodologies Used in Determining Effectiveness of "Math Snacks" Games and Animations

    Science.gov (United States)

    Trujillo, Karen; Chamberlin, Barbara; Wiburg, Karin; Armstrong, Amanda

    2016-01-01

    This article captures the evolution of research goals and methodologies used to assess the effectiveness and impact of a set of mathematical educational games and animations for middle-school aged students. The researchers initially proposed using a mixed model research design of formative and summative measures, such as user-testing,…

   4. Measurements of Intracellular Ca2+ Content and Phosphatidylserine Exposure in Human Red Blood Cells: Methodological Issues

    Directory of Open Access Journals (Sweden)

    Mauro C. Wesseling

    2016-06-01

    Full Text Available Background/Aims: The increase of the intracellular Ca2+ content as well as the exposure of phosphatidylserine (PS) on the outer cell membrane surface after activation of red blood cells (RBCs) by lysophosphatidic acid (LPA) has been investigated by a variety of research groups. Carrying out experiments, which we described in several previous publications, we observed some discrepancies when comparing data obtained by different investigators within our research group and also between batches of LPA. In addition, we found differences when comparing the results of double and single labelling experiments (for Ca2+ and PS). Furthermore, the results for PS exposure depended on the fluorescent dye used (annexin V-FITC versus annexin V Alexa Fluor® 647). Therefore, it seemed necessary to investigate these methodological approaches in more detail to be able to quantify results and to compare data obtained by different research groups. Methods: The intracellular Ca2+ content and the PS exposure of RBCs separated from whole blood were investigated after treatment with LPA (2.5 µM) obtained from three different companies (Sigma-Aldrich, Cayman Chemical Company, and Santa Cruz Biotechnology Inc.). Fluo-4 and x-rhod-1 were used to detect intracellular Ca2+ content; annexin V Alexa Fluor® 647 and annexin V-FITC were used for PS exposure measurements. Both parameters (Ca2+ content, PS exposure) were studied using flow cytometry and fluorescence microscopy. Results: The percentage of RBCs showing increased intracellular Ca2+ content as well as PS exposure differs significantly between LPA manufacturers and depends on the conditions of mixing of LPA with the RBC suspension. Furthermore, the percentage of RBCs showing PS exposure is reduced in double labelling compared to single labelling experiments and depends also on the fluorescent dye used. Finally, data on Ca2+ content are slightly affected, whereas PS exposure data are not affected significantly.

   5. An in-situ soil structure characterization methodology for measuring soil compaction

    Science.gov (United States)

    Dobos, Endre; Kriston, András; Juhász, András; Sulyok, Dénes

    2016-04-01

    Agricultural cultivation has several direct and indirect effects on soil properties, among which soil structure degradation is the best known and most detectable one. Soil structure degradation leads to several water and nutrient management problems, which reduce the efficiency of agricultural production. There are several innovative technological approaches aiming to reduce these negative impacts on the soil structure. The testing, validation and optimization of these methods require an adequate technology to measure the impacts on the complex soil system. This study aims to develop an in-situ soil structure and root development testing methodology that can be used in field experiments and that allows one to follow real-time changes in the soil structure - evolution / degradation - and to characterize them quantitatively. The method is adapted from remote sensing image processing technology. A specially adapted A4-size scanner is placed in the soil at a depth that cannot be reached by the agrotechnical treatments. Only the scanner USB cable comes to the surface, allowing image acquisition without any soil disturbance. Several images of the same spot can be taken throughout the vegetation season to follow soil consolidation and structure development after the last tillage treatment for seedbed preparation. The scanned image of the soil profile is classified using supervised image classification, namely the maximum likelihood classification algorithm. The resulting image has two principal classes, soil matrix and pore space, plus complementary classes to cover the other occurring thematic classes, such as roots and stones. The calculated data are calibrated with field-sampled porosity data. As the scanner is buried under the soil with no changes in light conditions, the image processing can be automated for better temporal comparison. Besides total porosity, pore size fractions and their distributions can be calculated for each image.
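
    As a sketch of the porosity bookkeeping described above (not the authors' implementation), the following computes total porosity and a pixel-count pore size distribution from an already-classified scan, with connected-component labelling standing in for pore identification; the random image is a hypothetical placeholder.

        import numpy as np
        from scipy import ndimage

        def porosity_summary(classified, pore_value=1):
            # classified: 2-D integer array from the supervised classification,
            # where pixels equal to pore_value were labelled as pore space.
            pores = classified == pore_value
            total_porosity = pores.mean()            # pore pixels / all pixels
            labels, n = ndimage.label(pores)         # connected pore regions
            sizes = ndimage.sum(pores, labels, range(1, n + 1))  # pixels/pore
            return total_porosity, np.sort(sizes)

        # Hypothetical classified scan: 0 = soil matrix, 1 = pore space.
        img = (np.random.rand(200, 300) > 0.6).astype(int)
        phi, pore_sizes = porosity_summary(img)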

   6. Six Sigma Methodology Utilization in Telecom Sector for Quality Improvement - A DMAIC Process

    Directory of Open Access Journals (Sweden)

    MANISH BHARGAVA

    2010-12-01

    Full Text Available This article presents tools of Six Sigma for telecom industries; these can achieve powerful operational improvements that produce sustainable business benefits. Six Sigma Qualtec's dedicated Six Sigma for Telecom practice is specifically designed to help traditional and modern telecommunications providers become more efficient in their operating procedures. By learning and implementing improvements such as Voice of the Customer (VOC), Six Sigma, Business Process Management, Design for Six Sigma and Lean Enterprise principles, those companies will be able to dramatically improve the way they do business, thus attracting and keeping customers in this hyper-competitive industry. This paper maps some of the changes in the telecom markets that resulted from competitive entry and gives an insight into the dynamics of competitive markets in relation to quality improvement. Additionally, it considers the quest for particular competitive outcomes via independent and transparent regulation.

   7. METHODOLOGY FOR THE DEVELOPMENT OF RFID VALUE ADDED SERVICES TO IMPROVE SUPPLY CHAIN OPERATIONS

    OpenAIRE

    Sobottka, Thomas; Leitner, René; Sihn, Wilfried

    2012-01-01

    Despite the potential of RFID technologies to improve supply chain operations and control, RFID projects – especially in production-heavy sectors – are often restricted to basic recognition functionalities, leaving much of the vast potential to improve supply chain operations and control unharnessed. This paper aims at providing a procedure to identify advanced value added services based on RFID technology, tailor-made to companies' specific requirements. Considering all relevant processes…

   8. Mercury methylation and reduction potentials in marine water: An improved methodology using ¹⁹⁷Hg radiotracer

    Energy Technology Data Exchange (ETDEWEB)

    Koron, Neza [National Institute of Biology, Marine Biology Station, Fornace 41, 6330 Piran (Slovenia); Bratkic, Arne [Department of Environmental Sciences, 'Jozef Stefan' Institute, Jamova 39, 1000 Ljubljana (Slovenia); Ribeiro Guevara, Sergio, E-mail: ribeiro@cab.cnea.gov.ar [Laboratorio de Analisis por Activacion Neutronica, Centro Atomico Bariloche, Av. Bustillo km 9.5, 8400 Bariloche (Argentina); Vahcic, Mitja; Horvat, Milena [Department of Environmental Sciences, 'Jozef Stefan' Institute, Jamova 39, 1000 Ljubljana (Slovenia)

    2012-01-15

    A highly sensitive laboratory methodology for simultaneous determination of methylation and reduction of spiked inorganic mercury (Hg²⁺) in marine water, labelled with a high specific activity radiotracer (¹⁹⁷Hg prepared from enriched ¹⁹⁶Hg stable isotope), was developed. A conventional extraction protocol for methylmercury (CH₃Hg⁺) was modified in order to significantly reduce the partitioning of interfering labelled Hg²⁺ into the final extract, thus allowing the detection of as little as 0.1% of the Hg²⁺ spike transformed to labelled CH₃Hg⁺. The efficiency of the modified CH₃Hg⁺ extraction procedure was assessed with radiolabelled CH₃Hg⁺ spikes corresponding to methylmercury concentrations between 0.05 and 4 ng L⁻¹. The recoveries were 73.0 ± 6.0% and 77.5 ± 3.9% for marine and MilliQ water, respectively. The reduction potential was assessed by purging and trapping the radiolabelled elemental Hg in a permanganate solution. The method allows detection of the reduction of as little as 0.001% of labelled Hg²⁺ spiked to natural waters. To our knowledge, the optimised methodology is among the most sensitive available to study Hg methylation and reduction potentials, therefore allowing experiments to be done at spikes close to natural levels (1-10 ng L⁻¹). Highlights: • Inorganic mercury methylation and reduction in marine water were studied. • High specific activity ¹⁹⁷Hg was used to label Hg²⁺ spikes at natural levels. • Methylmercury extraction had 73% efficiency for 0.05-4 ng L⁻¹ levels. • High sensitivity to assess methylation potentials, below 0.1% of the spike. • High sensitivity also for reduction potentials, as low as 0.001% of the spike.

   9. Impact of lean six sigma process improvement methodology on cardiac catheterization laboratory efficiency.

    Science.gov (United States)

    Agarwal, Shikhar; Gallo, Justin J; Parashar, Akhil; Agarwal, Kanika K; Ellis, Stephen G; Khot, Umesh N; Spooner, Robin; Murat Tuzcu, Emin; Kapadia, Samir R

    2016-03-01

    Operational inefficiencies are ubiquitous in several healthcare processes. To improve the operational efficiency of our catheterization laboratory (Cath Lab), we implemented a lean six sigma process improvement initiative, starting in June 2010. We aimed to study the impact of lean six sigma implementation on improving the efficiency and the patient throughput in our Cath Lab. All elective and urgent cardiac catheterization procedures, including diagnostic coronary angiography, percutaneous coronary interventions, structural interventions and peripheral interventions, performed between June 2009 and December 2012 were included in the study. Performance metrics utilized for analysis included turn-time, physician downtime, on-time patient arrival, on-time physician arrival, on-time start and manual sheath-pulls inside the Cath Lab. After implementation of lean six sigma in the Cath Lab, we observed a significant improvement in turn-time, physician downtime, on-time patient arrival, on-time physician arrival, on-time start as well as sheath-pulls inside the Cath Lab. The percentage of cases with optimal turn-time increased from 43.6% in 2009 to 56.6% in 2012. This study demonstrates the impact of lean six sigma on improving and sustaining the efficiency of our Cath Lab operation: after the successful implementation of this continuous quality improvement initiative, there was a significant improvement in the selected performance metrics. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. A METHODOLOGY TO INTEGRATE MAGNETIC RESONANCE AND ACOUSTIC MEASUREMENTS FOR RESERVOIR CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Jorge O. Parra; Chris L. Hackert; Lorna L. Wilson

    2002-09-20

    The work reported herein represents the third year of development efforts on a methodology to interpret magnetic resonance and acoustic measurements for reservoir characterization. In this last phase of the project we characterize a vuggy carbonate aquifer in the Hillsboro Basin, Palm Beach County, South Florida, using two data sets: the first generated by velocity tomography and the second generated by reflection tomography. First, we integrate optical macroscopic (OM), scanning electron microscope (SEM) and x-ray computed tomography (CT) images, as well as petrography, as a first step in characterizing the aquifer pore system. This pore scale integration provides information with which to evaluate nuclear magnetic resonance (NMR) well log signatures for NMR well log calibration, interpret ultrasonic data, and characterize flow units at the field scale between two wells in the aquifer. Saturated and desaturated NMR core measurements estimate the irreducible water in the rock and the variable T₂ cut-offs for the NMR well log calibration. These measurements establish empirical equations to extract permeability from NMR well logs. Velocity and NMR-derived permeability and porosity relationships integrated with velocity tomography (based on crosswell seismic measurements recorded between two wells 100 m apart) capture two flow units that are supported by pore scale integration results. Next, we establish a more detailed picture of the complex aquifer pore structures and the critical role they play in water movement, which aids in our ability to characterize not only carbonate aquifers, but reservoirs in general. We analyze petrography and cores to reveal relationships between the rock physical properties that control the compressional and shear wave velocities of the formation. A digital thin section analysis provides the pore size distributions of the rock matrix, which allows us to relate pore structure to permeability and to characterize flow units at the field scale.
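
    The abstract does not reproduce the empirical permeability equations; as a hedged illustration only, the sketch below uses the SDR-type relation often applied to NMR logs in the wider literature, with placeholder values. The constant and inputs are assumptions, not values calibrated in this project.

        def sdr_permeability(phi, t2gm, c=4.0):
            # SDR-type NMR permeability estimate (a common empirical form,
            # not necessarily the equation calibrated in this study):
            #     k = c * phi**4 * t2gm**2
            # phi: NMR porosity (fraction); t2gm: geometric-mean T2 (ms);
            # c: lithology-dependent constant; returns permeability in mD.
            return c * phi ** 4 * t2gm ** 2

        # Hypothetical carbonate sample: 25% porosity, T2gm = 80 ms.
        k_md = sdr_permeability(0.25, 80.0)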

  11. Application of Response Surface Methodology for the Technological Improvement of Solid Lipid Nanoparticles.

    Science.gov (United States)

    Dal Pizzol, Carine; O'Reilly, Andre; Winter, Evelyn; Sonaglio, Diva; de Campos, Angela Machado; Creczynski-Pasa, Tânia Beatriz

    2016-02-01

    Solid lipid nanoparticles (SLN) are colloidal particles consisting of a matrix composed of solid (at room and body temperatures) lipids dispersed in an aqueous emulsifier solution. During manufacture, their physicochemical properties may be affected by several formulation parameters, such as type and concentration of lipid, proportion of emulsifiers and amount of solvent. Thus, the aim of this work was to study the influence of these variables on the preparation of SLN. A D-optimal Response Surface Methodology design was used to establish a mathematical model for the optimization of SLN. A total of 30 SLN formulations were prepared using the ultrasound method, and then characterized on the basis of their physicochemical properties, including particle size, polydispersity index (PI) and zeta potential (ζ). Particle sizes ranged between 107 and 240 nm. All SLN formulations showed negative ζ values and PI values below 0.28. Prediction of the optimal conditions was performed using the desirability function, targeting the reduction of all responses. The optimized SLN formulation showed similar theoretical and experimental values, confirming the sturdiness and predictive ability of the mathematical model for SLN optimization.
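
    To make the desirability step concrete, here is a minimal sketch of a Derringer-type "smaller is better" desirability and its geometric-mean combination; the response values and acceptable ranges are hypothetical, and the paper's exact transform may differ.

        import numpy as np

        def desirability_minimize(y, low, high):
            # 1 at/below low (ideal), 0 at/above high (unacceptable),
            # linear in between - a "smaller is better" desirability.
            return float(np.clip((high - y) / (high - low), 0.0, 1.0))

        def overall_desirability(responses, bounds):
            # Geometric mean of the individual desirabilities.
            ds = [desirability_minimize(y, lo, hi)
                  for y, (lo, hi) in zip(responses, bounds)]
            return float(np.prod(ds)) ** (1.0 / len(ds))

        # Hypothetical SLN responses: size (nm) and polydispersity index.
        D = overall_desirability([150.0, 0.20], [(100, 250), (0.10, 0.30)])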

  12. Redesigning service delivery for hypertensive patients: a methodological guideline to improve the management of chronic diseases.

    Science.gov (United States)

    Ippolito, Adelaide; Cannavacciuolo, Lorella; Ponsiglione, Cristina; De Luca, Nicola; Iaccarino, Guido; Illario, Maddalena

    2014-04-01

    Best care is not necessarily the most expensive, but the most appropriate, and prevention is the most powerful tool to promote health. A novel approach might envision the reduction of hospital admissions (thus meeting a requirement of patients with long-term conditions: they would rather not be hospitalized!) and the strengthening of peripheral assistance, both on the territory and at home. In this direction, experiences of reshaping service delivery towards integrated disease management, namely clinical pathways, can be observed in Europe and in different parts of the world. The aim of this paper is to provide a methodological guideline to support management in planning clinical pathways, also outlining the main barriers limiting the process. In particular, we present the results of planning a clinical pathway at the Centre for Hypertension of the Federico II University Hospital (Naples, Italy). The case study showed that the introduction of such a service impacts the organisation of the structure. An analysis of organizational processes "as they are" and a re-design of processes "to be" are necessary to integrate the clinical pathway into the actual activities.

  13. IMPROVING PSYCHOMOTRICITY COMPONENTS IN PRESCHOOL CHILDREN USING TEACHING METHODOLOGIES BASED ON MIRROR NEURONS ACTIVATION

    Directory of Open Access Journals (Sweden)

    Gáll Zs. Sz.

    2015-08-01

    Full Text Available The scientific substrate of the study relies upon the concept of mirror neurons. Unlike other neurons, these are characterized by an imitation feature. They play an important role in learning processes - especially during childhood - enabling the imitation of motions and determining their primary acquirement. Using this as a starting point, the study aims to work out and apply a methodology in keeping with the content of the psychomotor expression activities curriculum for preschool education, resorting to demonstration procedures as the main teaching-learning method. We expect that mirror neuron reactivity will thus be engaged more thoroughly, with a view to enhancing the subjects' psychomotor development in terms of body scheme, self-image and performance of basic postures and motions. For the research, an experimental group and a control group were set up, and the children's psychomotor development level was assessed both before the application of the independent variable and after its effects upon the experimental group. Once the planned procedure was completed, the experimental group members showed a significant evolution in the investigated psychomotor fields as compared to the control group.

  14. Development of an improved methodology to detect infectious airborne influenza virus using the NIOSH bioaerosol sampler.

    Science.gov (United States)

    Cao, G; Noti, J D; Blachere, F M; Lindsley, W G; Beezhold, D H

    2011-12-01

    A unique two-stage cyclone bioaerosol sampler has been developed at NIOSH that can separate aerosols into three size fractions. The ability of this sampler to collect infectious airborne viruses from a calm-air chamber loaded with influenza A virus was tested. The sampler's efficiency at collecting aerosolized viral particles from a calm-air chamber is essentially the same as that of the high performance SKC BioSampler, which collects un-fractionated particles directly into a liquid medium (2.4 × 10⁴ total viral particles per liter of sampled air (TVP/L) versus 2.6 × 10⁴ TVP/L, respectively, after 15 min), and the efficiency is relatively constant over collection times of 15, 30 and 60 min. Approximately 34% of the aerosolized infectious virus collected after 15 min with the NIOSH bioaerosol sampler remained infectious, and infectious virus was found in all three size fractions. After 60 min of sampling, the infectious virus/liter air found in the NIOSH bioaerosol sampler was 15% of that found in the SKC BioSampler. This preservation of infectivity by the NIOSH bioaerosol sampler was maintained even when the initial infectivity prior to aerosolization was as low as 0.06%. The utility of the NIOSH bioaerosol sampler was further extended by incorporating an enhanced infectivity detection methodology developed in our laboratory, the viral replication assay, which amplified the infectious virus making it more readily detectable.

  15. Computationally Improved Optimal Control Methodology for Linear Programming Problems of Flexible Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Yen-Liang Pan

    2013-01-01

    Full Text Available Deadlock prevention policies are used to solve the deadlock problems of FMSs. It is well known that the theory of regions is an efficient method for obtaining optimal (i.e., maximally permissive) controllers. All legal and live maximal behaviors of Petri net models can be preserved by using marking/transition-separation instances (MTSIs) or event-state-separation-problem (ESSP) methods. However, these encounter great difficulty in solving all sets of inequalities, an extremely time-consuming problem. Moreover, the number of linear programming problems (LPPs) of legal markings grows exponentially with net size. This paper proposes a novel methodology to reduce the number of MTSIs/ESSPs and LPPs. We use the well-known reduction approach of Murata (1989) to simplify the structure of the system so that the number of LPPs can be reduced. Additionally, critical ones of crucial marking/transition-separation instances (COCMTSI) are developed and used in our deadlock prevention policy, which allows designers to employ few MTSIs to deal with deadlocks. Experimental results indicate that the computational cost can be reduced. To our knowledge, this deadlock prevention policy is more efficient at obtaining maximally permissive behavior of Petri net models than past approaches.
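
    For readers unfamiliar with the marking-based reasoning above, the sketch below shows the standard Petri net enabling and firing rules on which such deadlock checks rest; the two-place net at the end is a hypothetical toy, not a model from the paper.

        import numpy as np

        def enabled(marking, pre):
            # pre[p, t]: tokens place p must hold for transition t to fire.
            return [t for t in range(pre.shape[1])
                    if np.all(marking >= pre[:, t])]

        def fire(marking, pre, post, t):
            # Successor marking after firing transition t.
            return marking - pre[:, t] + post[:, t]

        # Hypothetical 2-place, 2-transition net; a reachable marking with
        # no enabled transition is a deadlock the controller must exclude.
        pre  = np.array([[1, 0], [0, 1]])
        post = np.array([[0, 1], [1, 0]])
        m0 = np.array([1, 0])
        deadlocked = len(enabled(m0, pre)) == 0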

  16. Application of kaizen methodology to foster departmental engagement in quality improvement.

    Science.gov (United States)

    Knechtges, Paul; Decker, Michael Christopher

    2014-12-01

    The Toyota Production System, also known as Lean, is a structured approach to continuous quality improvement that has been developed over the past 50 years to transform the automotive manufacturing process. In recent years, these techniques have been successfully applied to quality and safety improvement in the medical field. One of these techniques is kaizen, which is the Japanese word for "good change." The central tenet of kaizen is the quick analysis of the small, manageable components of a problem and the rapid implementation of a solution with ongoing, real-time reassessment. Kaizen adds an additional "human element": all stakeholders, not just management, must be involved in such change. Because of the small size of the changes involved in a kaizen event and the inherent focus on human factors and change management, a kaizen event can serve as a good introduction to continuous quality improvement for a radiology department. Copyright © 2014. Published by Elsevier Inc.

  17. Probiotics production and alternative encapsulation methodologies to improve their viabilities under adverse environmental conditions.

    Science.gov (United States)

    Coghetto, Chaline Caren; Brinques, Graziela Brusch; Ayub, Marco Antônio Záchia

    2016-12-01

    Probiotic products are dietary supplements containing live microorganisms that produce beneficial health effects on the host by improving intestinal balance and nutrient absorption. Among probiotic microorganisms, those classified as lactic acid bacteria are of major importance to the food and feed industries. Probiotic cells can be produced using alternative carbon and nitrogen sources, such as agroindustrial residues, at the same time contributing to reduced process costs. On the other hand, the survival of probiotic cells in formulated food products, as well as in the host gut, is an essential nutritional aspect concerning health benefits. Therefore, several cell microencapsulation techniques have been investigated as a way to improve cell viability and survival under adverse environmental conditions, such as the gastrointestinal milieu of hosts. In this review, different aspects of probiotic cells and technologies of their related products are discussed, including formulation of culture media and aspects of cell microencapsulation techniques required to improve their survival in the host.

  18. Clinical Performance Measures and Quality Improvement System Considerations for Dental Education.

    Science.gov (United States)

    Parkinson, Joseph W; Zeller, Gregory G

    2017-03-01

    Quality improvement and quality assurance programs are an integral part of providing excellence in health care delivery. The Dental Quality Alliance and the Commission on Dental Accreditation recognize this and have created standards and recommendations to advise health care providers and health care delivery systems, including dental schools, on measuring the quality of the care delivered to patients. Overall health care expenditures have increased, and the Affordable Care Act has made health care, including dentistry, available to more people in the United States. These increases in cost and in the number of patients accessing care contribute to a heightened interest in measurable quality improvement outcomes that reflect efficiency, effectiveness, and overall value. Practitioners and administrators, both in academia and in the "real world," need an understanding of various quality improvement methodologies available in order to select approaches that support effective monitoring of the quality of care delivered. This article compares and contrasts various quality improvement approaches, programs, and systems currently in use in order to assist dental providers and administrators in choosing quality improvement methodologies pertinent to their practice or institution.

  19. Improved data evaluation methodology for energy ranges with missing experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Konobeyev, Yu.A.; Fischer, U. [Karlsruhe Institute of Technology (KIT), Inst. for Neutron Physics and Reactor Technology, Eggenstein-Leopoldshafen (Germany); Capote, R. [International Atomic Energy Agency, Nuclear Data Section, Vienna (Austria)

    2015-07-15

    A number of improvements to data evaluation concerning nuclear model calculations and evaluation procedures are considered. A promising combination of the hybrid Monte Carlo simulation model of M. Blann for the modelling of non-equilibrium particle emission with the Hauser-Feshbach model, and a possible correction of simulations using the intranuclear cascade evaporation model, are discussed. In the latter case the modelling of particle emission is improved by the consideration of nucleon-cluster interactions. The approach for reliable predictions of cross-sections using "optimal" nuclear model parameters, and the method of their generation, are discussed.

  1. The impact of multi-criteria performance measurement on business performance improvement

    Directory of Open Access Journals (Sweden)

    Fentahun Moges Kasie

    2013-06-01

    Full Text Available Purpose: The purpose of this paper is to investigate the relationship between multi-criteria performance measurement (MCPM) practice and business performance improvement, using raw data collected from 33 selected manufacturing companies. In addition, it proposes a modified MCPM model as an effective approach to improve the business performance of manufacturing companies. Design/methodology/approach: Research paper. Primary and secondary data were collected using a questionnaire survey, interviews and observation of records. The methodology is to evaluate the business performance of the sampled manufacturing companies and the extent of utilization of crucial financial (lagging) and non-financial (leading) performance measures. The positive correlation between financial business performance and the practice of MCPM is shown using Pearson's correlation coefficient analysis. Findings: This research indicates that companies which measure their performance using important financial and non-financial measures achieve better business performance. Even though certain companies are currently using non-financial measures, the researchers have learned that these measures were not integrated with each other, with financial measures, or with strategic objectives. Research limitations/implications: The limitation of this paper is that the number of surveyed companies is too small for generalization and they are located in a single country. Further research incorporating a large number of companies from various developing nations is suggested to minimize this limitation. Practical implications: The paper shows that multi-dimensional performance measures, including key leading indicators, are essential to predict the future environment, whereas cost-accounting-based financial measures alone are inadequate to do so. These points are shown practically using Pearson's correlation coefficient analysis. Originality/value: The significance of multi-criteria performance measurement…
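
    The correlation analysis referred to above reduces to the standard Pearson coefficient; a minimal sketch follows, with made-up scores standing in for the survey data (which the abstract does not reproduce).

        import numpy as np

        def pearson_r(x, y):
            # Pearson's correlation coefficient between two paired samples.
            x, y = np.asarray(x, float), np.asarray(y, float)
            xm, ym = x - x.mean(), y - y.mean()
            return float((xm * ym).sum() /
                         np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))

        # Hypothetical paired scores: extent of MCPM practice vs. a
        # financial performance index for five companies.
        r = pearson_r([2, 5, 3, 4, 1], [1.1, 2.3, 1.8, 2.0, 0.9])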

  2. [New methodology for heavy metals measurement in water samples by PGNAA-XRF].

    Science.gov (United States)

    Jia, Wen-Bao; Zhang, Yan; Hei, Da-Qian; Ling, Yong-Sheng; Shan, Qing; Cheng, Can

    2014-11-01

    In the present paper, a new combined detection method is proposed, using prompt gamma neutron activation analysis (PGNAA) and characteristic X-ray fluorescence to improve the accuracy of heavy metals measurement for in-situ environmental water analysis by PGNAA technology. In particular, the characteristic X-ray fluorescence (XRF) of heavy metals is induced directly by the prompt gamma rays instead of a traditional excitation source. A combined measurement facility with an ²⁴¹Am-Be neutron source, a BGO detector and a NaI-Be detector was developed to analyze the pollutants in water; the two detectors were used to record the prompt gamma rays and the characteristic X-ray fluorescence of heavy metals, respectively. The prompt gamma-ray intensity (Iγ) and characteristic X-ray fluorescence intensity (Ix) were determined by MCNP calculations for different concentrations (ci) of chromium (Cr), cadmium (Cd), mercury (Hg) and lead (Pb). The simulation results showed a good linear relationship between Iγ, Ix and ci. The empirical formula of the combined detection method is given based on the above calculations. Comparing and analyzing Iγ and Ix, the combined detection method was found to be more sensitive for measuring high atomic number heavy metals such as Hg and Pb than for low atomic number metals such as Cr and Cd. The limits of detection for Hg and Pb with the combined measurement instrument were 17.4 and 24.2 mg kg⁻¹, respectively.
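
    Since the reported response is linear in concentration, calibration and inversion reduce to a straight-line fit; the sketch below shows that step together with a generic 3-sigma detection limit estimate. All numbers are hypothetical placeholders, and the 3-sigma rule is a common convention, not necessarily how the authors derived their limits.

        import numpy as np

        # Hypothetical calibration: XRF intensity at known Pb levels (mg/kg).
        c  = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
        Ix = np.array([12.0, 60.5, 109.8, 210.3, 409.7])

        slope, intercept = np.polyfit(c, Ix, 1)  # linear response
        sigma_blank = 3.2                        # assumed blank std deviation
        lod = 3.0 * sigma_blank / slope          # generic 3-sigma LOD
        c_unknown = (250.0 - intercept) / slope  # invert the calibration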

  3. Methodological considerations for global analysis of cellular FLIM/FRET measurements

    Science.gov (United States)

    Adbul Rahim, Nur Aida; Pelet, Serge; Kamm, Roger D.; So, Peter T. C.

    2012-02-01

    Global algorithms can improve the analysis of fluorescence energy transfer (FRET) measurements based on fluorescence lifetime microscopy. However, global analysis of FRET data is also susceptible to experimental artifacts. This work examines several common artifacts and suggests remedial experimental protocols. Specifically, we examined the accuracy of different methods for instrument response extraction and propose an adaptive method based on the mean lifetime of fluorescent proteins. We further examined the effects of image segmentation and a priori constraints on the accuracy of lifetime extraction. Methods to test the applicability of global analysis on cellular data are proposed and demonstrated. The accuracy of global fitting degrades with lower photon count. By systematically tracking the effect of the minimum photon count on lifetime and FRET prefactors when carrying out global analysis, we demonstrate a correction procedure to recover the correct FRET parameters, allowing us to obtain protein interaction information even in dim cellular regions with photon counts as low as 100 per decay curve.
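
    A stripped-down sketch of the photon-count consideration discussed above: fit a decay only when the curve holds enough photons, since lifetime estimates degrade below a threshold (the paper tracks this systematically and applies a correction). The mono-exponential model and all numbers here are illustrative assumptions.

        import numpy as np
        from scipy.optimize import curve_fit

        def decay(t, a, tau):
            return a * np.exp(-t / tau)

        def lifetime_if_bright(t, counts, min_photons=100):
            # Skip dim decay curves; below min_photons the fitted
            # lifetime becomes unreliable.
            if counts.sum() < min_photons:
                return None
            popt, _ = curve_fit(decay, t, counts, p0=(counts.max(), 2.0))
            return popt[1]  # lifetime estimate (ns)

        # Hypothetical decay histogram: 64 bins over 10 ns.
        t = np.linspace(0.0, 10.0, 64)
        counts = np.random.poisson(decay(t, 40.0, 2.5)).astype(float)
        tau_hat = lifetime_if_bright(t, counts)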

  4. Leveraging Competency Framework to Improve Teaching and Learning: A Methodological Approach

    Science.gov (United States)

    Shankararaman, Venky; Ducrot, Joelle

    2016-01-01

    A number of engineering education programs have defined learning outcomes and course-level competencies, and conducted assessments at the program level to determine areas for continuous improvement. However, many of these programs have not implemented a comprehensive competency framework to support the actual delivery and assessment of an…

   5. Accounting and methodological aspects of capital expenditure for land improvement

    Directory of Open Access Journals (Sweden)

    J.P. Melnychuk

    2016-07-01

    Full Text Available The article examines how capital costs for land improvement are reflected in accounting. The main legislation governing this issue is covered, and the key issues that arise in accounting for capital expenditures for farmland improvement are addressed. The study employs general scientific methods such as induction and deduction, dialectical, historical and systematic methods, as well as specific methods of accounting. Due to the land reform, the ownership of land has changed: lands which were owned by farms have been privatized and have received particular owners, and privatized lands now constitute a significant part of farmland. Land managers require quality accounting information about the composition and state of the land, and about the improvements made to it, for effective management. Numerous changes in legislation generate controversies of interpretation and, consequently, discrepancies in the cost accounting of capital land improvement, which will affect future net profit. The article reflects the economic substance of the process and describes how accounting for capital expenditure on land should be implemented in accordance with applicable law.

  6. Towards a global CO2 calculation standard for supply chains: Suggestions for methodological improvements

    NARCIS (Netherlands)

    Davydenko, I.; Ehrler, V.; Ree, D. de; Lewis, A.; Tavasszy, L.

    2014-01-01

    Improving the efficiency and sustainability of supply chains is a shared aim of the transport industry, its customers, governments and industry organisations. To optimize supply chains and to identify best practice, standards for their analysis are needed in order to achieve comparability…

   7. The International Index of Erectile Function: a methodological critique and suggestions for improvement.

    Science.gov (United States)

    Yule, Morag; Davison, Joyce; Brotto, Lori

    2011-01-01

    The International Index of Erectile Function is a well-worded and psychometrically valid self-report questionnaire widely used as the standard for the evaluation of male sexual function. However, some conceptual and statistical problems arise when using the measure with men who are not sexually active. These problems are illustrated using 2 empirical examples, and the authors provide recommended solutions to further strengthen the efficacy and validity of this measure.

   8. Validity and reliability of using photography for measuring knee range of motion: a methodological study

    Directory of Open Access Journals (Sweden)

    Adie Sam

    2011-04-01

    Full Text Available Abstract. Background: The clinimetric properties of knee goniometry are essential to appreciate in light of its extensive use in the orthopaedic and rehabilitative communities. Intra-observer reliability is thought to be satisfactory, but the validity and inter-rater reliability of knee goniometry often demonstrate unacceptable levels of variation. This study tests the validity and reliability of measuring knee range of motion using goniometry and photographic records. Methods: Design: methodology study assessing the validity and reliability of one method ('Marker Method'), which uses a skin marker over the greater trochanter, and another method ('Line of Femur Method'), which requires estimation of the line of femur. Setting: radiology and orthopaedic departments of two teaching hospitals. Participants: 31 volunteers (13 arthritic and 18 healthy subjects). Knee range of motion was measured radiographically and photographically using a goniometer. Three assessors were assessed for reliability and validity. Main outcomes: agreement between methods and within raters was assessed using the concordance correlation coefficient (CCC); agreement between raters was assessed using intra-class correlation coefficients (ICCs); 95% limits of agreement for the mean difference were computed for all paired comparisons. Results: Validity (referenced to radiographs): each method for all 3 raters yielded very high CCCs for flexion (0.975 to 0.988) and moderate to substantial CCCs for extension angles (0.478 to 0.678). The mean differences and 95% limits of agreement were narrower for flexion than for extension. Intra-rater reliability: for flexion and extension, very high CCCs were attained for all 3 raters for both methods, with slightly greater CCCs seen for flexion (CCCs varied from 0.981 to 0.998). Inter-rater reliability: for both methods, very high ICCs (min to max: 0.891 to 0.995) were obtained for flexion and extension, with slightly higher coefficients obtained…
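
    For reference, Lin's concordance correlation coefficient used above can be computed directly from paired measurements; the sketch below uses invented angle pairs, not data from the study.

        import numpy as np

        def concordance_ccc(x, y):
            # Lin's CCC: agreement of paired measurements with the identity
            # line (1 = perfect concordance).
            x, y = np.asarray(x, float), np.asarray(y, float)
            sxy = np.cov(x, y, bias=True)[0, 1]
            return float(2.0 * sxy /
                         (x.var() + y.var() + (x.mean() - y.mean()) ** 2))

        # Hypothetical flexion angles: photographic method vs. radiographs.
        photo = [130.0, 118.0, 142.0, 125.0, 137.0]
        xray  = [132.0, 117.0, 145.0, 123.0, 139.0]
        ccc = concordance_ccc(photo, xray)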

   9. Improvements in analytical methodology for the determination of frequently consumed illicit drugs in urban wastewater.

    Science.gov (United States)

    Bijlsma, Lubertus; Beltrán, Eduardo; Boix, Clara; Sancho, Juan V; Hernández, Félix

    2014-07-01

    A rapid and sensitive analytical methodology based on ultra high-performance liquid chromatography-tandem mass spectrometry has been developed for the determination of widely consumed drugs of abuse (amphetamines, MDMA, cocaine, opioids, cannabis and ketamine) and their major metabolites in urban wastewaters. Sample clean-up and pre-concentration were performed by a generic off-line SPE procedure using Oasis HLB. Special effort was made to incorporate amphetamine, which was found highly problematic in the wastewater samples tested, requiring an additional clean-up with Oasis MCX SPE and dispersive primary secondary amine. Correction for possible SPE losses or degradation during storage was made by the use of isotope-labelled internal standards (ILIS), available for all compounds, which were added to the samples as surrogates. Although ILIS were also efficient for matrix effects correction, the strong ionization suppression observed was not eliminated; therefore, a four-fold dilution prior to SPE was applied to influent wastewaters and a low injection volume was selected (3 μL), in order to reach a compromise between matrix effects, chromatographic performance and sensitivity. The method was validated at 25 and 200 ng L⁻¹ (effluent), and 100 and 800 ng L⁻¹ (influent), obtaining limits of quantification (i.e. the lowest level at which the compound can be quantified and also confirmed with at least two MS/MS transitions) between 0.4-25 ng L⁻¹ (effluent) and 2-100 ng L⁻¹ (influent). The applicability of the method was demonstrated by the analysis of 14 influent and 14 effluent wastewater samples collected over 2 weeks in Castellón (Spain) within a European collaborative study.

  10. The Value of Improved Measurements in a Pig Slaughterhouse

    DEFF Research Database (Denmark)

    Kjærsgaard, Niels Christian

    Danish pig slaughterhouses operate in increasingly competitive international markets. Therefore it is more important than ever to optimize all aspects of Danish pig production, slaughtering processes and delivery. This paper concerns optimization at the slaughterhouses regarding estimation of the value of improved measurements. The slaughterhouse industry differs from many others in that each carcass is measured and sorted into groups consisting of pigs with almost the same characteristics, thereby reducing the variation within the individual sorting groups substantially. The accuracy of the measurements is the most important limiting factor for how much the variation within each sorting group can actually be reduced, and substantial investments are expected to improve the quality of the measurements further. This paper concerns the use of Operations Research to solve a practical problem of major importance for the industry, namely improving the estimation of the economic effects of improved measurements.

  11. Knowing the SCOR: using business metrics to gain measurable improvements.

    Science.gov (United States)

    Malin, Jane H

    2006-07-01

    By using the Supply Chain Operations Reference model, one New York hospital was able to define and measure its supply chains, determine the weak links in its processes, and identify necessary improvements.

  12. Measuring Team Shared Understanding Using the Analysis-Constructed Shared Mental Model Methodology

    Science.gov (United States)

    Johnson, Tristan E.; O'Connor, Debra L.

    2008-01-01

    Teams are an essential part of successful performance in learning and work environments. Analysis-constructed shared mental model (ACSMM) methodology is a set of techniques where individual mental models are elicited and sharedness is determined not by the individuals who provided their mental models but by an analytical procedure. This method…

  13. Behavioral Observation Scales for Measuring Children's Distress: The Effects of Increased Methodological Rigor.

    Science.gov (United States)

    Jay, Susan M.; Elliott, Charles

    1984-01-01

    Evaluated the effects of increased methodological rigor on the validity of the Observation Scale of Behavioral Distress and on findings concerning whether children habituate to painful procedures. Data were scored with and without refinements. Results indicated that children do habituate but that refinements had little effect on validity. (BH)

  14. The Measure of a Nation: The USDA and the Rise of Survey Methodology

    Science.gov (United States)

    Mahoney, Kevin T.; Baker, David B.

    2007-01-01

    Survey research has played a major role in American social science. An outgrowth of efforts by the United States Department of Agriculture in the 1930s, the Division of Program Surveys (DPS) played an important role in the development of survey methodology. The DPS was headed by the ambitious and entrepreneurial Rensis Likert, populated by young…

  15. Improved Temperature Sounding and Quality Control Methodology Using AIRS/AMSU Data: The AIRS Science Team Version 5 Retrieval Algorithm

    Science.gov (United States)

    Susskind, Joel; Blaisdell, John M.; Iredell, Lena; Keita, Fricky

    2009-01-01

    This paper describes the AIRS Science Team Version 5 retrieval algorithm in terms of its three most significant improvements over the methodology used in the AIRS Science Team Version 4 retrieval algorithm. Improved physics in Version 5 allows for use of AIRS clear column radiances in the entire 4.3 micron CO2 absorption band in the retrieval of temperature profiles T(p) during both day and night. Tropospheric sounding 15 micron CO2 observations are now used primarily in the generation of clear column radiances Rᵢ for all channels. This new approach allows for the generation of more accurate values of Rᵢ and T(p) under most cloud conditions. Secondly, Version 5 contains a new methodology to provide accurate case-by-case error estimates for retrieved geophysical parameters and for channel-by-channel clear column radiances; thresholds of these error estimates are used in a new approach for Quality Control. Finally, Version 5 also contains, for the first time, an approach to provide AIRS soundings in partially cloudy conditions that does not require use of any microwave data. This new AIRS-only sounding methodology, referred to as AIRS Version 5 AO, was developed as a backup to AIRS Version 5 should the AMSU-A instrument fail. Results are shown comparing the relative performance of AIRS Version 4, Version 5, and Version 5 AO for a single day, January 25, 2003. The Goddard DISC is now generating and distributing products derived using the AIRS Science Team Version 5 retrieval algorithm. This paper also describes the Quality Control flags contained in the DISC AIRS/AMSU retrieval products and their intended use for scientific research purposes.

  16. A Simple and Efficient Methodology To Improve Geometric Accuracy in Gamma Knife Radiation Surgery: Implementation in Multiple Brain Metastases

    Energy Technology Data Exchange (ETDEWEB)

    Karaiskos, Pantelis, E-mail: pkaraisk@med.uoa.gr [Medical Physics Laboratory, Medical School, University of Athens (Greece); Gamma Knife Department, Hygeia Hospital, Athens (Greece); Moutsatsos, Argyris; Pappas, Eleftherios; Georgiou, Evangelos [Medical Physics Laboratory, Medical School, University of Athens (Greece); Roussakis, Arkadios [CT and MRI Department, Hygeia Hospital, Athens (Greece); Torrens, Michael [Gamma Knife Department, Hygeia Hospital, Athens (Greece); Seimenis, Ioannis [Medical Physics Laboratory, Medical School, Democritus University of Thrace, Alexandroupolis (Greece)

    2014-12-01

    Purpose: To propose, verify, and implement a simple and efficient methodology for the improvement of total geometric accuracy in multiple brain metastases gamma knife (GK) radiation surgery. Methods and Materials: The proposed methodology exploits the directional dependence of magnetic resonance imaging (MRI)-related spatial distortions stemming from background field inhomogeneities, also known as sequence-dependent distortions, with respect to the read-gradient polarity during MRI acquisition. First, an extra MRI pulse sequence is acquired with the same imaging parameters as those used for routine patient imaging, aside from a reversal in the read-gradient polarity. Then, “average” image data are compounded from data acquired from the 2 MRI sequences and are used for treatment planning purposes. The method was applied and verified in a polymer gel phantom irradiated with multiple shots in an extended region of the GK stereotactic space. Its clinical impact in dose delivery accuracy was assessed in 15 patients with a total of 96 relatively small (<2 cm) metastases treated with GK radiation surgery. Results: Phantom study results showed that use of average MR images eliminates the effect of sequence-dependent distortions, leading to a total spatial uncertainty of less than 0.3 mm, attributed mainly to gradient nonlinearities. In brain metastases patients, non-eliminated sequence-dependent distortions lead to target localization uncertainties of up to 1.3 mm (mean: 0.51 ± 0.37 mm) with respect to the corresponding target locations in the “average” MRI series. Due to these uncertainties, a considerable underdosage (5%-32% of the prescription dose) was found in 33% of the studied targets. Conclusions: The proposed methodology is simple and straightforward in its implementation. Regarding multiple brain metastases applications, the suggested approach may substantially improve total GK dose delivery accuracy in smaller, outlying targets.
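
    The compounding step is conceptually simple: since sequence-dependent distortions displace structures in opposite directions under opposite read-gradient polarities, a voxel-wise average cancels them to first order. A minimal sketch, assuming the two series are already co-registered in stereotactic space:

        import numpy as np

        def average_series(img_forward, img_reverse):
            # Voxel-wise mean of two MRI series acquired with opposite
            # read-gradient polarities; distortion shifts of opposite sign
            # cancel to first order in the averaged volume.
            return 0.5 * (img_forward.astype(float) + img_reverse.astype(float))

        # Hypothetical co-registered volumes (slices x rows x columns).
        fwd = np.random.rand(64, 256, 256)
        rev = np.random.rand(64, 256, 256)
        avg = average_series(fwd, rev)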

  17. Assessment of potential improvements on regional air quality modelling related with implementation of a detailed methodology for traffic emission estimation.

    Science.gov (United States)

    Coelho, Margarida C; Fontes, Tânia; Bandeira, Jorge M; Pereira, Sérgio R; Tchepel, Oxana; Dias, Daniela; Sá, Elisa; Amorim, Jorge H; Borrego, Carlos

    2014-02-01

    The accuracy and precision of air quality models are usually associated with the emission inventories. Thus, in order to assess whether there are any improvements in regional air quality simulations when using a detailed methodology for road traffic emission estimation, a regional air quality modelling system was applied. For this purpose, a combination of top-down and bottom-up approaches was used to build an emission inventory. To estimate the road traffic emissions, the bottom-up approach was applied using an instantaneous emission model (the Vehicle Specific Power, VSP, methodology) and an average emission model (the CORINAIR methodology), while for the remaining activity sectors the top-down approach was used. The Weather Research and Forecasting (WRF) and Comprehensive Air quality (CAMx) models were selected to assess two emission scenarios: (i) scenario 1, which includes the emissions from the top-down approach; and (ii) scenario 2, which includes the emissions resulting from the integration of top-down and bottom-up approaches. The results show higher emission values for PM10, NOx and HC for scenario 1, and the inverse behaviour for CO. The highest differences between the scenarios were observed for PM10 and HC, about 55% and 75% higher (respectively for each pollutant) than the emissions provided by scenario 2. Scenario 2 gives better results for PM10, CO and O3, while for NO2 concentrations better results were obtained with scenario 1. Thus, the results suggest that with the combination of top-down and bottom-up approaches to emission estimation, several improvements in the air quality results can be achieved, mainly for PM10, CO and O3.
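
    For orientation, instantaneous models of the VSP family evaluate a specific-power expression second by second and map it to binned emission rates. The sketch below uses the widely cited light-duty coefficient set from the VSP literature; treat the coefficients and the sample point as illustrative, not as the exact parameterisation used in this study.

        def vsp_kw_per_ton(v, a, grade=0.0):
            # Vehicle Specific Power with commonly used light-duty
            # coefficients; v in m/s, a in m/s^2, grade = sin(road angle).
            return v * (1.1 * a + 9.81 * grade + 0.132) + 0.000302 * v ** 3

        # Hypothetical second-by-second sample: 15 m/s, mild acceleration.
        vsp = vsp_kw_per_ton(15.0, 0.5)
        # An instantaneous model then bins each VSP value into a mode with
        # an emission rate per pollutant and sums over the driving cycle.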

  18. Mapping of the geogenic radon potential in France to improve radon risk management: methodology and first application to region Bourgogne

    Energy Technology Data Exchange (ETDEWEB)

    Ielsch, G., E-mail: geraldine.ielsch@irsn.f [Institut de Radioprotection et de Surete Nucleaire, DEI/SARG/BRN, BP 17, 92262 Fontenay-aux-Roses cedex (France); Cushing, M.E., E-mail: edward.cushing@irsn.f [Institut de Radioprotection et de Surete Nucleaire, DEI/SARG/BRN, BP 17, 92262 Fontenay-aux-Roses cedex (France); Combes, Ph., E-mail: philippe.combes@geoter.f [GEOTER SAS, Geologie Tectonique Environnement et Risques, 3, rue Jean Monnet, 34830 Clapiers (France); Cuney, M., E-mail: michel.cuney@g2r.uhp-nancy.f [CREGU et UMR G2R 7566, Universite Henri Poincare - NANCY I, Domaine Scientifique Victor Grignard, Entree 3B, BP 70 239 - F54 506 Vandoeuvre-les-Nancy Cedex (France)

    2010-10-15

    In order to improve regulatory tools for radon risk management in France, a harmonised methodology to derive a single map of the geogenic radon potential has been developed. This approach consists of determining the capacity of the geological units to produce radon and to facilitate its transfer to the atmosphere, based on the interpretation of existing geological data. The approach is firstly based on a classification of the geological units according to their uranium (U) content, to create a radon source potential map. This initial map is then improved by taking into account the main additional parameters, such as fault lines, which control the preferential pathways of radon through the ground and which can increase radon levels in soils. The implementation of this methodology across the whole French territory is currently in progress. We present here the results obtained in one region (Bourgogne, Massif Central), which displays significant variations of the geogenic radon potential. The map obtained leads to a more precise zoning than the existing map of radon priority areas, currently based solely on administrative boundaries.

  19. Methodologies to improve product life cycle decision making in the telecommunications industry

    OpenAIRE

    Mead, Carl Dennis

    2003-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. As pressure from regulation and customers increases on telecommunications equipment manufacturers and service providers to reduce the hazardous material content of telecommunications products and generally improve environmental performance, new methods for Product Life Cycle Management are required. Supplier and component environmental evaluation are vital and fundamental elements of any Product Life Cycle Management approach…

  1. Related Measures on Improving the Teaching Quality of DGED Course

    Institute of Scientific and Technical Information of China (English)

    JIN Yi; SHAN Hong-bo; WANG Xiao-hong; YU Hai-yan; GE Bin

    2013-01-01

    The Engineering Drawing course is one of the main teaching contents at most science and engineering colleges and universities. This paper discusses feasible measures for improving the teaching quality of the Engineering Drawing course from four aspects: diversified teacher participation and coordination of the teaching process; optimizing the teaching content to improve teaching quality; improving teaching effect and reforming teaching methods; and integrating practice to cultivate practical ability.

  2. Preparation methodology and possible treatments for improved ceramics for high voltage vacuum applications

    CERN Document Server

    Tan, J

    1998-01-01

    The flashover characteristics of an insulator-bridged high voltage vacuum gap can play an important role in the overall performance of a high voltage device, for example in the extreme environments of high energy particle accelerators. The detailed preparation of the insulators is, at present, governed by commercial production methods and by standard bulk cleaning processes, which for a particular application may be far from optimum. The influence of the mechanical preparation, thermal history and particular cleaning technique has been investigated for commercially available alumina samples, with measurement of surface characteristics by scanning electron microscopy and laser diffraction, measurement of the secondary electron emission curve, and analysis of the high voltage performance with the possibility of applied fields up to 200 kV/cm. The results of the different measurements are discussed in the overall context of the problems encountered in full-sized high voltage devices, and suggestions are made…

  3. Fundamental and methodological investigations for the improvement of elemental analysis by inductively coupled plasma mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, Christopher Hysjulien [Ames Lab., Ames, IA (United States)

    2012-01-01

    This dissertation describes a variety of studies meant to improve the analytical performance of inductively coupled plasma mass spectrometry (ICP-MS) and laser ablation (LA) ICP-MS. The emission behavior of individual droplets and LA generated particles in an ICP is studied using a high-speed, high frame rate digital camera. Phenomena are observed during the ablation of silicate glass that would cause elemental fractionation during analysis by ICP-MS. Preliminary work for ICP torch developments specifically tailored for the improvement of LA sample introduction is presented. An abnormal scarcity of metal-argon polyatomic ions (MAr{sup +}) is observed during ICP-MS analysis. Evidence shows that MAr{sup +} ions are dissociated by collisions with background gas in a shockwave near the tip of the skimmer cone. Method development towards the improvement of LA-ICP-MS for environmental monitoring is described. A method is developed to trap small particles in a collodion matrix and analyze each particle individually by LA-ICP-MS.

  4. Hydrogen isotope measurement of bird feather keratin, one laboratory's response to evolving methodologies.

    Science.gov (United States)

    Fan, Majie; Dettman, David L

    2015-01-01

    Hydrogen in organic tissue resides in a complex mixture of molecular contexts. Some hydrogen, called non-exchangeable (H(non)), is strongly bound, and its isotopic ratio is fixed when the tissue is synthesized. Other pools of hydrogen, called exchangeable hydrogen (H(ex)), constantly exchange with ambient water vapor. The measurement of the δ(2)H(non) in organic tissues such as hair or feather therefore requires an analytical process that accounts for exchangeable hydrogen. In this study, swan feather and sheep wool keratin were used to test the effects of sample drying and capsule closure on the measurement of δ(2)H(non) values, and the rate of back-reaction with ambient water vapor. Homogeneous feather or wool keratins were also calibrated at room temperature for use as control standards to correct for the effects of exchangeable hydrogen on feathers. Total δ(2)H values of both feather and wool samples showed large changes throughout the first ∼6 h of drying. Desiccant plus low vacuum seems to be more effective than room temperature vacuum pumping for drying samples. The degree of capsule closure affects exchangeable hydrogen equilibration and drying, with closed capsules responding more slowly. Using one control keratin standard to correct for the δ(2)H(ex) value for a batch of samples leads to internally consistent δ(2)H(non) values for other calibrated keratins run as unknowns. When placed in the context of other recent improvements in the measurement of keratin δ(2)H(non) values, we make recommendations for sample handling, data calibration and the reporting of results.
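
    The role of the control standard can be illustrated with a simple mass balance: δ(2)H(total) = f_ex · δ(2)H(ex) + (1 − f_ex) · δ(2)H(non). The sketch below assumes a fixed exchangeable fraction shared by the standard and the samples; the fraction and all delta values are placeholders, not calibration data.

    ```python
    # Mass-balance correction for exchangeable hydrogen, as implied above,
    # assuming a fixed exchangeable fraction F_EX shared by the control
    # standard and the samples. All numeric values are placeholders.

    F_EX = 0.15  # assumed exchangeable-hydrogen fraction of keratin

    def d2h_ex_from_standard(d2h_total_std: float, d2h_non_std: float) -> float:
        """Back out the exchangeable-pool delta from a calibrated keratin."""
        return (d2h_total_std - (1 - F_EX) * d2h_non_std) / F_EX

    def d2h_non(d2h_total_sample: float, d2h_ex: float) -> float:
        """Correct a sample's total delta to its non-exchangeable value."""
        return (d2h_total_sample - F_EX * d2h_ex) / (1 - F_EX)

    ex_pool = d2h_ex_from_standard(d2h_total_std=-95.0, d2h_non_std=-110.0)
    print(round(d2h_non(-80.0, ex_pool), 1))  # corrected value, per mil
    ```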

  5. Antimicrobial use metrics and benchmarking to improve stewardship outcomes: methodology, opportunities, and challenges.

    Science.gov (United States)

    Ibrahim, Omar M; Polk, Ron E

    2014-06-01

    Measurement of antimicrobial use before and after an intervention and the associated outcomes are key activities of antimicrobial stewardship programs. In the United States, the recommended metric for aggregate antibiotic use is days of therapy/1000 patient-days. Clinical outcomes, including response to therapy and bacterial resistance, are critical measures but are more difficult to document than economic outcomes. Interhospital benchmarking of risk adjusted antimicrobial use is possible, although several obstacles remain before it can have an impact on patient care. Many challenges for stewardship programs remain, but the methods and science to support their efforts are rapidly evolving.
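
    The recommended aggregate metric reduces to a one-line computation; the sketch below uses invented counts.

    ```python
    # Days of therapy (DOT) per 1000 patient-days, the aggregate metric
    # named above. Each agent administered on a given day counts as one
    # DOT. The input figures are invented for illustration.

    def dot_per_1000_patient_days(days_of_therapy: int, patient_days: int) -> float:
        return 1000 * days_of_therapy / patient_days

    # Example: 4350 DOT accumulated over 10000 patient-days.
    print(dot_per_1000_patient_days(4350, 10000))  # -> 435.0
    ```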

  6. ON THE METHODOLOGY FOR IMPROVEMENT OF ACTIVE VIBRO-PROTECTION USING FUNCTIONAL DIAGNOSTICS

    Directory of Open Access Journals (Sweden)

    T. N. Mikulik

    2014-01-01

    Full Text Available The paper investigates vibro-protection conditions for the "operator-chair" system of a transport facility ("Belarus" tractor family), with experimental research on the vibro-loading of the system that takes into account elastic shock-absorbing characteristics and operator comfort. The work made it possible to determine a range of system vibration frequencies that is poorly tolerated by the operator, because it lies in the zone of the natural vibration frequencies of human visceral organs. The influence of the operator's physiological factors (heart rate, variational range, mode amplitude, stress index) has been investigated on the basis of a factorial experiment, and correlation dependences have been obtained. The developed methodology for investigating the algorithmic provision for better active vibro-protection of the "operator-chair" system presupposes the availability of a mathematical model used for the synthesis of control laws and the selection of algorithms for forming signals on the operator's physiological state. A structural algorithm scheme for vibro-protection of the "driver-seat-road" system has been drawn up. Harmonic sinusoidal and poly-harmonic disturbances from the power unit, and discrete algorithms based on filtration of white noise with a linear filter and a prescribed correlation function, have been accepted as the mathematical model of external environment disturbances. In the case of harmonic excitation of the "operator-chair" system, the force transferred to the system by a shock-absorber is estimated, and shock-absorber efficiency is evaluated in the form of a force transmission coefficient and a vibration insulation value expressed in decibels. A Fourier series describes the motion of the system in the case of vibration forces initiated by the operation of the power unit. A piecewise-linear function describes the reaction to impact excitation of the system when a finite change in speed and motion

  7. Improvement of antibiotic activity of Xenorhabdus bovienii by medium optimization using response surface methodology

    Science.gov (United States)

    2011-01-01

    Background The production of secondary metabolites with antibiotic properties is a common characteristic of the entomopathogenic bacteria Xenorhabdus spp. These metabolites not only have diverse chemical structures but also a wide range of bioactivities of medicinal and agricultural interest, such as antibiotic, antimycotic, insecticidal, nematicidal, antiulcer, antineoplastic and antiviral. It has been known that cultivation parameters are critical to the secondary metabolites produced by microorganisms. Even small changes in the culture medium may impact not only the quantity of certain compounds but also the general metabolic profile of microorganisms. Manipulating nutritional or environmental factors can promote the biosynthesis of secondary metabolites and thus facilitate the discovery of new natural products. This work was conducted to evaluate the influence of nutrition on the antibiotic production of X. bovienii YL002 and to optimize the medium to maximize its antibiotic production. Results Nutrition has a high influence on the antibiotic production of X. bovienii YL002. Glycerol and soytone were identified as the best carbon and nitrogen sources that significantly affected the antibiotic production using a one-factor-at-a-time approach. Response surface methodology (RSM) was applied to optimize the medium constituents (glycerol, soytone and minerals) for the antibiotic production of X. bovienii YL002. Higher antibiotic activity (337.5 U/mL) was obtained after optimization. The optimal levels of medium components were (g/L): glycerol 6.90, soytone 25.17, MgSO4·7H2O 1.57, (NH4)2SO4 2.55, KH2PO4 0.87, K2HPO4 1.11 and Na2SO4 1.81. An overall increase of 37.8% in the antibiotic activity of X. bovienii YL002 was obtained compared with that of the original medium. Conclusions To the best of our knowledge, there are no reports on antibiotic production of X. bovienii by medium optimization using RSM. The results strongly support the use of RSM for medium
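
    As a rough illustration of the response-surface step, the sketch below fits a second-order model to synthetic data for two coded factors and locates the stationary point; the study's actual seven-component design is not reproduced here.

    ```python
    import numpy as np

    # Sketch of the RSM step described above: fit a second-order model
    # y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 to coded
    # factor levels and read off the stationary point. The data are
    # synthetic placeholders, not the X. bovienii measurements.

    x1, x2 = np.meshgrid([-1, 0, 1], [-1, 0, 1])
    x1, x2 = x1.ravel(), x2.ravel()
    noise = np.random.default_rng(0).normal(0, 1, 9)
    y = 300 - 20 * (x1 - 0.3) ** 2 - 15 * (x2 + 0.2) ** 2 + noise

    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    b = np.linalg.lstsq(X, y, rcond=None)[0]

    # Stationary point: set both partial derivatives of the fitted
    # surface to zero and solve the resulting 2x2 linear system.
    A = np.array([[2 * b[3], b[5]],
                  [b[5], 2 * b[4]]])
    opt = np.linalg.solve(A, -b[1:3])
    print("optimum (coded units):", opt.round(2))  # approx. [0.3, -0.2]
    ```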

  8. A methodology for improving road safety by novel infrastructural and invehicle technology combinations

    NARCIS (Netherlands)

    Wiethoff, M.; Brookhuis, K.A.; De Waard, D.; Marchau, V.A.W.J.; Walta, L.; Wenzel, G.; De Brucker, K.; Macharis, C.

    2012-01-01

    Introduction Still too many deaths and injuries result from road safety limitations within Europe. Road safety measures aimed at changing the road environment, to reduce the risk of driver errors and to reduce the seriousness of the effects of driver errors, are expected to increase road safety. A

  9. Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems

    Science.gov (United States)

    2015-12-01

    Indexed excerpt (front-matter acronym list and a text fragment): M&S, modeling and simulation; MDP, model development process; MEU, Marine expeditionary unit; MODA, multi-objective decision analysis; MOE, measure of... SMEs are still heavily involved in a MODA and have a method of tracing their values to the model.

  10. Projecting future expansion of invasive species: comparing and improving methodologies for species distribution modeling.

    Science.gov (United States)

    Mainali, Kumar P; Warren, Dan L; Dhileepan, Kunjithapatham; McConnachie, Andrew; Strathie, Lorraine; Hassan, Gul; Karki, Debendra; Shrestha, Bharat B; Parmesan, Camille

    2015-12-01

    Modeling the distributions of species, especially of invasive species in non-native ranges, involves multiple challenges. Here, we developed some novel approaches to species distribution modeling aimed at reducing the influences of such challenges and improving the realism of projections. We estimated species-environment relationships for Parthenium hysterophorus L. (Asteraceae) with four modeling methods run with multiple scenarios of (i) sources of occurrences and geographically isolated background ranges for absences, (ii) approaches to drawing background (absence) points, and (iii) alternate sets of predictor variables. We further tested various quantitative metrics of model evaluation against biological insight. Model projections were very sensitive to the choice of training dataset. Model accuracy was much improved using a global dataset for model training, rather than restricting data input to the species' native range. AUC score was a poor metric for model evaluation and, if used alone, was not a useful criterion for assessing model performance. Projections away from the sampled space (i.e., into areas of potential future invasion) were very different depending on the modeling methods used, raising questions about the reliability of ensemble projections. Generalized linear models gave very unrealistic projections far away from the training region. Models that efficiently fit the dominant pattern, but exclude highly local patterns in the dataset and capture interactions as they appear in data (e.g., boosted regression trees), improved generalization of the models. Biological knowledge of the species and its distribution was important in refining choices about the best set of projections. A post hoc test conducted on a new Parthenium dataset from Nepal validated excellent predictive performance of our 'best' model. We showed that vast stretches of currently uninvaded geographic areas on multiple continents harbor highly suitable habitats for parthenium
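
    Because the abstract cautions that AUC alone is a poor criterion, the sketch below (assuming scikit-learn and toy presence/background labels) shows the computation that would be reported alongside biological plausibility checks rather than instead of them.

    ```python
    # Minimal sketch of the AUC evaluation the abstract warns against
    # relying on alone, assuming scikit-learn. The labels and scores are
    # toy presence/background data, not the Parthenium occurrences.
    from sklearn.metrics import roc_auc_score

    y_true = [1, 1, 1, 0, 0, 0, 0, 1]                    # presence / background
    y_score = [0.9, 0.7, 0.4, 0.3, 0.2, 0.6, 0.1, 0.8]   # modeled suitability
    print(round(roc_auc_score(y_true, y_score), 3))
    # A high AUC here says nothing about transferability outside the
    # training region, hence the study's biological plausibility checks.
    ```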

  11. Using statistical process control methodology to improve the safe operating envelope

    Energy Technology Data Exchange (ETDEWEB)

    Reeves, A.D.; Lunney, B.P.; McIntyre, C.M. [Atlantic Nuclear Services Ltd. (ANSL), Fredericton, New Brunswick (Canada); Prime, D.R. [New Brunswick Power Nuclear (NBPN), Lepreau, New Brunswick (Canada)

    2009-07-01

    Failure limits used to assess impairments from Operating Manual Tests (OMT) are often established using licensing limits from safety analysis. While these determine that licensing conditions are not violated, they do not provide pro-active indications of problems developing with system components. This paper discusses statistical process control (SPC) methods to define action limits useful in diagnosing system component problems prior to reaching impairment limits. Using data from a specific OMT, an example of one such application is provided. Application of SPC limits can provide an improvement to station operating economics through early detection of abnormal equipment behaviour. (author)
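
    A minimal sketch of Shewhart-style action limits of the kind the paper derives from OMT data; the baseline readings below are made up.

    ```python
    import statistics

    # Sketch of statistical process control action limits (mean +/- 3
    # sigma) of the kind described above. The OMT readings are invented.

    def control_limits(baseline):
        mu = statistics.mean(baseline)
        sigma = statistics.stdev(baseline)
        return mu - 3 * sigma, mu + 3 * sigma

    baseline = [4.1, 3.9, 4.0, 4.2, 3.8, 4.0, 4.1, 3.9]
    lcl, ucl = control_limits(baseline)
    flagged = [x for x in [4.0, 4.6, 3.2] if not lcl <= x <= ucl]
    print(f"LCL={lcl:.2f}, UCL={ucl:.2f}, out-of-control points: {flagged}")
    ```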

  12. An Improved Cambridge Filter Pad Extraction Methodology to Obtain More Accurate Water and “Tar” Values: In Situ Cambridge Filter Pad Extraction Methodology

    Directory of Open Access Journals (Sweden)

    Ghosh David

    2014-07-01

    Full Text Available Previous investigations by others and internal investigations at Philip Morris International (PMI have shown that the standard trapping and extraction procedure used for conventional cigarettes, defined in the International Standard ISO 4387 (Cigarettes -- Determination of total and nicotine-free dry particulate matter using a routine analytical smoking machine, is not suitable for high-water content aerosols. Errors occur because of water losses during the opening of the Cambridge filter pad holder to remove the filter pad as well as during the manual handling of the filter pad, and because the commercially available filter pad holder, which is constructed out of plastic, may adsorb water. This results in inaccurate values for the water content, and erroneous and overestimated values for Nicotine Free Dry Particulate Matter (NFDPM. A modified 44 mm Cambridge filter pad holder and extraction equipment which supports in situ extraction methodology has been developed and tested. The principle of the in situ extraction methodology is to avoid any of the above mentioned water losses by extracting the loaded filter pad while kept in the Cambridge filter pad holder which is hermetically sealed by two caps. This is achieved by flushing the extraction solvent numerous times through the hermetically sealed Cambridge filter pad holder by means of an in situ extractor. The in situ methodology showed a significantly more complete water recovery, resulting in more accurate NFDPM values for high-water content aerosols compared to the standard ISO methodology. The work presented in this publication demonstrates that the in situ extraction methodology applies to a wider range of smoking products and smoking regimens, whereas the standard ISO methodology only applies to a limited range of smoking products and smoking regimens, e.g., conventional cigarettes smoked under ISO smoking regimen. In cases where a comparison of yields between the PMI HTP and

  13. Improving students’ understanding of quantum measurement. I. Investigation of difficulties

    Directory of Open Access Journals (Sweden)

    Guangtian Zhu

    2012-04-01

    Full Text Available We describe the difficulties that advanced undergraduate and graduate students have with quantum measurement within the standard interpretation of quantum mechanics. We explore the possible origins of these difficulties by analyzing student responses to questions from both surveys and interviews. Results from this research are applied to develop research-based learning tutorials to improve students’ understanding of quantum measurement.

  14. Reliability improvements on Thales RM2 rotary Stirling coolers: analysis and methodology

    Science.gov (United States)

    Cauquil, J. M.; Seguineau, C.; Martin, J.-Y.; Benschop, T.

    2016-05-01

    The cooled IR detectors are used in a wide range of applications. Most of the time, the cryocooler is one of the components that determines the lifetime of the system, so cooler reliability is one of its most important parameters and has to increase to meet market needs. To do this, data for identifying the weakest element determining cooler reliability have to be collected. Yet data collected in the field are hardly usable due to a lack of information. A method for identifying reliability improvements therefore had to be set up that can be used even without field returns. This paper describes the method followed by Thales Cryogénie SAS to reach such a result. First, a database was built from extensive expertise on RM2 failures occurring in accelerated ageing. Failure modes were then identified and corrective actions carried out. Besides this, the functions of the cooler were organized hierarchically with regard to their potential to increase its efficiency, and specific changes were introduced in the functions most likely to impact efficiency. The link between efficiency and reliability is described in this paper. The work on the two axes (weak spots for cooler reliability, and efficiency) permitted a drastic increase in the MTTF of the RM2 cooler. Large improvements in RM2 reliability are proven by both field returns and reliability monitoring. These figures are discussed in the paper.

  15. Improvement of personalized Monte Carlo-aided direct internal contamination monitoring: optimization of calculation times and measurement methodology for the establishment of activity distribution; Amelioration des mesures anthroporadiametriques personnalisees assistees par calcul Monte Carlo: optimisation des temps de calculs et methodologie de mesure pour l'etablissement de la repartition d'activite

    Energy Technology Data Exchange (ETDEWEB)

    Farah, Jad

    2011-10-06

    To optimize the monitoring of female workers using in vivo spectrometry measurements, it is necessary to correct the typical calibration coefficients obtained with the Livermore male physical phantom. To do so, numerical calibrations based on Monte Carlo simulations combined with anthropomorphic 3D phantoms were used. Such computational calibrations require, on the one hand, the development of representative female phantoms of different sizes and morphologies and, on the other hand, rapid and reliable Monte Carlo calculations. A library of female torso models was hence developed by fitting the weight of internal organs and breasts according to body height and to relevant plastic surgery recommendations. This library was next used to perform a numerical calibration of the AREVA NC La Hague in vivo counting installation. Moreover, the morphology-induced variations of counting efficiency with energy were expressed as equations, and recommendations were given to correct the typical calibration coefficients for any monitored female worker as a function of body height and breast size. Meanwhile, variance reduction techniques and geometry simplification operations were considered to accelerate the simulations. Furthermore, to determine the activity mapping in the case of complex contaminations, a method that combines Monte Carlo simulations with in vivo measurements was developed. This method consists of performing several spectrometry measurements with different detector positions. Next, the contribution of each contaminated organ to the count is assessed from Monte Carlo calculations. The in vivo measurements performed at LEDI, CIEMAT and KIT have demonstrated the effectiveness of the method and highlighted the valuable contribution of Monte Carlo simulations for a more detailed analysis of spectrometry measurements. Thus, a more precise estimate of the activity distribution is given in the case of an internal contamination. (author)
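
    The combination of measurements at several detector positions with Monte Carlo efficiencies reduces to a linear unfolding problem. A minimal sketch, assuming SciPy and illustrative efficiency and count values rather than any real calibration data:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Sketch of the unfolding step described above: counts observed at
    # several detector positions are modeled as a linear combination of
    # per-organ contributions (counting efficiencies from Monte Carlo),
    # and organ activities are recovered by non-negative least squares.
    # All numbers are illustrative, not real calibration data.

    # Rows: detector positions; columns: organs (e.g. lungs, liver).
    efficiency = np.array([[0.012, 0.003],
                           [0.004, 0.010],
                           [0.008, 0.006]])
    counts = np.array([14.0, 11.5, 13.0])  # net count rates per position

    activity, residual = nnls(efficiency, counts)
    print("estimated activities (Bq):", activity.round(1))
    ```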

  16. Gust factor based on research aircraft measurements: A new methodology applied to the Arctic marine boundary layer

    DEFF Research Database (Denmark)

    Suomi, Irene; Lüpkes, Christof; Hartmann, Jörg

    2016-01-01

    There is as yet no standard methodology for measuring wind gusts from a moving platform. To address this, we have developed a method to derive gusts from research aircraft data. First we evaluated four different approaches, including Taylor's hypothesis of frozen turbulence, to derive the gust...... in unstable conditions (R2=0.52). The mean errors for all methods were low, from -0.02 to 0.05, indicating that wind gust factors can indeed be measured from research aircraft. Moreover, we showed that aircraft can provide gust measurements within the whole boundary layer, if horizontal legs are flown...

  17. Improved and Reproducible Flow Cytometry Methodology for Nuclei Isolation from Single Root Meristem

    Directory of Open Access Journals (Sweden)

    Thaís Cristina Ribeiro Silva

    2010-01-01

    Full Text Available Root meristems have increasingly been the target of cell cycle studies by flow cytometric DNA content quantification. Moreover, roots can be an alternative source of nuclear suspensions when leaves become unfeasible, and for chromosome analysis and sorting. In the present paper, a protocol for intact nuclei isolation from a single root meristem was developed. This procedure was based on excision of the meristematic region using a prototypical slide, followed by short enzymatic digestion and mechanical isolation of nuclei during homogenization with a hand mixer. These parameters were optimized to achieve better results. Satisfactory amounts of nuclei were extracted and analyzed by flow cytometry, producing histograms with reduced background noise and CVs between 3.2 and 4.1%. This improved and reproducible technique was shown to be rapid, inexpensive, and simple for nuclear extraction from a single root tip, and can be adapted for other plants and purposes.

  18. Protocol for using mixed methods and process improvement methodologies to explore primary care receptionist work.

    Science.gov (United States)

    Litchfield, Ian; Gale, Nicola; Burrows, Michael; Greenfield, Sheila

    2016-11-16

    The need to cope with an increasingly ageing and multimorbid population has seen a shift towards preventive health and effective management of chronic disease. This places general practice at the forefront of health service provision with an increased demand that impacts on all members of the practice team. As these pressures grow, systems become more complex and tasks delegated across a broader range of staff groups. These include receptionists who play an essential role in the successful functioning of the surgery and are a major influence on patient satisfaction. However, they do so without formal recognition of the clinical implications of their work or with any requirements for training and qualifications. Our work consists of three phases. The first will survey receptionists using the validated Work Design Questionnaire to help us understand more precisely the parameters of their role; the second involves the use of iterative focus groups to help define the systems and processes within which they work. The third and final phase will produce recommendations to increase the efficiency and safety of the key practice processes involving receptionists and identify the areas and where receptionists require targeted support. In doing so, we aim to increase job satisfaction of receptionists, improve practice efficiency and produce better outcomes for patients. Our work will be disseminated using conferences, workshops, trade journals, electronic media and through a series of publications in the peer reviewed literature. At the very least, our work will serve to prompt discussion on the clinical role of receptionists and assess the advantages of using value streams in conjunction with related tools for process improvement. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  19. Performance analysis of improved methodology for incorporation of spatial/spectral variability in synthetic hyperspectral imagery

    Science.gov (United States)

    Scanlan, Neil W.; Schott, John R.; Brown, Scott D.

    2004-01-01

    Synthetic imagery has traditionally been used to support sensor design by enabling design engineers to pre-evaluate image products during the design and development stages. Increasingly, exploitation analysts are looking to synthetic imagery as a way to develop and test exploitation algorithms before image data are available from new sensors. Even when sensors are available, synthetic imagery can significantly aid in algorithm development by providing a wide range of "ground truthed" images with varying illumination, atmospheric, viewing and scene conditions. One limitation of synthetic data is that the background variability is often too bland. It does not exhibit the spatial and spectral variability present in real data. In this work, four fundamentally different texture modeling algorithms will first be implemented as necessary into the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model environment. Two of the models to be tested are variants of a statistical Z-Score selection model, while the remaining two involve a texture synthesis and a spectral end-member fractional abundance map approach, respectively. A detailed comparative performance analysis of each model will then be carried out on several texturally significant regions of the resultant synthetic hyperspectral imagery. The quantitative assessment of each model will utilize a set of three performance metrics that have been derived from spatial Gray Level Co-Occurrence Matrix (GLCM) analysis, hyperspectral Signal-to-Clutter Ratio (SCR) measures, and a new concept termed the Spectral Co-Occurrence Matrix (SCM) metric which permits the simultaneous measurement of spatial and spectral texture. Previous research efforts on the validation and performance analysis of texture characterization models have been largely qualitative in nature, based on visual inspections of synthetic textures in order to judge the degree of similarity to the original sample texture imagery. The quantitative
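
    One of the spatial metrics named above, a GLCM-derived statistic, can be illustrated with scikit-image; the 4x4 patch and the parameters are toy values, and older scikit-image releases spell the functions greycomatrix/greycoprops.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    # Sketch of a gray-level co-occurrence matrix (GLCM) and a derived
    # contrast statistic, one of the spatial metrics named above. The
    # patch is a toy 4x4 image, not DIRSIG output.

    patch = np.array([[0, 0, 1, 1],
                      [0, 0, 1, 1],
                      [0, 2, 2, 2],
                      [2, 2, 3, 3]], dtype=np.uint8)

    glcm = graycomatrix(patch, distances=[1], angles=[0], levels=4,
                        symmetric=True, normed=True)
    print("GLCM contrast:", graycoprops(glcm, "contrast")[0, 0])
    ```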

  20. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    Science.gov (United States)

    Jonny; Nasution, Januar

    2013-06-01

    Value stream mapping is a tool needed to let the business leader of XYZ Hospital see what is actually happening in the business process that has caused a longer lead time for self-produced medicines in its pharmacy unit, a problem that has triggered many complaints from patients. After deploying this tool, the team found that in processing the medicine the pharmacy unit has no storage and capsule packing tool, and this condition has caused much wasted time in the process. Therefore, the team proposed to the business leader to procure the required tools in order to shorten the process. This research resulted in a lead time shortened from 45 minutes to 30 minutes, as required by the government through the Indonesian health ministry, with the %VA (value-added activity) or Process Cycle Efficiency (PCE) increased from 66% to 68% (considered lean because it is above the required 30%). This result proves that the effectiveness of the process has been increased by the improvement.
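
    Process Cycle Efficiency is the ratio of value-added time to total lead time. The sketch below reproduces the reported before/after figures; the value-added minutes are back-calculated from the stated percentages and are therefore assumptions.

    ```python
    # Process Cycle Efficiency (PCE), the lean metric used above:
    # value-added time divided by total lead time. The lead times are the
    # reported ones; the value-added minutes are back-calculated.

    def pce(value_added_min: float, lead_time_min: float) -> float:
        return value_added_min / lead_time_min

    print(f"before: {pce(29.7, 45):.0%}")  # ~66% as reported
    print(f"after:  {pce(20.4, 30):.0%}")  # ~68% as reported
    ```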

  1. Methodological and theoretical improvements in the study of superstitious beliefs and behaviour.

    Science.gov (United States)

    Fluke, Scott M; Webster, Russell J; Saucier, Donald A

    2014-02-01

    Via four studies (N = 901), we developed an improved Belief in Superstition Scale (BSS) composed of three distinct components (belief in bad luck, belief in good luck, and the belief that luck can be changed), whose structure was supported through exploratory (Study 1) and confirmatory (Studies 2 and 3) factor analyses using divergent samples. We found that among theoretical predictors, higher 'chance' locus of control (i.e., the belief that chance/fate controls one's life) best predicted all three BSS subscales (Studies 2-3). In Study 3, we found that BSS subscale scores were reliable, but largely invariant across age and education with a non-general psychology sample. In Study 4, the BSS subscales best predicted participants' superstitious attitudes and behaviour in a new lottery drawing paradigm among other commonly used superstition scales. Taken together, our results indicate that the BSS is a valuable addition to the burgeoning research on superstitious attitudes and behaviour. © 2012 The British Psychological Society.

  2. Organisms for biofuel production: natural bioresources and methodologies for improving their biosynthetic potentials.

    Science.gov (United States)

    Hu, Guangrong; Ji, Shiqi; Yu, Yanchong; Wang, Shi'an; Zhou, Gongke; Li, Fuli

    2015-01-01

    In order to relieve the pressure of energy supply and environment contamination that humans are facing, there are now intensive worldwide efforts to explore natural bioresources for production of energy storage compounds, such as lipids, alcohols, hydrocarbons, and polysaccharides. Around the world, many plants have been evaluated and developed as feedstock for bioenergy production, among which several crops have successfully achieved industrialization. Microalgae are another group of photosynthetic autotroph of interest due to their superior growth rates, relatively high photosynthetic conversion efficiencies, and vast metabolic capabilities. Heterotrophic microorganisms, such as yeast and bacteria, can utilize carbohydrates from lignocellulosic biomass directly or after pretreatment and enzymatic hydrolysis to produce liquid biofuels such as ethanol and butanol. Although finding a suitable organism for biofuel production is not easy, many naturally occurring organisms with good traits have recently been obtained. This review mainly focuses on the new organism resources discovered in the last 5 years for production of transport fuels (biodiesel, gasoline, jet fuel, and alkanes) and hydrogen, and available methods to improve natural organisms as platforms for the production of biofuels.

  3. Use of PFMEA methodology as a competitive advantage for the analysis of improvements in an experimental procedure

    Directory of Open Access Journals (Sweden)

    Fernando Coelho

    2015-12-01

    Full Text Available The methodology of Failure Modes and Effects Analysis (FMEA), utilized by industries to investigate potential failures, contributes to ensuring the robustness of the project and the manufacturing process, even before production starts. Thus, there is a reduced likelihood of errors and a higher level of efficiency and effectiveness at high productivity. This occurs through the elimination or reduction of production problems. In this context, this study is based on the structured application of PFMEA (Process Failure Mode and Effects Analysis), associated with other quality tools, in a simulation of the assembly of an electro-pneumatic system. The study was performed at the Experimental Laboratory of the Botucatu Technology Faculty (FATEC), with the support of five undergraduate students from the Industrial Production Technology Course. The methodology applied contributed to the forecast of 24 potential failures and improvement opportunities and to the investigation of their causes, proving to be a standard that is applicable to any productive process, with a gain in efficiency and effectiveness. Therefore, the final strategy was to evaluate and minimize the potential failures, to reduce production costs and to increase the performance of the process.
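
    PFMEA commonly prioritizes failures by the Risk Priority Number, the product of severity, occurrence, and detection scores. The sketch below ranks invented entries; the study's actual scores are not given in the abstract.

    ```python
    # Sketch of the PFMEA prioritization step: each potential failure is
    # scored on severity, occurrence, and detection (1-10) and ranked by
    # the Risk Priority Number RPN = S * O * D. Entries are invented.

    failures = [
        ("loose fitting",      7, 4, 3),
        ("wrong tube routing", 8, 2, 5),
        ("leaking seal",       6, 5, 2),
    ]

    ranked = sorted(failures, key=lambda f: f[1] * f[2] * f[3], reverse=True)
    for name, s, o, d in ranked:
        print(f"{name:20s} RPN = {s * o * d}")
    ```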

  4. Nutrients interaction investigation to improve Monascus purpureus FTC5391 growth rate using Response Surface Methodology and Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Mohamad, R.

    2013-01-01

    Full Text Available Aims: Two vital factors, certain environmental conditions and nutrients as a source of energy, are required for the successful growth and reproduction of microorganisms. Manipulation of nutritional requirements is the simplest and most effective strategy to stimulate and enhance the activity of microorganisms. Methodology and Results: In this study, response surface methodology (RSM) and artificial neural networks (ANN) were employed to optimize the carbon and nitrogen sources in order to improve the growth rate of Monascus purpureus FTC5391, a new local isolate. The best models for optimization of growth rate were a multilayer full feed-forward incremental back propagation network and a modified response surface model using backward elimination. The optimum condition for cell mass production was: sucrose 2.5%, yeast extract 0.045%, casamino acid 0.275%, sodium nitrate 0.48%, potato starch 0.045%, dextrose 1%, potassium nitrate 0.57%. The experimental cell mass production using this optimal condition was 21 mg/plate/12 days, which was 2.2-fold higher than that under the standard condition (sucrose 5%, yeast extract 0.15%, casamino acid 0.25%, sodium nitrate 0.3%, potato starch 0.2%, dextrose 1%, potassium nitrate 0.3%). Conclusion, significance and impact of study: The results of RSM and ANN showed that all carbon and nitrogen sources tested had a significant effect on growth rate (P-value < 0.05). In addition, the use of RSM and ANN alongside each other provided a proper growth prediction model.

  5. Significant improvements in long trace profiler measurement performance

    Energy Technology Data Exchange (ETDEWEB)

    Takacs, P.Z. [Brookhaven National Lab., Upton, NY (United States); Bresloff, C.J. [Argonne National Lab., IL (United States)

    1996-07-01

    Modifications made to the Long Trace Profiler (LTP II) system at the Advanced Photon Source at Argonne National Laboratory have significantly improved the accuracy and repeatability of the instrument. The use of a Dove prism in the reference beam path corrects for phasing problems between mechanical errors and thermally-induced system errors. A single reference correction now completely removes both error signals from the measured surface profile. The addition of a precision air conditioner keeps the temperature in the metrology enclosure constant to within ±0.1 °C over a 24 hour period and has significantly improved the stability and repeatability of the system. We illustrate the performance improvements with several sets of measurements. The improved environmental control has reduced thermal drift error to about 0.75 microradian RMS over a 7.5 hour time period. Measurements made in the forward scan direction and the reverse scan direction differ by only about 0.5 microradian RMS over a 500 mm trace length. We are now able to put a 1-sigma error bar of 0.3 microradian on an average of 10 slope profile measurements over a 500 mm trace length, and we are now able to put a 0.2 microradian error bar on an average of 10 measurements over a 200 mm trace length. The corresponding 1-sigma height error bar for this measurement is 1.1 nm.

  6. Biodosimetry for dose assessment of partial-body exposure: a methodological improvement

    Directory of Open Access Journals (Sweden)

    Thiago Salazar Fernandes

    2008-12-01

    Full Text Available This study has explored the possibility of combining culture times with extending the duration for which Colcemid is present in cell culture in order to obtain better dose estimations following partial-body exposures. Irradiated and unirradiated blood was mixed to simulate a partial-body exposure. Dicentric frequencies and resultant dose estimations were compared from 48 and 72 h cultures with Colcemid added at the beginning, after 24 h or for the final 3 h. The frequencies of dicentrics in first division cells increased with the cell culture time, providing better dose estimations. Unwanted excessive contraction of chromosomes caused by prolonged contact with Colcemid was measured and ways to avoid this are discussed. It is suggested that the combination of a lower than usual concentration of this drug with its earlier addition and a longer culture time may provide metaphases better suited for interpreting partial-body exposures. This work evaluated radiation dose estimation by simulating a partial-body exposure through the in vitro irradiation of blood samples mixed with unirradiated samples. It was observed that prolonging the culture time allows the true fraction of M1 lymphocytes containing chromosome aberrations to be detected, providing better dose estimates without the need for mathematical corrections.
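
    Whole-body dose estimation from a dicentric yield conventionally inverts a linear-quadratic calibration curve, Y = c + alpha*D + beta*D^2. A minimal sketch with assumed coefficients; partial-body cases additionally require correcting the observed yield for the unirradiated fraction, which the extended-culture protocol above aims to make directly observable.

    ```python
    import math

    # Sketch of dose estimation from a dicentric yield via an assumed
    # linear-quadratic calibration Y = c + alpha*D + beta*D^2. The
    # coefficients are illustrative, not a laboratory calibration.

    C, ALPHA, BETA = 0.001, 0.02, 0.06  # dic/cell, per Gy, per Gy^2

    def dose_from_yield(y: float) -> float:
        """Solve beta*D^2 + alpha*D + (c - y) = 0 for the positive root."""
        disc = ALPHA**2 - 4 * BETA * (C - y)
        return (-ALPHA + math.sqrt(disc)) / (2 * BETA)

    print(round(dose_from_yield(0.25), 2), "Gy")  # -> about 1.88 Gy
    ```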

  7. Neurolinguistic measures of typological effects in multilingual transfer: Introducing an ERP methodology

    Directory of Open Access Journals (Sweden)

    Jason Rothman

    2015-08-01

    Full Text Available This article has two main objectives. First, we offer an introduction to the subfield of generative third language (L3) acquisition. Concerned primarily with modeling initial stages transfer of morphosyntax, one goal of this program is to show how initial stages L3 data make significant contributions towards a better understanding of how the mind represents language and how (cognitive) economy constrains acquisition processes more generally. Our second objective is to argue for and demonstrate how this subfield will benefit from a neuro/psycholinguistic methodological approach, such as event related potential (ERP) experiments, to complement the claims currently made on the basis of exclusively behavioral experiments.

  8. Neurolinguistic measures of typological effects in multilingual transfer: introducing an ERP methodology.

    Science.gov (United States)

    Rothman, Jason; Alemán Bañón, José; González Alonso, Jorge

    2015-01-01

    This article has two main objectives. First, we offer an introduction to the subfield of generative third language (L3) acquisition. Concerned primarily with modeling initial stages transfer of morphosyntax, one goal of this program is to show how initial stages L3 data make significant contributions toward a better understanding of how the mind represents language and how (cognitive) economy constrains acquisition processes more generally. Our second objective is to argue for and demonstrate how this subfield will benefit from a neuro/psycholinguistic methodological approach, such as event-related potential experiments, to complement the claims currently made on the basis of exclusively behavioral experiments.

  9. Neurolinguistic measures of typological effects in multilingual transfer: introducing an ERP methodology

    Science.gov (United States)

    Rothman, Jason; Alemán Bañón, José; González Alonso, Jorge

    2015-01-01

    This article has two main objectives. First, we offer an introduction to the subfield of generative third language (L3) acquisition. Concerned primarily with modeling initial stages transfer of morphosyntax, one goal of this program is to show how initial stages L3 data make significant contributions toward a better understanding of how the mind represents language and how (cognitive) economy constrains acquisition processes more generally. Our second objective is to argue for and demonstrate how this subfield will benefit from a neuro/psycholinguistic methodological approach, such as event-related potential experiments, to complement the claims currently made on the basis of exclusively behavioral experiments. PMID:26300800

  10. Measures assessing spirituality as more than religiosity: a methodological review of nursing and health-related literature.

    Science.gov (United States)

    Sessanna, Loralee; Finnell, Deborah S; Underhill, Meghan; Chang, Yu-Ping; Peng, Hsi-Ling

    2011-08-01

    This paper is a report of a methodological review conducted to analyse, evaluate and synthesize the rigour of measures found in nursing and health-related literature used to assess and evaluate patient spirituality as more than religiosity. Holistic healthcare practitioners recognize that important distinctions exist between what constitutes spiritual care needs and preferences and what constitutes religious care needs and preferences in patient care practice. Databases searched, limited to the years 1982 to 2009, included AMED, Alt Health Watch, CINAHL Plus with Full Text, EBSCO Host, EBSCO Host Religion and Philosophy, ERIC, Google Scholar, HAPI, HUBNET, IngentaConnect, Mental Measurements Yearbook Online, Ovid MEDLINE, Social Work Abstracts and Hill and Hood's Measures of Religiosity text. A methodological review was carried out. Measures assessing spirituality as more than religiosity were critically reviewed, including quality appraisal, relevant data extraction and a narrative synthesis of findings. Ten measures fitting the inclusion criteria were included in the review. Despite agreement among nursing and health-related disciplines that spirituality and religiosity are distinct and diverse concepts, the concept of spirituality was often used interchangeably with the concept of religion to assess and evaluate patient spirituality. The term spiritual or spirituality was used in a preponderance of items to assess or evaluate spirituality. Measures differentiating spirituality from religiosity are grossly lacking in nursing and health-related literature. © 2011 Blackwell Publishing Ltd.

  11. Service Productivity: How to Measure and Improve It?

    Science.gov (United States)

    den Hartigh, Erik; Zegveld, Marc

    Productivity is a key performance measure for service businesses and serves as a compass for measuring their innovativeness. In this chapter we present a tool for measuring productivity in service businesses. Improvements in service business productivity do not depend on industry, business size or business growth, but on the specific knowledge and competences of managers. Using case examples we show various ways of how managers can improve the productivity of their service businesses. They can do so by adopting a perspective of standardization, flexibility or individualization. Based on these perspectives, we provide a framework that shows how managers can improve service business productivity by coordinating strategic orientation, value creation and the configuration of business processes.

  12. Methodology of the Auditing Measures to Civil Airport Security and Protection

    Directory of Open Access Journals (Sweden)

    Ján Kolesár

    2016-10-01

    Full Text Available Airports, similarly to other companies, are certified in compliance with the International Standardization Organization (ISO) standards of products and services (the ISO 9000 series of standards regarding quality management), to coordinate the technical side of standardization and normalization at an international scale. In order for the airports to meet the norms and the certification requirements of the ISO, they are liable to undergo strict quality audits, as a rule conducted by an independent auditing organization. The focus of the audits is primarily on airport operation economics and security. The article is an analysis of the methodology of the airport security audit processes and activities. Within the framework of planning, the sequence of steps is described in line with the principles and procedures of the Security Management System (SMS) and the standards established by the International Standardization Organization (ISO). The methodology for conducting an airport security audit is developed in compliance with the national programme and international legislation standards (Annex 17) applicable to the protection of civil aviation against acts of unlawful interference.

  13. Addressing the “It Is Just Placebo” Pitfall in CAM: Methodology of a Project to Develop Patient-Reported Measures of Nonspecific Factors in Healing

    Directory of Open Access Journals (Sweden)

    Carol M. Greco

    2013-01-01

    Full Text Available CAM therapies are often dismissed as "no better than placebo"; however, this belief may be overcome through careful analysis of nonspecific factors in healing. To improve trial methodology, we propose that CAM (and conventional) RCTs should evaluate and adjust for the effects of intrapersonal, interpersonal, and environmental factors on outcomes. However, measurement of these is challenging, and there are no brief, precise instruments that are suitable for widespread use in trials and clinical settings. This paper describes the methodology of a project to develop a set of patient-reported instruments that will quantify the nonspecific or "placebo" effects that are in fact specific and active ingredients in healing. The project uses the rigorous instrument-development methodology of the NIH-PROMIS initiative. The methods include (1) integration of patients' and clinicians' opinions with existing literature; (2) development of relevant items; (3) calibration of items on large samples; (4) classical test theory and modern psychometric methods to select the most useful items; (5) development of computerized adaptive tests (CATs) that maximize information while minimizing patient burden; and (6) initial validation studies. The instruments will have the potential to revolutionize clinical trials in both CAM and conventional medicine through quantifying contextual factors that contribute to healing.
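
    The core step of a computerized adaptive test is to administer the unasked item with maximal Fisher information at the current trait estimate. A minimal sketch under a two-parameter logistic model; the item parameters are invented, not PROMIS calibrations.

    ```python
    import math

    # Sketch of CAT item selection: at the current trait estimate theta,
    # pick the unasked item with maximal Fisher information under a
    # two-parameter logistic model. Item parameters (a = discrimination,
    # b = difficulty) are invented for illustration.

    items = {"item1": (1.8, -0.5), "item2": (1.2, 0.0), "item3": (2.1, 0.7)}

    def info(theta: float, a: float, b: float) -> float:
        p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
        return a * a * p * (1.0 - p)

    def next_item(theta: float, asked: set) -> str:
        candidates = {k: v for k, v in items.items() if k not in asked}
        return max(candidates, key=lambda k: info(theta, *candidates[k]))

    print(next_item(0.4, asked={"item1"}))  # -> "item3"
    ```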

  14. Measurements methodology for evaluation of Digital TV operation in VHF high-band

    Science.gov (United States)

    Pudwell Chaves de Almeida, M.; Vladimir Gonzalez Castellanos, P.; Alfredo Cal Braz, J.; Pereira David, R.; Saboia Lima de Souza, R.; Pereira da Soledade, A.; Rodrigues Nascimento Junior, J.; Ferreira Lima, F.

    2016-07-01

    This paper describes the experimental setup of field measurements carried out for evaluating the operation of the ISDB-TB (Integrated Services Digital Broadcasting, Terrestrial, Brazilian version) digital TV standard in the VHF high band. Measurements were performed in urban and suburban areas of a medium-sized Brazilian city. Besides the direct measurements of received power and environmental noise, a measurement procedure involving the injection of additive Gaussian noise was employed to determine the signal-to-noise ratio threshold at each measurement site. The analysis includes results of static reception measurements for evaluating the received field strength and the signal-to-noise ratio thresholds for correct signal decoding.

  15. Rapid core measure improvement through a "business case for quality".

    Science.gov (United States)

    Perlin, Jonathan B; Horner, Stephen J; Englebright, Jane D; Bracken, Richard M

    2014-01-01

    Incentives to improve performance are emerging as revenue or financial penalties are linked to the measured quality of service provided. The HCA "Getting to Green" program was designed to rapidly increase core measure performance scores. Program components included (1) the "business case for quality", i.e., increased awareness of how quality drives financial performance; (2) continuous communication of clinical and financial performance data; and (3) evidence-based clinical protocols, incentives, and tools for process improvement. Improvement was measured by comparing systemwide rates of adherence to national quality measures for heart failure (HF), acute myocardial infarction (AMI), pneumonia (PN), and surgical care (SCIP) to rates from all facilities reporting to the Centers for Medicare and Medicaid Services (CMS). As of the second quarter of 2011, 70% of HCA total measure set composite scores were at or above the 90th percentile of CMS scores. A test of differences in regression coefficients between the CMS national average and the HCA average revealed significant differences for AMI (p = .001), HF (p = .012), PN (p < .001), and SCIP (p = .015). This program demonstrated that presentation of the financial implications of quality, transparency in performance data, and clearly defined goals could cultivate the desire to use improvement tools and resources to raise performance. © 2012 National Association for Healthcare Quality.

  16. OMERACT: An international initiative to improve outcome measurement in rheumatology

    Directory of Open Access Journals (Sweden)

    Simon Lee

    2007-11-01

    Full Text Available OMERACT is the acronym for an international, informally organized network initiated in 1992 aimed at improving outcome measurement in rheumatology. Chaired by an executive committee, it organizes consensus conferences in a 2-yearly cycle that circles the globe. Data-driven recommendations are prepared and updated by expert working groups. Recommendations include core sets of measures for most of the major rheumatologic conditions. Since 2002 patients have been actively engaged in the process.

  17. Towards an Improved Performance Measure for Language Models

    CERN Document Server

    Ueberla, J P

    1997-01-01

    In this paper a first attempt at deriving an improved performance measure for language models, the probability ratio measure (PRM), is described. In a proof-of-concept experiment, it is shown that PRM correlates better with recognition accuracy and can lead to better recognition results when used as the optimisation criterion of a clustering algorithm. In spite of the approximations and limitations of this preliminary work, the results are very encouraging and should justify more work along the same lines.

  18. Measurement of improved pressure dependence of superconducting transition temperature

    Science.gov (United States)

    Karmakar, S.

    2013-06-01

    We describe a technique for making electrical transport measurements in a diamond anvil cell at liquid helium temperatures with an in situ pressure measurement option, permitting accurate pressure determination at any low temperature during the resistance measurement scan. In general, for four-probe resistivity measurements on a polycrystalline sample, four fine gold wires are kept in contact with the sample by the compression of the soft solid (usually an alkali halide such as NaCl or KCl) acting as the pressure-transmitting medium. The actual pressure on the sample is underestimated if it is not measured from a ruby sphere placed adjacent to the sample at that very low temperature. Here, we demonstrate the technique with a quasi-four-probe resistance measurement on an Fe-based superconductor in the temperature range 1.2-300 K and at pressures up to 8 GPa to find an improved pressure dependence of the superconducting transition temperature.

  19. [Techniques for measuring phakic and pseudophakic accommodation. Methodology for distinguishing between neurological and mechanical accommodative insufficiency].

    Science.gov (United States)

    Roche, O; Roumes, C; Parsa, C

    2007-11-01

    The methods available for studying accommodation are evaluated: Donders' "push-up" method, dynamic retinoscopy, infrared optometry using the Scheiner principle, and wavefront analysis are each discussed with their inherent advantages and limitations. Based on the methodology described, one can also distinguish between causes of accommodative insufficiency. A dioptric insufficiency (accommodative lag) that remains equal at various testing distances from the subject indicates a sensory/neurologic (afferent) defect, whereas an accommodative insufficiency that changes with distance indicates a mechanical/restrictive (efferent) defect, such as in presbyopia. Determining accommodative insufficiency and its cause can be particularly useful when examining patients with a variety of diseases associated with reduced accommodative ability (e.g., Down syndrome and cerebral palsy) as well as in evaluating the effectiveness of various potentially accommodating intraocular lens designs.

  20. Measuring instructional congruence in elementary science classrooms: Pedagogical and methodological components of a theoretical framework

    Science.gov (United States)

    Luykx, Aurolyn; Lee, Okhee

    2007-03-01

    This article is situated within a theoretical framework, instructional congruence, articulating issues of student diversity with the demands of academic disciplines. In the context of a large-scale study targeting elementary school science, the article describes a research instrument that aims to combine the strengths of both quantitative and qualitative approaches to classroom data. The project-developed classroom observation guideline is a series of detailed scales that produce numerical ratings based on qualitative observations of different aspects of classroom practice. The article's objectives are both pedagogical and methodological, reflecting the dual functionality of the instrument: (a) to concretize theoretical constructs articulating academic disciplines with student diversity in ways that are useful for rethinking classroom practice; and (b) to take advantage of the strengths of qualitative educational research, but within a quantitative analytical framework that may be applied across large numbers of classrooms.

  1. Measuring and improving patient safety through health information technology: The Health IT Safety Framework.

    Science.gov (United States)

    Singh, Hardeep; Sittig, Dean F

    2016-04-01

    Health information technology (health IT) has potential to improve patient safety but its implementation and use has led to unintended consequences and new safety concerns. A key challenge to improving safety in health IT-enabled healthcare systems is to develop valid, feasible strategies to measure safety concerns at the intersection of health IT and patient safety. In response to the fundamental conceptual and methodological gaps related to both defining and measuring health IT-related patient safety, we propose a new framework, the Health IT Safety (HITS) measurement framework, to provide a conceptual foundation for health IT-related patient safety measurement, monitoring, and improvement. The HITS framework follows both Continuous Quality Improvement (CQI) and sociotechnical approaches and calls for new measures and measurement activities to address safety concerns in three related domains: 1) concerns that are unique and specific to technology (e.g., to address unsafe health IT related to unavailable or malfunctioning hardware or software); 2) concerns created by the failure to use health IT appropriately or by misuse of health IT (e.g. to reduce nuisance alerts in the electronic health record (EHR)), and 3) the use of health IT to monitor risks, health care processes and outcomes and identify potential safety concerns before they can harm patients (e.g. use EHR-based algorithms to identify patients at risk for medication errors or care delays). The framework proposes to integrate both retrospective and prospective measurement of HIT safety with an organization's existing clinical risk management and safety programs. It aims to facilitate organizational learning, comprehensive 360 degree assessment of HIT safety that includes vendor involvement, refinement of measurement tools and strategies, and shared responsibility to identify problems and implement solutions. A long term framework goal is to enable rigorous measurement that helps achieve the safety

  2. Improvements in Elimination of Loudspeaker Distortion in Acoustic Measurements

    DEFF Research Database (Denmark)

    Agerkvist, Finn T.; Torras Rosell, Antoni; McWalter, Richard Ian

    2015-01-01

    sine signal and is tested on models of memoryless nonlinear systems as well as nonlinear loudspeakers. The method is shown to give a clear benefit over existing methods. Two techniques that improve the signal-to-noise ratio are demonstrated: the first uses more measurement levels than the number...

  3. Improvements in the optics measurement resolution for the LHC

    CERN Document Server

    Langner, A

    2014-01-01

    Optics measurement algorithms which are based on the measurement of beam position monitor (BPM) turn-by-turn data are currently being improved in preparation for the commissioning of the LHC at higher energy. The turn-by-turn data of one BPM may be used more than once, but the implied correlations were not considered in the final error bar. In this paper the error propagation including correlations is studied for the statistical part of the uncertainty. The confidence level of the measurement is investigated analytically and with simulations.
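
    The correlation issue can be made concrete with standard linear error propagation: for f = Jx with input covariance Sigma, var(f) = J Sigma J^T. The sketch below compares the propagated uncertainty of an average with and without correlation between the inputs, using illustrative numbers rather than LHC BPM data.

    ```python
    import numpy as np

    # Sketch of linear error propagation with correlations, the issue
    # raised above: for f = J x with input covariance Sigma, the variance
    # of f is J Sigma J^T. All numbers are illustrative.

    J = np.array([0.5, 0.5])        # e.g. averaging two phase estimates
    s1, s2 = 0.10, 0.10             # individual measurement uncertainties
    for rho in (0.0, 0.8):          # uncorrelated vs. reused-BPM case
        Sigma = np.array([[s1**2, rho * s1 * s2],
                          [rho * s1 * s2, s2**2]])
        var = J @ Sigma @ J
        print(f"rho={rho}: propagated error = {np.sqrt(var):.4f}")
    # Ignoring the correlation (rho=0.8 -> 0.0949 vs. 0.0707)
    # understates the final error bar.
    ```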

  4. Measures to improve the quality of hotel services

    Directory of Open Access Journals (Sweden)

    Anca MADAR

    2017-07-01

    Full Text Available This article aims to exemplify how, starting from an evaluation of customer satisfaction with service quality, a hotel's management can apply different measures and strategies to improve it. To achieve this, a marketing research survey was conducted on a sample of 120 customers of Hotel „Kronwell" at the end of 2013. After analysing the customers' responses, a series of measures was taken to improve the quality of the services offered by this hotel; then, at the end of 2015, a new survey was carried out based on the same questionnaire. The results of this research highlight the increase in customer satisfaction as a result of improving the quality of hotel services, supported by growth in net profit and turnover and a decrease in the number of employees.

  5. Improving atlas methodology

    Science.gov (United States)

    Robbins, C.S.; Dowell, B.A.; O'Brien, J.

    1987-01-01

    We are studying a sample of Maryland (2%) and New Hampshire (4%) Atlas blocks and a small sample in Maine. These three States used different sampling methods and block sizes. We compare sampling techniques, roadside with off-road coverage, our coverage with that of the volunteers, and different methods of quantifying Atlas results. The 7 1/2' (12-km) blocks used in the Maine Atlas are satisfactory for coarse mapping, but are too large to enable changes to be detected in the future. Most states are subdividing the standard 7 1/2' maps into six 5-km blocks. The random 1/6 sample of 5-km blocks used in New Hampshire, Vermont (published 1985), and many other states has the advantage of permitting detection of some changes in the future, but the disadvantage of leaving important habitats unsampled. The Maryland system of atlasing all 1,200 5-km blocks and covering one out of each six by quarter-blocks (2 1/2-km) is far superior if enough observers can be found. A good compromise, not yet attempted, would be to atlas a 1/6 random sample of 5-km blocks and also one other carefully selected (non-random) block on the same 7 1/2' map--the block that would include the best sample of habitats or elevations not in the random block. In our sample the second block raised the percentage of birds found from 86% of the birds recorded in the 7 1/2' quadrangle to 93%. It was helpful to list the expected species in each block and to revise this list annually. We estimate that 90-100 species could be found with intensive effort in most Maryland blocks; perhaps 95-105 in New Hampshire. It was also helpful to know which species were under-sampled so we could make a special effort to search for these. A total of 75 species per block (or 75% of the expected species in blocks with very restricted habitat diversity) is considered a practical and adequate goal in these States. When fewer than 60 species are found per block, a high proportion of the rarer species are missed, as well as some of the common ones. Similarity indices based on fewer than 60 species per block reflect coverage rather than habitat affinities. Atlas blocks that are ecologically similar should have similarity indices (S) of at least 0.80 to be considered adequately covered. S = 2C/(A + B), where C is the number of species in common and A and B are the species totals for each of the two blocks being compared. A series of 15 13-minute roadside stops yielded more species than 15 off-road stops, but off-road stops always had some species not detected at the roadside stops. A series of timed stops is an excellent way to map relative abundance if the stops are standardized with respect to time of day and weather, and the counts are made by observers of comparable ability. Efforts to estimate Atlas block populations in powers of 10 (as in the French Atlas) have not gained acceptance in the U.S.A. Most observers feel unqualified to make estimates. An efficient way to atlas a block is to make at least 3 early morning visits to 15 or more specific stops. Arrive in the block early enough to check for nocturnal species on at least two days, and after completing the specific stops, search the block for other species and for confirmations.
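
    The similarity index defined above is simple to compute directly. Below is a minimal Python sketch applied to two hypothetical block species lists (the species names are illustrative, not data from the study):

      # Sorensen similarity index S = 2C/(A + B): C is the number of species
      # two atlas blocks share; A and B are the blocks' species totals.
      def similarity_index(block_a, block_b):
          a, b = set(block_a), set(block_b)
          c = len(a & b)                        # species in common
          return 2 * c / (len(a) + len(b))      # S = 2C / (A + B)

      block1 = {"Wood Thrush", "Ovenbird", "Red-eyed Vireo", "Blue Jay"}
      block2 = {"Wood Thrush", "Blue Jay", "American Robin", "Gray Catbird"}
      print(f"S = {similarity_index(block1, block2):.2f}")  # S = 0.50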

  6. Agro-designing: sustainability-driven, vision-oriented, problem preventing and knowledge-based methodology for improving farming systems sustainability

    OpenAIRE

    Znaor, Darko; Goewie, Eric

    1999-01-01

    ABSTRACT While classical research focuses on problem solving, design is a problem-prevention methodology and is suitable for multi- and interdisciplinary research teams with a vision of how to improve agricultural sustainability. Since organic agriculture is based on a holistic approach and is also problem-prevention oriented, in that it refrains from certain inputs and practices, design is an interesting methodology that could be applied more often in organic agriculture. ...

  7. A FRAMEWORK FOR MEASURING AND IMPROVING EFFICIENCY IN DISTRIBUTION CHANNELS

    Directory of Open Access Journals (Sweden)

    Milan Andrejić

    2016-06-01

    Full Text Available The distribution of products is largely conditioned by the efficiency of logistics processes. Efficient logistics processes provide loyal and satisfied customers, a dominant position on the market, and revenue. In this paper a new approach for measuring and improving the efficiency of logistics processes in a distribution channel is proposed. A model based on the Principal Component Analysis - Data Envelopment Analysis (PCA-DEA) approach evaluates the efficiency of ordering, warehousing, packaging, inventory management and transport processes, as well as overall distribution channel efficiency. The proposed approach also gives information about corrective actions for efficiency improvement. According to the results, efficiency should be improved in several ways: improving the information system, decreasing failures, and increasing utilization and output. Testing of the proposed approach demonstrates its broad applicability.
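
    As an illustration of the efficiency-scoring step in a DEA-based approach like the one described, the following Python sketch solves the input-oriented CCR multiplier model as a linear program. The input/output figures are hypothetical stand-ins for process data, and the paper's PCA preprocessing step is omitted:

      # Input-oriented CCR (multiplier form): maximize weighted outputs of the
      # unit under evaluation, with its weighted inputs normalized to 1 and no
      # unit allowed an output/input ratio above 1. Data are illustrative.
      import numpy as np
      from scipy.optimize import linprog

      def ccr_efficiency(X, Y, j0):
          """CCR efficiency of DMU j0. X: (n, m) inputs, Y: (n, s) outputs."""
          n, m = X.shape
          s = Y.shape[1]
          # Variables: output weights u (s entries), then input weights v (m).
          c = np.concatenate([-Y[j0], np.zeros(m)])             # maximize u . y0
          A_ub = np.hstack([Y, -X])                             # u . yj - v . xj <= 0
          b_ub = np.zeros(n)
          A_eq = np.concatenate([np.zeros(s), X[j0]])[None, :]  # v . x0 = 1
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                        bounds=[(0, None)] * (s + m))
          return -res.fun                                       # efficiency in (0, 1]

      X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])  # inputs
      Y = np.array([[1.0], [1.0], [1.0], [1.0]])                       # outputs
      for j in range(len(X)):
          print(f"DMU {j}: efficiency = {ccr_efficiency(X, Y, j):.3f}")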

  8. Methodological challenges in measurements of functional ability in gerontological research. A review

    DEFF Research Database (Denmark)

    Avlund, Kirsten

    1997-01-01

    This article addresses two important challenges in the measurement of functional ability in gerontological research: the first challenge is to connect measurements to a theoretical frame of reference which enhances our understanding and interpretation of the collected data; the second relates...

  9. Measuring subjective meaning structures by the laddering method: Theoretical considerations and methodological problems

    DEFF Research Database (Denmark)

    Grunert, Klaus G.; Grunert, Suzanne C.

    1995-01-01

    Starting from a general model of measuring cognitive structures for predicting consumer behaviour, we discuss laddering as a possible method to obtain estimates of consumption-relevant cognitive structures which will have predictive validity. Four criteria for valid measurement are derived...

  10. Methodological aspects of blood pressure measurement and adherence to antihypertensive drug therapy

    NARCIS (Netherlands)

    Braam, R.L.

    2007-01-01

    Hypertension is an important risk-factor for cardiovascular disease. Accurate blood pressure measurements are very important to diagnose hypertension. Nowadays these blood pressure measurements are often performed using automatic devices. One can wonder whether these devices are accurate enough. In

  12. The bounds on tracking performance utilising a laser-based linear and angular sensing and measurement methodology for micro/nano manipulation

    OpenAIRE

    Clark, Leon; Shirinzadeh, Bijan; Tian, Yanling; Zhong, Yongmin

    2014-01-01

    This paper presents an analysis of the tracking performance of a planar three degrees of freedom (DOF) flexure-based mechanism for micro/nano manipulation, utilising a tracking methodology for the measurement of coupled linear and angular motions. The methodology permits trajectories over a workspace with large angular range through the reduction of geometric errors. However, when combining this methodology with feedback control systems, the accuracy of performed manipulations can only be sta...

  13. A methodology to quantify the differences between alternative methods of heart rate variability measurement.

    Science.gov (United States)

    García-González, M A; Fernández-Chimeno, M; Guede-Fernández, F; Ferrer-Mileo, V; Argelagós-Palau, A; Álvarez-Gómez, L; Parrado, E; Moreno, J; Capdevila, L; Ramos-Castro, J

    2016-01-01

    This work proposes a systematic procedure for reporting the differences between heart rate variability time series obtained from alternative measurements: the spread and mean of the differences, the agreement between measuring procedures, and how stationary, random, and normal the differences between alternative measurements are. A description of the complete automatic procedure to obtain a differences time series (DTS) from two alternative methods, a proposal of a battery of statistical tests, and a set of statistical indicators to better describe the differences in RR interval estimation are also provided. Results show that the spread and agreement depend on the choice of alternative measurements and that the DTS cannot generally be considered a white or a normally distributed process. Nevertheless, in controlled measurements the DTS can be considered a stationary process.
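
    A minimal Python sketch of the kind of summary such a procedure produces for a differences time series is shown below, using synthetic RR series; the full battery of stationarity, randomness, and normality tests described in the paper is not reproduced:

      # Summarize a differences time series (DTS) between two alternative
      # RR-interval measurements: bias, spread, and limits of agreement.
      # The RR series here are synthetic illustrations, not study data.
      import numpy as np

      rng = np.random.default_rng(0)
      rr_ref = 800 + 50 * np.sin(np.linspace(0, 6 * np.pi, 300))  # ms, reference
      rr_alt = rr_ref + rng.normal(0.0, 4.0, rr_ref.size)         # ms, alternative

      dts = rr_alt - rr_ref                        # differences time series
      mean_diff = dts.mean()                       # bias between methods
      sd_diff = dts.std(ddof=1)                    # spread of the differences
      loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)

      print(f"mean difference: {mean_diff:.2f} ms")
      print(f"SD of differences: {sd_diff:.2f} ms")
      print(f"95% limits of agreement: {loa[0]:.2f} to {loa[1]:.2f} ms")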

  14. Factors of psychological distress: clinical value, measurement substance, and methodological artefacts.

    Science.gov (United States)

    Böhnke, J R; Croudace, T J

    2015-04-01

    Psychometric models and statistical techniques are cornerstones of research into latent structures of specific psychopathology and general mental health. We discuss "pivot points" for future research efforts from a psychometric epidemiology perspective, emphasising sampling and selection processes of both indicators that guide data collection as well as samples that are confronted with them. First, we discuss how a theoretical model of psychopathology determines which empirical indicators (questions, diagnoses, etc.) and modelling methods are appropriate to test its implications. Second, we deal with how different research designs introduce different (co-)variances between indicators, potentially leading to a different understanding of latent structures. Third, we discuss widening the range of statistical models available within the "psychometrics class": the inclusion of categorical approaches can help to enlighten the debate on the structure of psychopathology and agreement on a minimal set of models might lead to greater convergence between studies. Fourth, we deal with aspects of methodology that introduce spurious (co-)variance in latent structure analysis (response styles, clustered data) and differential item functioning to gather more detailed information and to guard against over-generalisation of results, which renders assessments unfair. Building on established insights, future research efforts should be more explicit about their theoretical understanding of psychopathology and how the analysis of a given indicator-respondent set informs this theoretical model. A coherent treatment of theoretical assumptions, indicators, and samples holds the key to building a comprehensive account of the latent structures of different types of psychopathology and mental health in general.

  15. Comparative methodologies for measuring metabolizable energy of various types of resistant high amylose corn starch.

    Science.gov (United States)

    Tulley, Richard T; Appel, Marko J; Enos, Tanya G; Hegsted, Maren; McCutcheon, Kathleen L; Zhou, Jun; Raggio, Anne M; Jeffcoat, Roger; Birkett, Anne; Martin, Roy J; Keenan, Michael J

    2009-09-23

    Energy values of high-amylose corn starches high in resistant starch (RS) were determined in vivo by two different methodologies. In one study, energy values were determined according to growth relative to glucose-based diets in rats fed diets containing RS2, heat-treated RS2 (RS2-HT), RS3, and amylase-predigested versions to isolate the RS component. Net metabolizable energy values ranged from 2.68 to 3.06 kcal/g for the RS starches, and 1.91-2.53 kcal/g for the amylase-predigested versions. In a second study, rats were fed a diet containing RS2-HT and the metabolizable energy value was determined by bomb calorimetry. The metabolizable energy value was 2.80 kcal/g, consistent with Study 1. Thus, high-amylose corn-based RS ingredients and their amylase-predigested equivalents have energy values approximately 65-78% and 47-62% of available starch (Atwater factor), respectively, according to the RS type (Garcia, T. A.; McCutcheon, K. L.; Francis, A. R.; Keenan, M. J.; O'Neil, C. E.; Martin, R. J.; Hegsted, M. The effects of resistant starch on gastrointestinal organs and fecal output in rats. FASEB J. 2003, 17, A335).

  16. A system and methodology for measuring volatile organic compounds produced by hydroponic lettuce in a controlled environment

    Science.gov (United States)

    Charron, C. S.; Cantliffe, D. J.; Wheeler, R. M.; Manukian, A.; Heath, R. R.

    1996-01-01

    A system and methodology were developed for the nondestructive qualitative and quantitative analysis of volatile emissions from hydroponically grown 'Waldmann's Green' leaf lettuce (Lactuca sativa L.). Photosynthetic photon flux (PPF), photoperiod, and temperature were automatically controlled and monitored in a growth chamber modified for the collection of plant volatiles. The lipoxygenase pathway products (Z)-3-hexenal, (Z)-3-hexenol, and (Z)-3-hexenyl acetate were emitted by lettuce plants after the transition from the light period to the dark period. The volatile collection system developed in this study enabled measurements of volatiles emitted by intact plants, from planting to harvest, under controlled environmental conditions.

  17. Efficient solution methodology for calibrating the hemodynamic model using functional Magnetic Resonance Imaging (fMRI) measurements

    KAUST Repository

    Zambri, Brian

    2015-11-05

    Our aim is to propose a numerical strategy for accurately and efficiently retrieving the biophysiological parameters, as well as the external stimulus characteristics, corresponding to the hemodynamic mathematical model that describes changes in blood flow and blood oxygenation during brain activation. The proposed method employs the TNM-CKF method developed in [1], but in a prediction/correction framework. We present numerical results using both real and synthetic functional Magnetic Resonance Imaging (fMRI) measurements to highlight the performance characteristics of this computational methodology. © 2015 IEEE.

  18. Research Recommendations for Improving Measurement of Treatment Effectiveness in Depression

    Science.gov (United States)

    Kamenov, Kaloyan; Cabello, María; Nieto, Mónica; Bernard, Renaldo; Kohls, Elisabeth; Rummel-Kluge, Christine; Ayuso-Mateos, José L.

    2017-01-01

    Background: Despite the steadily escalating psychological and economic burden of depression, there is a lack of evidence for the effectiveness of available interventions on functioning areas beyond symptomatology. Therefore, the main objective of this study was to give insight into the current measurement of treatment effectiveness in depression and to provide recommendations for its improvement. Materials and Methods: The study was based on a multi-informant approach, comparing data from a systematic literature review, an expert survey with 130 representatives from clinical practice, and qualitative interviews with 11 patients experiencing depression. Results: Current literature places emphasis on symptomatic outcomes and neglects other domains of functioning, whereas clinicians and depressed patients highlight the importance of both. Interpersonal relationships, recreation and daily activities, communication, social participation, and work difficulties were identified as being crucial for recovery. Personal factors neglected by the literature, such as self-efficacy, were introduced by experts and patients. Furthermore, clinicians and patients identified a number of differences regarding the areas improved by psychotherapeutic or pharmacological interventions that were not addressed by the pertinent literature. Conclusion: Creation of a new cross-nationally applicable measure of psychosocial functioning, broader remission criteria, reporting of domain-specific information, and a personalized approach to treatment decision-making are the first crucial steps needed to improve the measurement of treatment effectiveness in depression. Better measurement will facilitate clinical decision-making and address the escalating burden of depression.

  19. Using Measured Plane-of-Array Data Directly in Photovoltaic Modeling: Methodology and Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine; Freestate, David; Riley, Cameron; Hobbs, William

    2016-11-01

    Measured plane-of-array (POA) irradiance may provide a lower-cost alternative to standard irradiance component data for photovoltaic (PV) system performance modeling without loss of accuracy. Previous work has shown that transposition models typically used by PV models to calculate POA irradiance from horizontal data introduce error into the POA irradiance estimates, and that measured POA data can correlate better to measured performance data. However, popular PV modeling tools historically have not directly used input POA data. This paper introduces a new capability in NREL's System Advisor Model (SAM) to directly use POA data in PV modeling, and compares SAM results from both POA irradiance and irradiance components inputs against measured performance data for eight operating PV systems.

  20. Using Measured Plane-of-Array Data Directly in Photovoltaic Modeling: Methodology and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine; Freestate, David; Hobbs, William; Riley, Cameron

    2016-06-05

    Measured plane-of-array (POA) irradiance may provide a lower-cost alternative to standard irradiance component data for photovoltaic (PV) system performance modeling without loss of accuracy. Previous work has shown that transposition models typically used by PV models to calculate POA irradiance from horizontal data introduce error into the POA irradiance estimates, and that measured POA data can correlate better to measured performance data. However, popular PV modeling tools historically have not directly used input POA data. This paper introduces a new capability in NREL's System Advisor Model (SAM) to directly use POA data in PV modeling, and compares SAM results from both POA irradiance and irradiance components inputs against measured performance data for eight operating PV systems.
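
    The contrast drawn in these two records between measured POA irradiance and transposed component data can be sketched as follows. The Python example below uses the simple isotropic-sky transposition model (not the specific models implemented in SAM), with hypothetical single-timestep values:

      # Compare using a measured POA value directly with estimating POA from
      # horizontal components via the isotropic-sky transposition model.
      # All irradiance and geometry values are illustrative assumptions.
      import math

      def isotropic_poa(dni, dhi, ghi, aoi_deg, tilt_deg, albedo=0.2):
          """Plane-of-array irradiance from components, isotropic-sky model."""
          beam = dni * max(math.cos(math.radians(aoi_deg)), 0.0)
          sky_diffuse = dhi * (1 + math.cos(math.radians(tilt_deg))) / 2
          ground = ghi * albedo * (1 - math.cos(math.radians(tilt_deg))) / 2
          return beam + sky_diffuse + ground

      measured_poa = 905.0                              # W/m^2, from a POA sensor
      modeled_poa = isotropic_poa(dni=800.0, dhi=100.0, ghi=750.0,
                                  aoi_deg=20.0, tilt_deg=25.0)
      print(f"modeled POA:  {modeled_poa:.1f} W/m^2")   # carries transposition error
      print(f"measured POA: {measured_poa:.1f} W/m^2")  # used directly, no model error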

  1. Methodology of heat transfer and flow resistance measurement for matrices of rotating regenerative heat exchangers

    National Research Council Canada - National Science Library

    Dariusz Butrymowicz; Jarosław Karwacki; Roman Kwidziński; Kamil Śmierciew; Jerzy Gagan; Tomasz Przybyliński; Teodor Skiepko; Marek Łapin

    2016-01-01

    The theoretical basis for the indirect measurement of the mean heat transfer coefficient of a packed bed, based on the modified single-blow technique, is presented and discussed in the paper...

  2. Pilot testing a methodology to measure the marginal increase in economic impact of rural tourism sites

    Science.gov (United States)

    April Evans; Hans Vogelsong

    2008-01-01

    Rural tourism is a rapidly expanding industry which holds some promise of improving the economy in small towns and farming regions. However, rural communities have limited funding available for promotional efforts. To understand if limited funds are effective in producing the desired economic impacts, it is important that rural communities evaluate their promotional...

  3. Perspectives for clinical measures of dynamic foot function-reference data and methodological considerations.

    Science.gov (United States)

    Rathleff, M S; Nielsen, R G; Simonsen, O; Olesen, C G; Kersting, U G

    2010-02-01

    Several studies have investigated if static posture assessments qualify to predict dynamic function of the foot showing diverse outcomes. However, it was suggested that dynamic measures may be better suited to predict foot-related overuse problems. The purpose of this study was to establish the reliability for dynamic measures of longitudinal arch angle (LAA) and navicular height (NH) and to examine to what extent static and dynamic measures thereof are related. Intra-rater reliability of LAA and NH measures was tested on a sample of 17 control subjects. Subsequently, 79 subjects were tested while walking on a treadmill. The ranges and minimum values for LAA and NH during ground contact were identified over 20 consecutive steps. A geometric error model was used to simulate effects of marker placement uncertainty and skin movement artifacts. Results demonstrated the highest reliability for the minimum NH (MinNH), followed by the minimum LAA (MinLAA), the dynamic range of navicular height (DeltaNH) and the range of LAA (DeltaLAA) while all measures were highly reliable. Marker location uncertainty and skin movement artifacts had the smallest effects on measures of NH. The use of an alignment device for marker placement was shown to reduce error ranges for NH measures. Therefore, DeltaNH and MinNH were recommended for functional dynamic foot characterization in the sagittal plane. There is potential for such measures to be a suitable predictor for overuse injuries while being obtainable in clinical settings. Future research needs to include such dynamic but simple foot assessments in large-scale clinical studies.

  4. A Review of Research on Improvement and Optimization of Performance Measures for Electrical Discharge Machining

    Directory of Open Access Journals (Sweden)

    C. R. Sanghani

    2014-01-01

    Full Text Available Electrical Discharge Machining (EDM) is a non-conventional machining method which can be used to machine electrically conductive workpieces irrespective of their shape, hardness and toughness. The high cost of non-conventional machine tools, compared to conventional machining, has forced us to operate these machines as efficiently as possible in order to reduce production cost and to obtain the required reimbursement. To achieve this task, machining parameters such as pulse on time, pulse off time, discharge current, gap voltage, flushing pressure, electrode material, etc. should be selected such that optimal values of performance measures like Material Removal Rate (MRR), Surface Roughness (SR), Electrode/Tool Wear Rate (EWR/TWR), dimensional accuracy, etc. can be obtained or improved. In past decades, intensive research work has been carried out by different researchers for improvement and optimization of EDM performance measures using various optimization techniques like Taguchi, Response Surface Methodology (RSM), Artificial Neural Network (ANN), Genetic Algorithm (GA), etc. This paper reviews research on improvement and optimization of various performance measures of spark erosion EDM and finally lists certain areas that can be taken up for further research in the field of improvement and optimization of the EDM process.

  5. Contact Thermocouple Methodology and Evaluation for Temperature Measurement in the Laboratory

    Science.gov (United States)

    Brewer, Ethan J.; Pawlik, Ralph J.; Krause, David L.

    2013-01-01

    Laboratory testing of advanced aerospace components very often requires highly accurate temperature measurement and control devices, as well as methods to precisely analyze and predict the performance of such components. Analysis of test articles depends on accurate measurements of temperature across the specimen. Where possible, this task is accomplished using many thermocouples welded directly to the test specimen, which can produce results with great precision. However, it is known that thermocouple spot welds can initiate deleterious cracks in some materials, prohibiting the use of welded thermocouples. Such is the case for the nickel-based superalloy MarM-247, which is used in the high temperature, high pressure heater heads for the Advanced Stirling Converter component of the Advanced Stirling Radioisotope Generator space power system. To overcome this limitation, a method was developed that uses small diameter contact thermocouples to measure the temperature of heater head test articles with the same level of accuracy as welded thermocouples. This paper includes a brief introduction and a background describing the circumstances that compelled the development of the contact thermocouple measurement method. Next, the paper describes studies performed on contact thermocouple readings to determine the accuracy of results. It continues on to describe in detail the developed measurement method and the evaluation of results produced. A further study that evaluates the performance of different measurement output devices is also described. Finally, a brief conclusion and summary of results is provided.

  6. Comparison of efficiency of distance measurement methodologies in mango (Mangifera indica) progenies based on physicochemical descriptors.

    Science.gov (United States)

    Alves, E O S; Cerqueira-Silva, C B M; Souza, A M; Santos, C A F; Lima Neto, F P; Corrêa, R X

    2012-03-14

    We investigated seven distance measures in a set of observations of physicochemical variables of mango (Mangifera indica) submitted to multivariate analyses (distance, projection and grouping). To estimate the distance measurements, five mango progenies (25 genotypes in total) were analyzed using six fruit physicochemical descriptors (fruit weight, equatorial diameter, longitudinal diameter, total soluble solids in °Brix, total titratable acidity, and pH). The distance measurements were compared by the Spearman correlation test, projection in two-dimensional space, and grouping efficiency. The Spearman correlation coefficients between the seven distance measurements were, except for Mahalanobis' generalized distance (0.41 ≤ rs ≤ 0.63), high and significant (rs ≥ 0.91; P < 0.001). Regardless of the origin of the distance matrix, the unweighted pair group method with arithmetic mean (UPGMA) grouping method proved to be the most adequate. The various distance measurements and grouping methods gave different values for distortion (-116.5 ≤ D ≤ 74.5), cophenetic correlation (0.26 ≤ rc ≤ 0.76) and stress (-1.9 ≤ S ≤ 58.9). The choice of distance measurement and analysis methods influences the results obtained.
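
    The grouping step described above can be sketched with standard tools: UPGMA clustering (average linkage) on a distance matrix, followed by the cophenetic correlation used to judge grouping quality. In this Python sketch the genotype data are random stand-ins for the six fruit descriptors:

      # UPGMA clustering of genotypes from a pairwise distance matrix, with
      # the cophenetic correlation as a quality indicator that can be compared
      # across alternative distance measures. Data are synthetic placeholders.
      import numpy as np
      from scipy.spatial.distance import pdist
      from scipy.cluster.hierarchy import linkage, cophenet

      rng = np.random.default_rng(1)
      genotypes = rng.normal(size=(25, 6))          # 25 genotypes x 6 descriptors

      dists = pdist(genotypes, metric="euclidean")  # one candidate distance measure
      tree = linkage(dists, method="average")       # UPGMA
      r_c, _ = cophenet(tree, dists)                # cophenetic correlation
      print(f"cophenetic correlation: {r_c:.3f}")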

  7. A Novel Instrument and Methodology for the In-Situ Measurement of the Stress in Thin Films

    Science.gov (United States)

    Broadway, David M.; Omokanwaye, Mayowa O.; Ramsey, Brian D.

    2014-01-01

    We introduce a novel methodology for the in-situ measurement of mechanical stress during thin film growth utilizing a highly sensitive non-contact variation of the classic spherometer. By exploiting the known spherical deformation of the substrate, the value of the stress-induced curvature is inferred by measurement of only one point on the substrate's surface: the sagitta. From the known curvature, the stress can be calculated using the well-known Stoney equation. Based on this methodology, a stress sensor has been designed which is simple, highly sensitive, compact, and low cost. As a result of its compact nature, the sensor can be mounted in any orientation to accommodate a given deposition geometry without the need for extensive modification to an already existing deposition system. The technique employs the use of a double-side polished substrate that offers good specular reflectivity and is isotropic in its mechanical properties, such as oriented crystalline silicon or amorphous soda lime glass, for example. The measurement of the displacement of the uncoated side during deposition is performed with a high-resolution (i.e., 5 nm), commercially available, inexpensive fiber optic sensor which can be used in both high vacuum and high temperature environments (i.e., 10^-7 Torr and 480 °C, respectively). A key attribute of this instrument lies in its potential to achieve sensitivity that rivals other measurement techniques such as the micro cantilever method but, due to the comparatively larger substrate area, offers a more robust and practical alternative for subsequent measurement of additional characteristics of the film that might be correlated to film stress. We present measurement results of nickel films deposited by magnetron sputtering which show good qualitative agreement with the known behavior of polycrystalline films previously reported by Hoffman.
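
    A minimal Python sketch of the underlying calculation is given below: the curvature radius inferred from a measured sagitta, fed into the Stoney equation. The material and geometry values are illustrative numbers for a silicon substrate, not parameters from the paper:

      # For a spherically deformed substrate with sagitta h over half-chord a,
      # the curvature radius is R ~ a^2 / (2h); the Stoney equation then gives
      # the film stress sigma = E_s t_s^2 / (6 (1 - nu_s) t_f R).
      import math

      def radius_from_sagitta(sagitta, half_chord):
          """Radius of curvature (m) from sagitta h and half-chord a."""
          return half_chord ** 2 / (2.0 * sagitta)

      def stoney_stress(E_s, nu_s, t_s, t_f, R):
          """Film stress (Pa) from substrate curvature via the Stoney equation."""
          return E_s * t_s ** 2 / (6.0 * (1.0 - nu_s) * t_f * R)

      R = radius_from_sagitta(sagitta=50e-9, half_chord=12.5e-3)   # m
      sigma = stoney_stress(E_s=130e9, nu_s=0.28, t_s=500e-6,      # Si substrate
                            t_f=100e-9, R=R)                       # 100 nm film
      print(f"curvature radius: {R:.1f} m, film stress: {sigma / 1e6:.0f} MPa")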

  8. Assessment of fetal maturation age by heart rate variability measures using random forest methodology.

    Science.gov (United States)

    Tetschke, F; Schneider, U; Schleussner, E; Witte, O W; Hoyer, D

    2016-03-01

    Fetal maturation age assessment based on heart rate variability (HRV) is a well-suited tool in prenatal diagnosis. To date, almost linear maturation characteristic curves are used in univariate and multivariate models. Models using complex multivariate maturation characteristic curves are pending. To address this problem, we use Random Forest (RF) to assess fetal maturation age and compare RF with linear, multivariate age regression. We include previously developed HRV indices such as traditional time and frequency domain indices and complexity indices of multiple scales. We found that fetal maturation was best assessed by complexity indices of short scales and skewness in state-dependent datasets (quiet sleep, active sleep) as well as in state-independent recordings. Additionally, increasing fluctuation amplitude contributed to the model in the active sleep state. None of the traditional linear HRV parameters contributed to the RF models. Compared to linear, multivariate regression, the mean prediction of gestational age (GA) is more accurate with RF (quiet state: R² = 0.617 vs. R² = 0.461; active state: R² = 0.521 vs. R² = 0.436; state independent: R² = 0.583 vs. R² = 0.548). We conclude that classification and regression tree models such as the RF methodology are appropriate for the evaluation of fetal maturation age. The decisive role of adjustments between different time scales of complexity may essentially extend previous analysis concepts mainly based on rhythms and univariate complexity indices. These system characteristics may have implications for better understanding and accessibility of the maturing complex autonomic control and its disturbance.
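
    A minimal Python sketch of the modeling comparison described above (Random Forest versus linear regression for predicting gestational age, compared by cross-validated R²) is shown below; the HRV features and ages are synthetic stand-ins for the study's indices:

      # Compare Random Forest and linear regression for age prediction from
      # HRV-like features. Feature columns mimic (hypothetically) short-scale
      # complexity and skewness indices; the target mimics gestational age.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      n = 200
      hrv = rng.normal(size=(n, 5))                  # synthetic HRV indices
      ga = 28 + 2.0 * np.tanh(hrv[:, 0]) + hrv[:, 1] ** 2 + rng.normal(0, 0.5, n)

      for name, model in [("random forest", RandomForestRegressor(random_state=0)),
                          ("linear", LinearRegression())]:
          r2 = cross_val_score(model, hrv, ga, cv=5, scoring="r2").mean()
          print(f"{name}: mean cross-validated R^2 = {r2:.3f}")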

  9. Towards a methodology for validation of centrality measures in complex networks.

    Directory of Open Access Journals (Sweden)

    Komal Batool

    Full Text Available BACKGROUND: Living systems are associated with social networks - networks made up of nodes, some of which may be more important in various aspects than others. While different quantitative measures labeled as "centralities" have previously been used in the network analysis community to find influential nodes in a network, it is debatable how valid the centrality measures actually are. In other words, the research question that remains unanswered is: how exactly do these measures perform in the real world? So, as an example, if a centrality of a particular node identifies it to be important, is the node actually important? PURPOSE: The goal of this paper is not just to perform a traditional social network analysis but rather to evaluate different centrality measures by conducting an empirical study analyzing exactly how network centralities correlate with data from published multidisciplinary network data sets. METHOD: We take standard published network data sets while using a random network to establish a baseline. These data sets included the Zachary's Karate Club network, a dolphin social network, and a neural network of the nematode Caenorhabditis elegans. Each of the data sets was analyzed in terms of different centrality measures and compared with existing knowledge from associated published articles to review the role of each centrality measure in the determination of influential nodes. RESULTS: Our empirical analysis demonstrates that in the chosen network data sets, nodes which had a high Closeness Centrality also had a high Eccentricity Centrality. Likewise, high Degree Centrality correlated closely with a high Eigenvector Centrality. Betweenness Centrality varied according to network topology and did not demonstrate any noticeable pattern. In terms of identification of key nodes, we discovered that, as compared with other centrality measures, Eigenvector and Eccentricity Centralities were better able to identify important nodes.

  10. Towards a methodology for validation of centrality measures in complex networks.

    Science.gov (United States)

    Batool, Komal; Niazi, Muaz A

    2014-01-01

    Living systems are associated with social networks - networks made up of nodes, some of which may be more important in various aspects than others. While different quantitative measures labeled as "centralities" have previously been used in the network analysis community to find influential nodes in a network, it is debatable how valid the centrality measures actually are. In other words, the research question that remains unanswered is: how exactly do these measures perform in the real world? So, as an example, if a centrality of a particular node identifies it to be important, is the node actually important? The goal of this paper is not just to perform a traditional social network analysis but rather to evaluate different centrality measures by conducting an empirical study analyzing exactly how network centralities correlate with data from published multidisciplinary network data sets. We take standard published network data sets while using a random network to establish a baseline. These data sets included the Zachary's Karate Club network, a dolphin social network, and a neural network of the nematode Caenorhabditis elegans. Each of the data sets was analyzed in terms of different centrality measures and compared with existing knowledge from associated published articles to review the role of each centrality measure in the determination of influential nodes. Our empirical analysis demonstrates that in the chosen network data sets, nodes which had a high Closeness Centrality also had a high Eccentricity Centrality. Likewise, high Degree Centrality correlated closely with a high Eigenvector Centrality. Betweenness Centrality varied according to network topology and did not demonstrate any noticeable pattern. In terms of identification of key nodes, we discovered that, as compared with other centrality measures, Eigenvector and Eccentricity Centralities were better able to identify important nodes.
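
    The empirical comparison described in these two records is easy to reproduce in outline. The following Python sketch computes several of the centralities on the Zachary Karate Club network (one of the data sets used) and lists the top-ranked nodes per measure; eccentricity-based centrality is omitted because networkx has no built-in function for it:

      # Rank nodes of the Zachary Karate Club network by several centrality
      # measures and print the top three per measure, so the overlap between
      # measures can be inspected directly.
      import networkx as nx

      G = nx.karate_club_graph()
      centralities = {
          "degree": nx.degree_centrality(G),
          "closeness": nx.closeness_centrality(G),
          "betweenness": nx.betweenness_centrality(G),
          "eigenvector": nx.eigenvector_centrality(G),
      }

      for name, values in centralities.items():
          top = sorted(values, key=values.get, reverse=True)[:3]
          print(f"{name:12s} top nodes: {top}")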

  11. Improvements to TITAN's mass measurement and decay spectroscopy capabilities

    Science.gov (United States)

    Lascar, D.; Kwiatkowski, A. A.; Alanssari, M.; Chowdhury, U.; Even, J.; Finlay, A.; Gallant, A. T.; Good, M.; Klawitter, R.; Kootte, B.; Li, T.; Leach, K. G.; Lennarz, A.; Leistenschneider, E.; Mayer, A. J.; Schultz, B. E.; Schupp, R.; Short, D. A.; Andreoiu, C.; Dilling, J.; Gwinner, G.

    2016-06-01

    The study of nuclei farther from the valley of β-stability than ever before goes hand-in-hand with shorter-lived nuclei produced in smaller abundances than their less exotic counterparts. The measurement, to high precision, of nuclear masses therefore requires innovations in technique in order to keep up. TRIUMF's Ion Trap for Atomic and Nuclear science (TITAN) facility deploys three ion traps, with a fourth in the commissioning phase, to perform and support Penning trap mass spectrometry and in-trap decay spectroscopy on some of the shortest-lived nuclei ever studied. We report on recent advances and updates to the TITAN facility since the 2012 EMIS conference. TITAN's charge breeding capabilities have been improved and in-trap decay spectroscopy can be performed in TITAN's Electron Beam Ion Trap (EBIT). Higher charge states can improve the precision of mass measurements, reduce the beam-time requirements for a given measurement, improve beam purity, and open the door to access isotopes not available from the ISOL method via in-trap decay and recapture. This was recently demonstrated during TITAN's mass measurement of 30Al. The EBIT's decay spectroscopy setup was commissioned with a successful branching ratio and half-life measurement of 124Cs. Charge breeding in the EBIT increases the energy spread of the ion bunch sent to the Penning trap for mass measurement, so a new Cooler PEnning Trap (CPET), which aims to cool highly charged ions with an electron plasma, is undergoing offline commissioning. Already CPET has demonstrated the trapping and self-cooling of a room-temperature electron plasma that was stored for several minutes. A new detector has been installed inside the CPET magnetic field which will allow for in-magnet charged particle detection.

  12. H/L ratio as a measurement of stress in laying hens - methodology and reliability.

    Science.gov (United States)

    Lentfer, T L; Pendl, H; Gebhardt-Henrich, S G; Fröhlich, E K F; Von Borell, E

    2015-04-01

    Measuring the ratio of heterophils to lymphocytes (H/L) in response to different stressors is a standard tool for assessing long-term stress in laying hens, but detailed information on the reliability of measurements, measurement techniques and methods, and absolute cell counts is often lacking. Laying hens offered different sites of the nest boxes at different ages were compared in a two-treatment crossover experiment to provide detailed information on the measurement procedure and the difficulties in the interpretation of H/L ratios under commercial conditions. H/L ratios were pen-specific and depended on the age and aviary system. There was no effect of the position of the nest. Heterophil and lymphocyte counts were not correlated within individuals. Absolute cell counts differed in the numbers of heterophils and lymphocytes and in H/L ratios, whereas absolute leucocyte counts between individuals were similar. The reliability of the method using relative cell counts was good, yielding a correlation coefficient between double counts of r > 0.9. It was concluded that population-based reference values may not be sensitive enough to detect individual stress reactions, that the H/L ratio as an indicator of stress under commercial conditions may not be useful because of confounding factors, and that other, non-invasive measurements should be adopted.

  13. Doctoral Training in Statistics, Measurement, and Methodology in Psychology: Replication and Extension of Aiken, West, Sechrest, and Reno's (1990) Survey of PhD Programs in North America

    Science.gov (United States)

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD…

  15. Improved Measurement of Electron-antineutrino Disappearance at Daya Bay

    Energy Technology Data Exchange (ETDEWEB)

    Dwyer, D.A. [Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA (United States); Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2013-02-15

    With 2.5× the previously reported exposure, the Daya Bay experiment has improved the measurement of the neutrino mixing parameter sin²2θ₁₃ = 0.089 ± 0.010(stat) ± 0.005(syst). Reactor anti-neutrinos were produced by six 2.9 GWth commercial power reactors, and measured by six 20-ton target-mass detectors of identical design. A total of 234,217 anti-neutrino candidates were detected in 127 days of exposure. An anti-neutrino rate of 0.944 ± 0.007(stat) ± 0.003(syst) was measured by three detectors at a flux-weighted average distance of 1648 m from the reactors, relative to two detectors at 470 m and one detector at 576 m. Detector design and depth underground limited the background to 5 ± 0.3% (far detectors) and 2 ± 0.2% (near detectors) of the candidate signals. The improved precision confirms the initial measurement of reactor anti-neutrino disappearance, and continues to be the most precise measurement of θ₁₃.

  16. Improved Reconstruction of Dipole Directions from Spherical Magnetic Field Measurements

    CERN Document Server

    Gerhards, Christian

    2016-01-01

    Reconstructing magnetizations from measurements of the generated magnetic potential is highly non-unique. The matter of uniqueness can be improved, but not entirely resolved, by the assumption that the magnetization is locally supported. Here, we focus on the case that the magnetization is additionally assumed to be induced by an ambient magnetic dipole field, i.e., the task is to reconstruct the dipole direction as well as the susceptibility of the magnetic material. We investigate uniqueness issues and provide a reconstruction procedure from given magnetic potential measurements on a spherical surface.

  17. Measurements of air kerma index in computed tomography: a comparison among methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Alonso, T. C.; Mourao, A. P.; Da Silva, T. A., E-mail: alonso@cdtn.br [Universidade Federal de Minas Gerais, Programa de Ciencia y Tecnicas Nucleares, Av. Pres. Antonio Carlos 6627, Pampulha, 31270-901 Belo Horizonte, Minas Gerais (Brazil)

    2016-10-15

    Computed tomography (CT) has become the most important and widely used technique for diagnostic purposes. As CT exams impart high doses to patients in comparison with other radiological techniques, reliable dosimetry is required. Dosimetry in CT is done in terms of the air kerma index, in air or in a phantom, measured by a pencil ionization chamber during a single X-ray tube rotation. In this work, a comparison among CT dosimetric quantities measured by an UNFORS pencil ionization chamber, MTS-N RADOS thermoluminescent dosimeters and GAFCHROMIC XR-CT radiochromic film was done. The three dosimetric systems were properly calibrated in X-ray reference radiations in a calibration laboratory. CT dosimetric quantities were measured in a GE Medical Systems BrightSpeed CT scanner using a PMMA trunk phantom, and a comparison among the three dosimetric techniques was made. (Author)

  18. Methodologies of Measuring Mechanical Power Delivered at the Shaft of an Induction Motor Driven by VFD

    Directory of Open Access Journals (Sweden)

    Mariana MANEA

    2012-12-01

    Full Text Available Measuring the precise power used by a load of an induction motor driven by a VFD requires several considerations. First, the real electric power: when dealing with current waveforms that contain harmonics, traditional methods of power measurement can lead to inaccurate results, so further investigation is needed to provide meaningful values. Then there is the efficiency: motor losses must be taken into account to find out exactly how much power is being used for a specific application. This paper shows a method of measuring and calculating the real electric power of the fundamental harmonic and of extracting an actual value of the mechanical output power at the motor shaft. For this purpose we used a data acquisition system made of a basic power quality analyzer and data acquisition software. Harmonic analysis of the waveforms is considered, combined with the use of the true power factor.
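
    A minimal Python sketch of the core measurement idea (isolating the real power of the fundamental harmonic from distorted waveforms) is shown below. The waveforms are synthetic, and the motor-efficiency step needed to reach shaft power is not included:

      # Extract fundamental real power from sampled voltage/current waveforms
      # containing harmonics, via the FFT bin of the fundamental, and compare
      # it with the total real power mean(v * i). All signals are synthetic.
      import numpy as np

      fs, f1, n = 10_000, 50, 2_000                   # sample rate, fundamental, samples
      t = np.arange(n) / fs
      v = (325 * np.sin(2 * np.pi * f1 * t)           # ~230 Vrms fundamental
           + 20 * np.sin(2 * np.pi * 5 * f1 * t))     # 5th-harmonic voltage distortion
      i = (10 * np.sin(2 * np.pi * f1 * t - 0.5)      # fundamental current, lagging
           + 3 * np.sin(2 * np.pi * 5 * f1 * t))      # 5th-harmonic current

      V, I = np.fft.rfft(v) * 2 / n, np.fft.rfft(i) * 2 / n  # peak-amplitude phasors
      k = int(f1 * n / fs)                            # FFT bin of the fundamental
      p1 = 0.5 * np.real(V[k] * np.conj(I[k]))        # fundamental real power (W)
      p_total = np.mean(v * i)                        # total real power (W)
      print(f"fundamental real power: {p1:.1f} W")
      print(f"total real power:       {p_total:.1f} W")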

  19. The development and validation of a new multidimensional Job Insecurity Measure (JIM): an inductive methodology.

    Science.gov (United States)

    O'Neill, Patrick; Sevastos, Peter

    2013-07-01

    This study outlines the development and validation of a new 4-dimensional job insecurity measure (JIM). Items were generated from interviews with Australian employees facing an objective threat of job loss. The measure was then validated on a North American sample of 1,004 respondents. Exploratory and confirmatory factor analyses (EFA and CFA) with tests of sample invariance supported an 18-item measurement model consisting of 4 correlated but distinct subscales of insecurity: Job Loss, Job Changes, Marginalization, and Organizational Survival. The results indicate convergent and discriminant validity as well as high internal consistency for the instrument. Significant associations with psychological well-being (job related affective well-being, job satisfaction) and organizational attitudes (organizational commitment, trust in management, intention to resign) established the criterion-related validity of each subscale. These findings support the use of the instrument in academic and applied settings.

  20. Radio Weak Lensing Shear Measurement in the Visibility Domain - I. Methodology

    CERN Document Server

    Rivi, Marzia; Makhathini, Sphesihle; Abdalla, Filipe Batoni

    2016-01-01

    The high sensitivity of the new generation of radio telescopes such as the Square Kilometre Array (SKA) will allow cosmological weak lensing measurements at radio wavelengths that are competitive with optical surveys. We present an adaptation to radio data of "lensfit", a method for galaxy shape measurement originally developed and used for optical weak lensing surveys. This likelihood method uses an analytical galaxy model and makes a Bayesian marginalisation of the likelihood over uninteresting parameters. It has the feature of working directly in the visibility domain, which is the natural approach to adopt with radio interferometer data, avoiding systematics introduced by the imaging process. As a proof of concept, we provide results for visibility simulations of individual galaxies with flux density S ≥ 10 μJy at the phase centre of the proposed SKA1-MID baseline configuration, adopting 12 frequency channels in the band 950-1190 MHz. Weak lensing shear measurements from a population of galaxies with rea...

  1. Methodologically controlled variations in laboratory and field pH measurements in waterlogged soils

    DEFF Research Database (Denmark)

    Elberling, Bo; Matthiesen, Henning

    2007-01-01

    We have tested the reliability and consistency of conventional pH measurements made on water-soil mixtures with respect to sieving, drying, ratio of water to soil, and time of shaking prior to measurement. The focus is on a waterlogged soil where the preservation potential of archaeological remains is of concern; in situ measurements were made using a solid-state pH electrode pushed into the soil from the surface. Comparisons between in situ and laboratory methods revealed differences of more than 1 pH unit. The content of dissolved ions in soil solution and field observations of O2 and CO2 concentrations were used in the speciation model PHREEQE in order to predict gas exchange processes. Changes in pH in soil solution following equilibrium in the laboratory could be explained mainly by CO2 degassing. Only soil pH measured in situ, using either calomel or solid-state probes inserted directly into the soil, was not affected by gas exchange.

  2. Improving Higgs coupling measurements through ZZ Fusion at the ILC

    CERN Document Server

    Han, Tao; Qian, Zhuoni; Sayre, Joshua

    2015-01-01

    We evaluate the $e^- e^+ \\to e^- e^+ +h $ process through the $ZZ$ fusion channel at the International Linear Collider (ILC) operating at $500$ GeV and $1$ TeV center of mass energies. We perform realistic simulations on the signal process and background processes. With judicious kinematic cuts, we find that the inclusive cross section can be measured to $2.9\\%$ after combining the $500$ GeV at $500 ~\\text{fb}^{-1}$ and $1$ TeV at $1~ \\text{ab}^{-1}$ runs. A multivariate log-likelihood analysis further improves the precision of the cross section measurement to $2.3\\%$. We discuss the overall improvement to model-independent Higgs width and coupling determinations and demonstrate the use of different channels in distinguishing new physics effects in Higgs physics. Our study demonstrates the importance of the $ZZ$ fusion channel to Higgs precision physics, which has often been neglected in the literature.

  3. Transformation of the Paradigm for the Methodology of School Improvement Research

    Institute of Scientific and Technical Information of China (English)

    楚旋

    2012-01-01

    Throughout the history of school improvement research, its methodologies can be divided into three stages: positivism, hermeneutics, and critical theory. Since the 21st century, and with the arrival of the information technology era, the methodology of school improvement research has been transforming into a new methodology that integrates positivism, hermeneutics, and critical theory. In the future, the "Four Categories-Triple Methodology" paradigm should be adopted as the methodology of school improvement research, in order to enhance the quality of such research and to provide effective guidance for school improvement practice.

  4. Improved degree of polarization-based differential group delay measurements.

    Science.gov (United States)

    Pye, Jason; Yevick, David

    2014-06-01

    The time-averaged Stokes vectors obtained after polarization-scrambled light containing multiple, independently polarized frequency components traverses an optical fiber collectively form a surface in Stokes space. The geometry of this surface can be directly related to the polarization mode dispersion of the fiber. This paper examines both numerically and experimentally an improved method for performing such measurements. Additionally, it quantifies the surfaces associated with input pulses containing an arbitrary set of equally spaced frequencies.

  5. Improvement of bread dough quality by Bacillus subtilis SPB1 biosurfactant addition: optimized extraction using response surface methodology.

    Science.gov (United States)

    Mnif, Inès; Besbes, Souheil; Ellouze-Ghorbel, Raoudha; Ellouze-Chaabouni, Semia; Ghribi, Dhouha

    2013-09-01

    Statistically based experimental designs were applied to Bacillus subtilis SPB1 biosurfactant extraction. The extracted biosurfactant was tested as an additive in dough formulation. The Plackett-Burman screening method showed that methanol volume, agitation speed and operating temperature affect biosurfactant extraction. The effect was studied and adjusted using response surface methodology. The optimal values were identified as 5 mL methanol, 180 rpm and 25 °C, yielding predicted responses of 2.1 ± 0.06 for the purification factor and 87.47% ± 1.58 for the retention yield. A study of the incorporation of purified lipopeptide powder into the dough preparation, in comparison with a commercial surfactant (soya lecithin), reveals that SPB1 biosurfactant significantly improves the textural properties of dough (hardness, springiness, cohesion and adhesion), especially at 0.5 g kg⁻¹. At the same concentration (0.5 g kg⁻¹), the effect of SPB1 biosurfactant was more pronounced than that of soya lecithin. This biosurfactant also considerably enhanced the gas retention capacity in the course of fermentation. These results show that SPB1 biosurfactant could be of great interest to the bread-making industry. A method for the preparative extraction of lipopeptide biosurfactant with methanol as the extraction solvent has been effectively established. © 2013 Society of Chemical Industry.
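
    The response-surface step described above can be sketched with an ordinary least-squares fit of a second-order polynomial in the three screened factors. In the following Python sketch, all design points and yields are hypothetical illustrations, not data from the paper:

      # Fit a quadratic response surface (linear, squared, and interaction
      # terms) to yield data over methanol volume, agitation, and temperature,
      # then scan a coarse grid for a promising setting. Data are made up.
      import numpy as np

      X = np.array([[3, 150, 20], [5, 150, 20], [3, 210, 20], [5, 210, 20],
                    [3, 150, 30], [5, 150, 30], [3, 210, 30], [5, 210, 30],
                    [4, 180, 25], [5, 180, 25], [4, 150, 25], [4, 210, 25]])
      y = np.array([70.1, 78.3, 72.4, 79.0, 69.5, 77.2, 71.8, 78.6,
                    84.9, 86.0, 80.2, 81.5])         # extraction yield (%)

      def design_matrix(X):
          """Columns: 1, linear, squared, and pairwise interaction terms."""
          x1, x2, x3 = X.T
          return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                  x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

      coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
      grid = np.array([[v, s, t] for v in (3, 4, 5) for s in (150, 180, 210)
                       for t in (20, 25, 30)])
      pred = design_matrix(grid) @ coef
      best = grid[np.argmax(pred)]
      print(f"predicted optimum near: {best} (yield ~ {pred.max():.1f}%)")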

  6. Psychological Immunity Research to the Improvement of the Professional Teacher Training’s National Methodological and Training Development

    Directory of Open Access Journals (Sweden)

    Bredács Alice Mária

    2016-05-01

    Full Text Available In this study, we examine the role played by psychological immunity, its sub-factors, and its factor values in the lives of students taking part in professional training, in their performance at school, and in the improvability of the students' strengths and weaknesses. The target of the research is to renew the methodology of professional training by becoming more closely acquainted with the students of the new generations, since the new generation has changed and is still changing today. Their education and training is getting more and more difficult because we do not know them well enough. Teachers report that the knowledge of the students and the level of their education, mainly in the specialised secondary schools, is very low, with repeated failures and frequent non-attendance. Many of the students do not have any relevant prospect for the future after the specialised secondary school; they do not have any long-term targets. The teachers in the specialised secondary schools observe that students are disinterested, lack persistence, have very low control ability, decreasing EQ, and imperfect self-knowledge. All of these can be sources of conflict.

  7. Integrating patient satisfaction into performance measurement to meet improvement challenges.

    Science.gov (United States)

    Smith, J E; Fisher, D L; Endorf-Olson, J J

    2000-05-01

    A Value Compass has been proposed to guide health care data collection. The "compass corners" represent the four types of data needed to meet health care customer expectations: appropriate clinical outcomes, improved functional status, patient satisfaction, and appropriate costs. Collection of all four types of data is necessary to select processes in need of improvement, guide improvement teams, and monitor the success of improvement efforts. INTEGRATED DATA AT BRYANLGH: BryanLGH Medical Center in Lincoln, Nebraska, has adopted multiple performance measurement systems to collect clinical outcome, financial, and patient satisfaction data into integrated databases. Data integration allows quality professionals at BryanLGH to identify quality issues from multiple perspectives and track the interrelated effects of improvement efforts. A CASE EXAMPLE: Data from the fourth quarter of 1997 indicated the need to improve processes related to cesarean section (C-section) deliveries. An interdisciplinary team was formed, which focused on educating nurses, physicians, and the community about labor support measures. Physicians were given their own rates of C-section deliveries. The C-section rate decreased from 27% to 19%, but per-case cost increased. PickerPLUS+ results indicated that BryanLGH obstetric patients reported fewer problems with receiving information than the Picker norm, but they reported more problems with the involvement of family members and friends. The data collected so far have indicated a decrease in the C-section rate and a need to continue to work on cost and psychosocial issues. A complete analysis of results was facilitated by integrated performance management systems. Successes have been easily tracked over time, and the need for further work on related processes has been clearly identified.

  8. Minimizing measurement uncertainties of coniferous needle-leaf optical properties, part I: methodological review

    NARCIS (Netherlands)

    Yanez Rausell, L.; Schaepman, M.E.; Clevers, J.G.P.W.; Malenovsky, Z.

    2014-01-01

    Optical properties (OPs) of non-flat narrow plant leaves, i.e., coniferous needles, are extensively used by the remote sensing community, in particular for calibration and validation of radiative transfer models at leaf and canopy level. Optical measurements of such small living elements are, however, challenging.

  9. Nuclear methods in pulmonary medicine. Methodologic considerations in mucociliary clearance and lung epithelial absorption measurements

    Energy Technology Data Exchange (ETDEWEB)

    Dolovich, M.B.; Jordana, M.; Newhouse, M.

    1987-06-01

    Measurements of mucociliary clearance and lung epithelial permeability are relatively simple to perform, with minimum discomfort to the subjects. Awareness of the factors influencing the outcome of these procedures will help to avoid errors and yield useful information about these two clearance mechanisms from both a physiological and a pathological point of view.

  10. Reanalysis of traffic enforcement data from Victoria : a methodological study into the evaluation of safety measures.

    NARCIS (Netherlands)

    Oppe, S. & Bijleveld, F.D.

    2003-01-01

    There is an increased interest in the Netherlands in the safety effects of traffic enforcement measures. This regards the intermediate effect of enforcement on behaviour as well as the final effect on safety itself. In addition to these general safety effects, regional differences are also of

  12. Standardized Test Methodology for Measuring Pressure Suit Glove Performance and Demonstration Units.

    Science.gov (United States)

    1994-01-01

    Table-of-contents fragments list foam sensor and Tekscan ink sensor data for several runs. The report compares the foam sensor and Tekscan techniques for range-of-motion measurement; with the foam sensor (runs 1-3), the pressure was increased by 1 psi every 60 seconds up to …

  13. Improved clinical facility for in vivo nitrogen measurement.

    Science.gov (United States)

    Krishnan, S S; McNeill, K G; Mernagh, J R; Bayley, A J; Harrison, J E

    1990-04-01

    The design and construction of a hospital clinical facility for in vivo prompt gamma neutron activation analysis for total body nitrogen (TBN) measurement is described. The use of 252Cf neutron sources gives a better signal-to-background ratio than 238Pu-Be sources of equal strength, thus yielding better reproducibility of measurements. By measuring the hydrogen and nitrogen signals separately using appropriate gating circuits, the signal-to-background ratio is further improved. Measurements using a urea phantom (5.63 kg nitrogen as urea in 34.53 kg of water) show that 2 x 6 microgram 252Cf sources give a nitrogen signal-to-background ratio of 5.6 (compared with 3.4 for a 2 x 10 Ci 238Pu-Be source) and a reproducibility of +/- 1.1% (CV) for the nitrogen signal and +/- 2.33% (CV) for the hydrogen signal (internal standard). Approximately 30 minutes of the patient's time is required for each TBN measurement, with an estimated reproducibility of +/- 3.8% (CV). The radiation dose to the patient is about 0.2 mSv (effective dose equivalent; QF = 10) per 20 min measurement. A report for the clinician is produced within a few minutes of the measurement by a dedicated IBM-PC computer. The entire facility is clean and comfortable, and the electronics and computer processing are simple and economical.

  14. Improvements to TITAN's Mass Measurement and Decay Spectroscopy Capabilities

    CERN Document Server

    Lascar, D; Chowdhury, U; Finlay, A; Gallant, A T; Good, M; Klawitter, R; Kootte, B; Leach, K G; Lennarz, A; Leistenschneider, E; Schultz, B E; Schupp, R; Short, D A; Andreoiu, C; Dilling, J; Gwinner, G

    2015-01-01

    The study of nuclei farther from the valley of $\\beta$-stability goes hand-in-hand with shorter-lived nuclei produced in smaller abundances than their more stable counterparts. The high-precision measurement of nuclear masses therefore requires innovations in technique in order to keep up. TRIUMF's Ion Trap for Atomic and Nuclear science (TITAN) facility deploys three ion traps, with a fourth in the commissioning phase, to perform and support Penning trap mass spectrometry and in-trap decay spectroscopy on some of the shortest-lived nuclei ever studied. We report on recent advances and updates to the TITAN facility since the 2012 EMIS Conference. TITAN's charge breeding capabilities have been improved and in-trap decay spectroscopy can be performed in TITAN's electron beam ion trap (EBIT). Higher charge states can improve the precision of mass measurements, reduce the beam-time requirements for a given measurement, improve beam purity, and open the door to access, via in-trap decay and recapture, isotope...

  15. Metabolic tumour volumes measured at staging in lymphoma: methodological evaluation on phantom experiments and patients

    Energy Technology Data Exchange (ETDEWEB)

    Meignan, Michel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Paris-Est University, Service de Medecine Nucleaire, EAC CNRS 7054, Hopital Henri Mondor AP-HP, Creteil (France); Sasanelli, Myriam; Itti, Emmanuel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Casasnovas, Rene Olivier [CHU Le Bocage, Department of Hematology, Dijon (France); Luminari, Stefano [University of Modena and Reggio Emilia, Department of Diagnostic, Clinic and Public Health Medicine, Modena (Italy); Fioroni, Federica [Santa Maria Nuova Hospital-IRCCS, Department of Medical Physics, Reggio Emilia (Italy); Coriani, Chiara [Santa Maria Nuova Hospital-IRCCS, Department of Radiology, Reggio Emilia (Italy); Masset, Helene [Henri Mondor Hospital, Department of Radiophysics, Creteil (France); Gobbi, Paolo G. [University of Pavia, Department of Internal Medicine and Gastroenterology, Fondazione IRCCS Policlinico San Matteo, Pavia (Italy); Merli, Francesco [Santa Maria Nuova Hospital-IRCCS, Department of Hematology, Reggio Emilia (Italy); Versari, Annibale [Santa Maria Nuova Hospital-IRCCS, Department of Nuclear Medicine, Reggio Emilia (Italy)

    2014-06-15

    The presence of a bulky tumour at staging on CT is an independent prognostic factor in malignant lymphomas. However, its prognostic value is limited in diffuse disease. Total metabolic tumour volume (TMTV) determined on ¹⁸F-FDG PET/CT could give a better evaluation of the total tumour burden and may help patient stratification. Different methods of TMTV measurement established in phantoms simulating lymphoma tumours were investigated and validated in 40 patients with Hodgkin lymphoma and diffuse large B-cell lymphoma. Data were processed by two nuclear medicine physicians in Reggio Emilia and Creteil. Nineteen phantoms filled with ¹⁸F-saline were scanned; these comprised spherical or irregular volumes from 0.5 to 650 cm³ with tumour-to-background ratios from 1.65 to 40. Volumes were measured with different SUVmax thresholds. In patients, TMTV was measured on PET at staging by two methods: volumes of individual lesions were measured using a fixed 41 % SUVmax threshold (TMTV₄₁) and a variable visually adjusted SUVmax threshold (TMTVvar). In phantoms, the 41 % threshold gave the best concordance between measured and actual volumes. Interobserver agreement was almost perfect. In patients, the agreement between the reviewers for TMTV₄₁ measurement was substantial (ρc = 0.986, CI 0.97 - 0.99) and the difference between the means was not significant (212 ± 218 cm³ for Creteil vs. 206 ± 219 cm³ for Reggio Emilia, P = 0.65). By contrast the agreement was poor for TMTVvar. There was a significant direct correlation between TMTV₄₁ and normalized LDH (r = 0.652, CI 0.42 - 0.8, P < 0.001). Higher disease stages and bulky tumour were associated with higher TMTV₄₁, but high TMTV₄₁ could be found in patients with stage 1/2 or nonbulky tumour. Measurement of baseline TMTV in lymphoma using a fixed 41 % SUVmax threshold is reproducible and correlates with the other parameters for tumour mass evaluation.
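
    The fixed-threshold segmentation described above is straightforward to reproduce. Below is a minimal sketch, not code from the paper, that segments each lesion at 41 % of its own SUVmax and sums voxel volumes; the array layout, bounding masks, and voxel size are illustrative assumptions.

    ```python
    import numpy as np

    def tmtv_41(suv, lesion_masks, voxel_volume_cm3):
        """Total metabolic tumour volume at a fixed 41% SUVmax threshold.

        suv              : 3-D array of SUV values from the PET scan
        lesion_masks     : list of boolean arrays, one rough bounding mask per lesion
        voxel_volume_cm3 : volume of a single voxel in cm^3 (assumed known)
        """
        total = 0.0
        for mask in lesion_masks:
            suv_max = suv[mask].max()                   # lesion-specific SUVmax
            segmented = mask & (suv >= 0.41 * suv_max)  # fixed 41% threshold
            total += segmented.sum() * voxel_volume_cm3
        return total

    # Toy example: one synthetic 'lesion' in an otherwise empty volume
    suv = np.zeros((20, 20, 20)); suv[8:12, 8:12, 8:12] = 5.0
    mask = suv > 0                                      # bounding mask for the lesion
    print(tmtv_41(suv, [mask], voxel_volume_cm3=0.064))
    ```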

  16. Measuring domestic water use: A systematic review of methodologies that measure unmetered water use in low-income settings

    DEFF Research Database (Denmark)

    Tamason, Charlotte C.; Bessias, Sophia; Villada, Adriana

    2016-01-01

    Objective: To present a systematic review of methods for measuring domestic water use in settings where water meters cannot be used. Methods: We systematically searched EMBASE, PubMed, Water Intelligence Online, Water Engineering and Development Center, IEEExplore, Scielo, and Science Direct databases...

  17. Translating patient reported outcome measures: methodological issues explored using cognitive interviewing with three rheumatoid arthritis measures in six European languages

    NARCIS (Netherlands)

    Hewlett, Sarah E.; Nicklin, Joanna; Bode, Christina; Carmona, Loretto; Dures, Emma; Engelbrecht, Matthias; Hagel, Sofia; Kirwan, John R.; Molto, Anna; Redondo, Marta; Gossec, Laure

    2016-01-01

    Objective. Cross-cultural translation of patient-reported outcome measures (PROMs) is a lengthy process, often performed professionally. Cognitive interviewing assesses patient comprehension of PROMs. The objective was to evaluate the usefulness of cognitive interviewing to assess translations and …

  18. Radio Weak Lensing Shear Measurement in the Visibility Domain - I. Methodology

    Science.gov (United States)

    Rivi, M.; Miller, L.; Makhathini, S.; Abdalla, F. B.

    2016-08-01

    The high sensitivity of the new generation of radio telescopes such as the Square Kilometre Array (SKA) will allow cosmological weak lensing measurements at radio wavelengths that are competitive with optical surveys. We present an adaptation to radio data of lensfit, a method for galaxy shape measurement originally developed and used for optical weak lensing surveys. This likelihood method uses an analytical galaxy model and makes a Bayesian marginalisation of the likelihood over uninteresting parameters. It has the feature of working directly in the visibility domain, which is the natural approach to adopt with radio interferometer data, avoiding systematics introduced by the imaging process. As a proof of concept, we provide results for visibility simulations of individual galaxies with flux density S ≥ 10μJy at the phase centre of the proposed SKA1-MID baseline configuration, adopting 12 frequency channels in the band 950 - 1190 MHz. Weak lensing shear measurements from a population of galaxies with realistic flux and scalelength distributions are obtained after natural gridding of the raw visibilities. Shear measurements are expected to be affected by `noise bias': we estimate the bias in the method as a function of signal-to-noise ratio (SNR). We obtain additive and multiplicative bias values that are comparable to SKA1 requirements for SNR > 18 and SNR > 30, respectively. The multiplicative bias for SNR >10 is comparable to that found in ground-based optical surveys such as CFHTLenS, and we anticipate that similar shear measurement calibration strategies to those used for optical surveys may be used to good effect in the analysis of SKA radio interferometer data.
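
    The noise-bias figures quoted above rest on the standard parametrisation g_obs = (1 + m)·g_true + c, with m the multiplicative and c the additive bias. A hedged sketch of how such biases are typically estimated from simulations follows; the synthetic data and variable names are invented for illustration, not taken from the paper.

    ```python
    import numpy as np

    # Hypothetical simulation output: input shears and noisy measured shears
    rng = np.random.default_rng(0)
    g_true = rng.uniform(-0.05, 0.05, size=2000)
    g_meas = 1.02 * g_true - 1e-4 + rng.normal(0, 0.01, size=g_true.size)

    # Fit g_meas = (1 + m) * g_true + c with a straight line
    slope, c = np.polyfit(g_true, g_meas, 1)
    m = slope - 1.0
    print(f"multiplicative bias m = {m:.4f}, additive bias c = {c:.2e}")
    ```

    In practice the fit would be repeated in bins of SNR, as the abstract describes, to map how both biases grow as the signal weakens.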

  19. Radon-222 activity flux measurement using activated charcoal canisters: revisiting the methodology.

    Science.gov (United States)

    Alharbi, Sami H; Akber, Riaz A

    2014-03-01

    The measurement of radon ((222)Rn) activity flux using activated charcoal canisters was examined to investigate the distribution of the adsorbed (222)Rn in the charcoal bed and the relationship between (222)Rn activity flux and exposure time. The activity flux of (222)Rn from five sources of varying strengths was measured for exposure times of one, two, three, five, seven, 10, and 14 days. The distribution of the adsorbed (222)Rn in the charcoal bed was obtained by dividing the bed into six layers and counting each layer separately after the exposure. (222)Rn activity decreased in the layers that were away from the exposed surface. Nevertheless, the results demonstrated that only a small correction might be required in the actual application of charcoal canisters for activity flux measurement, where calibration standards were often prepared by the uniform mixing of radium ((226)Ra) in the matrix. This was because the diffusion of (222)Rn in the charcoal bed and the detection efficiency as a function of the charcoal depth tended to counterbalance each other. The influence of exposure time on the measured (222)Rn activity flux was observed in two situations of the canister exposure layout: (a) canister sealed to an open bed of the material and (b) canister sealed over a jar containing the material. The measured (222)Rn activity flux decreased as the exposure time increased. The change in the former situation was significant with an exponential decrease as the exposure time increased. In the latter case, lesser reduction was noticed in the observed activity flux with respect to exposure time. This reduction might have been related to certain factors, such as absorption site saturation or the back diffusion of (222)Rn gas occurring at the canister-soil interface.
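
    For orientation, a relation commonly used to convert the activity adsorbed on the charcoal into a flux, accounting for radioactive decay during a constant-flux exposure, is J = A·λ / (S·(1 − e^(−λt))). The sketch below applies it with invented inputs; it is a generic illustration, not the authors' calibration procedure.

    ```python
    import math

    RN222_LAMBDA = math.log(2) / (3.8235 * 24 * 3600)  # Rn-222 decay constant, s^-1

    def radon_flux(adsorbed_activity_bq, canister_area_m2, exposure_s):
        """Rn-222 activity flux (Bq m^-2 s^-1) from a charcoal canister,
        assuming a constant flux that is fully adsorbed during the exposure."""
        buildup = 1.0 - math.exp(-RN222_LAMBDA * exposure_s)
        return adsorbed_activity_bq * RN222_LAMBDA / (canister_area_m2 * buildup)

    # Example: 50 Bq collected on a 0.018 m^2 canister over a 3-day exposure
    print(radon_flux(50.0, 0.018, 3 * 24 * 3600))
    ```

    The exponential build-up term is exactly why the measured flux depends on exposure time when effects such as adsorption-site saturation or back diffusion violate the constant-flux assumption.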

  20. Explosive Strength of the Knee Extensors: The Influence of Criterion Trial Detection Methodology on Measurement Reproducibility

    Directory of Open Access Journals (Sweden)

    Dirnberger Johannes

    2016-04-01

    Full Text Available The present study was conducted to assess the test-retest reproducibility of explosive strength measurements during single-joint isometric knee extension using the IsoMed 2000 dynamometer. Thirty-one physically active male subjects (mean age: 23.7 years) were measured on two occasions separated by 48–72 h. The intraclass correlation coefficient (ICC 2,1) and the coefficient of variation (CV) were calculated for (i) maximum torque (MVC), (ii) the peak rate of torque development (RTDpeak), as well as for (iii) the average rate of torque development (RTD) and the impulse taken at several predefined time intervals (0–30 to 0–300 ms); thereby explosive strength variables were derived in two conceptually different versions: on the one hand from the MVC trial (version I), on the other hand from the trial showing the RTDpeak (version II). High ICC values (0.80–0.99) and acceptable CV values (1.9–8.7%) were found for MVC as well as for the RTD and the impulse taken at time intervals of ≥100 ms, regardless of whether version I or II was used. In contrast, measurements of the RTDpeak as well as the RTD and the impulse taken during the very early contraction phase (i.e. RTD/impulse 0–30 ms and 0–50 ms) showed clearly weaker reproducibility (ICC: 0.53–0.84; CV: 7.3–16.4%) and gave rise to considerable doubts as to clinical usefulness, especially when derived using version I. However, if there is a need to measure explosive strength for earlier time intervals in practice, it is, in view of the stronger reproducibility results, recommended to concentrate on measures derived from version II, which is based on the RTDpeak trial.
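
    For readers wishing to reproduce the reliability statistics, the sketch below computes ICC(2,1) (two-way random effects, absolute agreement, single measures, following Shrout and Fleiss) and a typical-error CV for a subjects-by-sessions matrix. The toy torque data are invented; the exact CV definition used in the paper may differ.

    ```python
    import numpy as np

    def icc_2_1(x):
        """ICC(2,1) for an (n subjects x k sessions) data matrix."""
        n, k = x.shape
        grand = x.mean()
        ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
        ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # sessions
        resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
        ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

    def within_subject_cv(x):
        """Typical-error CV (%): SD of test-retest differences / sqrt(2) / mean."""
        diff_sd = np.std(x[:, 1] - x[:, 0], ddof=1)
        return 100.0 * (diff_sd / np.sqrt(2)) / x.mean()

    torque = np.array([[231., 225.], [198., 204.], [255., 249.], [210., 216.]])
    print(icc_2_1(torque), within_subject_cv(torque))
    ```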

  2. Methodology to improve process understanding of surface runoff causing damages to buildings by analyzing insurance data records

    Science.gov (United States)

    Bernet, Daniel; Prasuhn, Volker; Weingartner, Rolf

    2015-04-01

    Several case studies in Switzerland highlight that many buildings damaged by floods are not located within the inundation zones of rivers, but outside the river network. In urban areas, such flooding can be caused by drainage system surcharge, low infiltration capacity of the urbanized landscape, etc. In rural and peri-urban areas, however, inundations are more likely caused by surface runoff formed on natural and arable land. Such flash floods have very short response times, occur rather diffusely and, thus, are very difficult to observe directly. In our approach, we use data records from private, but mostly from public, insurance companies. The latter, present in 19 out of the total 26 Cantons of Switzerland, insure (almost) every building within their respective administrative zones and, in addition, hold a monopoly position. Damage claims, including flood damages, are usually recorded, and thus data records from such public insurance companies are a very valuable source for better understanding surface runoff leading to damages. Although practitioners agree that this process is relevant, there seems to be a knowledge gap concerning the spatial and temporal distributions as well as the triggers and influencing factors of such damage events. Within the framework of a research project, we want to address this research gap and improve the understanding of the process chain from surface runoff formation up to possible damages to buildings. This poster introduces the methodology, which will be applied to a dataset including data from the majority of all 19 public insurance companies for buildings in Switzerland, counting over 50'000 damage claims, in order to better understand surface runoff. The goal is to infer spatial and temporal patterns as well as drivers and influencing factors of surface runoff possibly causing damages. In particular, the workflow of data acquisition, harmonization and treatment is outlined. Furthermore, associated problems and challenges are …

  3. Methodology developed for the simultaneous measurement of bone formation and bone resorption in rats based on the pharmacokinetics of fluoride.

    Science.gov (United States)

    Lupo, Maela; Brance, Maria Lorena; Fina, Brenda Lorena; Brun, Lucas Ricardo; Rigalli, Alfredo

    2015-01-01

    This paper describes a novel methodology for the simultaneous estimation of bone formation (BF) and bone resorption (BR) in rats using fluoride as a nonradioactive bone-seeking ion. The pharmacokinetics of fluoride have been extensively studied in rats and its constants have all been characterized. This knowledge was the cornerstone for the underlying mathematical model that we used to measure bone fluoride uptake and elimination rate after a dose of fluoride. Bone resorption and formation were estimated by bone fluoride uptake and elimination rate, respectively. ROC analysis showed that the sensitivity, specificity and area under the ROC curve were not different from those of deoxypyridinoline and bone alkaline phosphatase, well-known bone markers. Sprague-Dawley rats with modified bone remodelling (ovariectomy, hyper- and hypocalcic diets, antiresorptive treatment) were used to validate the values obtained with this methodology. The results for BF and BR obtained with this technique were as expected for each biological model. Although the method must be performed under general anesthesia, it has several advantages: simultaneous measurement of BR and BF, low cost, and the use of compounds with no expiration date.

  4. Fiber-Optic Temperature and Pressure Sensors Applied to Radiofrequency Thermal Ablation in Liver Phantom: Methodology and Experimental Measurements

    Directory of Open Access Journals (Sweden)

    Daniele Tosi

    2015-01-01

    Full Text Available Radiofrequency thermal ablation (RFA) is a procedure aimed at interventional cancer care and is applied to the treatment of small- and midsize tumors in lung, kidney, liver, and other tissues. RFA generates a selective high-temperature field in the tissue; temperature values and their persistency are directly related to the mortality rate of tumor cells. Temperature measurement at up to 3–5 points, using electrical thermocouples, belongs to the present clinical practice of RFA and is the foundation of a physical model of the ablation process. Fiber-optic sensors allow extending the detection of biophysical parameters to a vast plurality of sensing points, using miniature and noninvasive technologies that do not alter the RFA pattern. This work addresses the methodology for optical measurement of temperature distribution and pressure using four different fiber-optic technologies: fiber Bragg gratings (FBGs), linearly chirped FBGs (LCFBGs), a Rayleigh scattering-based distributed temperature system (DTS), and extrinsic Fabry-Perot interferometry (EFPI). For each instrument, the methodology for ex vivo sensing, as well as experimental results, is reported, leading towards the application of fiber-optic technologies in vivo. The possibility of using a fiber-optic sensor network, in conjunction with a suitable ablation device, can enable smart ablation procedures in which ablation parameters are dynamically adjusted.
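
    For context, converting an FBG's Bragg wavelength shift into a temperature change uses Δλ_B = λ_B(α + ξ)ΔT, where α and ξ are the thermal-expansion and thermo-optic coefficients. The sensitivity value below (~10 pm/°C near 1550 nm) is a typical textbook figure, not one reported in the paper; a real sensor would use its own calibration coefficient.

    ```python
    def fbg_delta_t(wavelength_shift_pm, sensitivity_pm_per_degc=10.0):
        """Temperature change from a fibre Bragg grating wavelength shift.

        A bare FBG near 1550 nm typically shifts about 10 pm per degree
        Celsius (thermo-optic plus thermal-expansion contributions); the
        exact coefficient must come from calibration of the actual sensor.
        """
        return wavelength_shift_pm / sensitivity_pm_per_degc

    # A hypothetical 312 pm shift observed during ablation -> ~31 degC rise
    print(fbg_delta_t(312.0))
    ```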

  5. Comparison of noise power spectrum methodologies in measurements by using various electronic portal imaging devices in radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Son, Soon Yong [Dept. of Radiological Technology, Wonkwang Health Science University, Iksan (Korea, Republic of); Choi, Kwan Woo [Dept. of Radiology, Asan Medical Center, Seoul (Korea, Republic of); Jeong, Hoi Woun [Dept. of Radiological Technology, Baekseok Culture University College, Cheonan (Korea, Republic of); Kwon, Kyung Tae [Dep. of Radiological Technology, Dongnam Health University, Suwon (Korea, Republic of); Kim, Ki Won [Dept. of Radiology, Kyung Hee University Hospital at Gang-dong, Seoul (Korea, Republic of); Lee, Young Ah; Son, Jin Hyun; Min, Jung Whan [Shingu University College, Sungnam (Korea, Republic of)

    2016-03-15

    The noise power spectrum (NPS) is one of the most general methods for measuring the noise amplitude and the quality of an image acquired from a uniform radiation field. The purpose of this study was to compare different NPS methodologies using megavoltage X-ray energies. The NPS evaluation methods used in diagnostic radiology were applied to therapy imaging using the International Electrotechnical Commission standard (IEC 62220-1). Various radiation therapy (RT) devices, namely TrueBeam (Varian), BEAMVIEW PLUS (Siemens), iViewGT (Elekta) and Clinac iX (Varian), were used. In order to measure the region of interest (ROI) of the NPS, we used the following four factors: the overlapping impact, the non-overlapping impact, the flatness, and the penumbra. As for the NPS results, iViewGT (Elekta) showed a higher noise amplitude than BEAMVIEW PLUS (Siemens), the TrueBeam (Varian) flattening filter mode, the Clinac iX aS1000 (Varian), and the TrueBeam (Varian) flattening-filter-free mode. The present study revealed that various factors could be employed to produce megavoltage imaging (MVI) of the NPS and to serve as a baseline standard for the control of NPS methodologies in MVI.
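
    The core of any NPS computation of this kind is ensemble-averaging the squared Fourier amplitudes of detrended ROIs taken from uniformly exposed images. The sketch below is a minimal illustration of that recipe with assumed array names; it uses simple mean subtraction as a stand-in for the 2-D polynomial detrend prescribed by IEC 62220-1.

    ```python
    import numpy as np

    def nps_2d(rois, pixel_pitch_mm):
        """2-D noise power spectrum from a stack of ROIs of shape (n, N, N)
        cut from uniformly exposed images: subtract each ROI's mean (a simple
        stand-in for the standard's polynomial detrend), then ensemble-average
        the squared FFT magnitudes and apply the pixel-area normalisation."""
        n, ny, nx = rois.shape
        detrended = rois - rois.mean(axis=(1, 2), keepdims=True)
        spectra = np.abs(np.fft.fft2(detrended)) ** 2   # FFT over last two axes
        return spectra.mean(axis=0) * pixel_pitch_mm ** 2 / (nx * ny)

    # Toy example: white-noise ROIs should give an approximately flat spectrum
    rois = np.random.default_rng(1).normal(0, 1, size=(64, 128, 128))
    nps = nps_2d(rois, pixel_pitch_mm=0.392)
    print(nps.mean())
    ```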

  6. Quantitative Developments of Biomolecular Databases, Measurement Methodology, and Comprehensive Transport Models for Bioanalytical Microfluidics

    Science.gov (United States)

    2006-10-01

    Fragments of the report body list microfluidic devices studied (D1: a traveling-wave DEP-based flow field fractionation device; D2: a sample stacking device; D3: a tunable laser …), describe electrodes on the bottom floor of the channel (shown in Figure 2.1) with a width of d2 and a gap of 2d1 between adjacent electrodes, energized by …, and cite references including work on interactions "between vitamin B12 binding proteins and cobalamins by surface plasmon resonance", Anal. Biochem. 305, 1-9, and Myszka, D. G., "Improving Biosensor …"

  7. Dual Wavelength Laser Writing and Measurement Methodology for High Resolution Bimetallic Grayscale Photomasks

    Science.gov (United States)

    Qarehbaghi, Reza

    Grayscale bimetallic photomasks consist of bi-layer thermal resists (bismuth-on-indium or tin-on-indium) which become controllably transparent when exposed to a focused laser beam, their optical density changing as a function of the absorbed power from ~3 OD (unexposed) during writing. This thesis investigates using two wavelength beams, separated from a multi-line argon-ion laser source, for mask writing (514.5 nm) and OD measurement (457.9 nm): a dual wavelength writing and measurement system. The writing laser profile was modified to a top-hat using a beam shaper. Several mask patterns tested the creation of high-resolution grayscale masks. Finally, for the creation of 3D structures in photoresist, the relationship between mask transparency and resist thickness requirements was formulated, and linear slope patterns were successfully created.

  8. New methodology for simultaneous volumetric and calorimetric measurements: Direct determination of αp and Cp for liquids under pressure

    Science.gov (United States)

    Casás, L. M.; Plantier, F.; Bessières, D.

    2009-12-01

    A new batch cell has been developed to measure simultaneously both isobaric thermal expansion and isobaric heat capacity from calorimetric measurements. The isobaric thermal expansion is directly proportional to the linear displacement of an inner flexible bellows, and the heat capacity is calculated from the calorimetric signal. The apparatus used was a commercial Setaram C-80 calorimeter, which together with this type of vessel can be operated up to 20 MPa and in the temperature range of 303.15-523.15 K. In this work, calibration was carried out using 1-hexanol, and subsequently both thermophysical properties were determined for 3-pentanol, 3-ethyl-3-pentanol, and 1-octanol at atmospheric pressure, 5 and 10 MPa, and from 303.15 to 423.15 K. Finally, the experimental values were compared with the literature in order to validate this new methodology, which allows a very accurate determination of isobaric thermal expansion and isobaric heat capacity.
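
    The generic relations behind such a measurement can be written compactly: the heat capacity follows from the differential heat-flow signal and the temperature scan rate, and the expansion coefficient from the bellows displacement. The sketch below shows these relations with invented numbers; it is not the authors' data-reduction code, and the calibration constants are assumptions.

    ```python
    def heat_capacity_j_per_g_k(heat_flow_w, baseline_w, mass_g, scan_rate_k_per_s):
        """Heat capacity from a heat-flux scanning calorimeter:
        Cp = (sample signal - empty-cell baseline) / (mass * scan rate)."""
        return (heat_flow_w - baseline_w) / (mass_g * scan_rate_k_per_s)

    def isobaric_expansion_per_k(displacement_m, bellows_area_m2,
                                 sample_volume_m3, delta_t_k):
        """alpha_p = (1/V) * (dV/dT)_p, with dV read from the linear
        displacement of the flexible bellows."""
        delta_v = displacement_m * bellows_area_m2
        return delta_v / (sample_volume_m3 * delta_t_k)

    # Invented numbers for illustration only
    print(heat_capacity_j_per_g_k(0.0105, 0.0021, 2.0, 0.002))      # ~2.1 J/(g K)
    print(isobaric_expansion_per_k(1.2e-4, 1.0e-4, 1.2e-5, 1.0))    # ~1e-3 1/K
    ```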

  9. A methodology to measure the effectiveness of academic recruitment and turnover

    DEFF Research Database (Denmark)

    Abramo, Giovanni; D’Angelo, Ciriaco Andrea; Rosati, Francesco

    2016-01-01

    We propose a method to measure the effectiveness of the recruitment and turnover of professors, in terms of their research performance. The method presented is applied to the case of Italian universities over the period 2008-2012. The work then analyses the correlation between the indicators of effectiveness used, and between the indicators and the universities' overall research performance. In countries that conduct regular national assessment exercises, the evaluation of effectiveness in recruitment and turnover could complement the overall research assessments. In particular, monitoring …

  10. The Study of Productivity Measurement and Incentive Methodology (Phase III - Paper Test). Volume 1

    Science.gov (United States)

    1986-03-14

    Fragments of the report state that the goal is to prepare an implementation report/manual that guides others in executing the recommendations and alternative approaches identified; that, whether the labor is machine, assembly and/or test labor, a contractor can submit an IMIP to automate a manual process for simple, average and/or complex machine shop work; and that the cost structure has created the need to address both production and non-production costs, so VAPO has therefore expanded the application of measurements …

  11. A novel methodology to measure methane bubble sizes in the water column

    Science.gov (United States)

    Hemond, H.; Delwiche, K.; Senft-Grupp, S.; Manganello, T.

    2014-12-01

    The fate of methane ebullition from lake sediments is dependent on initial bubble size. Rising bubbles are subject to dissolution, reducing the fraction of methane that ultimately enters the atmosphere while increasing concentrations of aqueous methane. Smaller bubbles not only rise more slowly, but dissolve more rapidly than larger bubbles. Thus, understanding methane bubble size distributions in the water column is critical to predicting atmospheric methane emissions from ebullition. However, current methods of measuring methane bubble sizes in situ are resource-intensive, typically requiring divers, video equipment, sonar, or hydroacoustic instruments. The complexity and cost of these techniques points to the strong need for a simple, autonomous device that can measure bubble size distributions and be deployed unattended over long periods of time. We describe a bubble sizing device that can be moored in the subsurface and can intercept and measure the size of bubbles as they rise. The instrument uses a novel optical measurement technique with infrared LEDs and IR-sensitive photodetectors combined with a custom-designed printed circuit board. An on-board microcomputer handles raw optical signals and stores the relevant information needed to calculate bubble volume. The electronics are housed within a pressure case fabricated from standard PVC fittings and are powered by size C alkaline batteries. The bill of materials cost is less than $200, allowing us to deploy multiple sensors at various locations within Upper Mystic Lake, MA. This novel device will provide information on how methane bubble sizes may vary both spatially and temporally. We present data from tests under controlled laboratory conditions and from deployments in Upper Mystic Lake.
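
    The abstract does not give the signal-to-volume conversion, but an optical gate of this kind can in principle yield a bubble's rise velocity from the transit time between two LED beams and a chord length from the occlusion time of one beam; assuming a near-spherical bubble, a volume follows. The sketch below is a generic illustration under those assumptions, not the authors' firmware.

    ```python
    import math

    def bubble_volume_ml(gate_separation_mm, transit_time_s, occlusion_time_s):
        """Estimate bubble volume from a hypothetical two-beam optical gate.

        rise velocity = beam separation / transit time between the two beams
        chord length  = rise velocity * occlusion time of a single beam
        volume        = sphere of that diameter (a strong simplification:
                        real bubbles are oblate and wobble as they rise)
        """
        velocity_mm_s = gate_separation_mm / transit_time_s
        diameter_mm = velocity_mm_s * occlusion_time_s
        return math.pi / 6 * diameter_mm ** 3 / 1000.0  # mm^3 -> mL

    # A bubble crossing 10 mm between beams in 40 ms, blocking one beam for 20 ms
    print(bubble_volume_ml(10.0, 0.040, 0.020))
    ```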

  12. Methodological Study on AMS Measurement of Ultra-trace Pu Isotope Ratios at CIAE

    Institute of Scientific and Technical Information of China (English)

    DONG; Ke-jun; ZHAO; Qing-zhang; WANG; Chen; HE; Ming; JIANG; Shan; ZHANG; Hui; PANG; Yi-jun; SHEN; Hong-tao; WANG; Xiao-ming; XU; Yong-ning; WU; Shao-yong; YANG; Xu-ran; WANG; Xiang-gao

    2015-01-01

    The determination of ultra-trace plutonium is very important in different fields. A new measurement method for plutonium isotopic ratios with accelerator mass spectrometry (AMS) was developed at the China Institute of Atomic Energy (CIAE). Two laboratory reference standards, with 239Pu/240Pu (ST1) and 239Pu/242Pu (ST2) ratios of 17.241 and 10.059, a flow blank, a commercial blank and three real samples were respectively …

  13. Parallel Measurements of Light Scattering and Characterization of Marine Particles in Water: An Evaluation of Methodology

    Science.gov (United States)

    2008-01-01

    Report fragments describe optical measurements spanning water types from the turbid estuary of the Tijuana River to clear oligotrophic ocean water, with additional field measurements to further expand the range of water types. As part of a NASA-funded project, coastal waters were sampled on a regular basis, including at … Beach and Scripps Pier, to examine temporal variability. A figure note states that two starred samples in panel A represent experiments with 100 μm microspheres, from which reliable concentration estimates …

  14. Bone mineral content measurement in small infants by single-photon absorptiometry: current methodologic issues

    Energy Technology Data Exchange (ETDEWEB)

    Steichen, J.J.; Asch, P.A.; Tsang, R.C.

    1988-07-01

    Single-photon absorptiometry (SPA), developed in 1963 and adapted for infants by Steichen et al. in 1976, is an important tool to quantitate bone mineralization in infants. Studies of infants in which SPA was used include studies of fetal bone mineralization and postnatal bone mineralization in very low birth weight infants. The SPA technique has also been used as a research tool to investigate longitudinal bone mineralization and to study the effect of nutrition and disease processes such as rickets or osteopenia of prematurity. At present, it has little direct clinical application for diagnosing bone disease in single patients. The bones most often used to measure bone mineral content (BMC) are the radius, the ulna, and, less often, the humerus. The radius appears to be preferred as a suitable bone to measure BMC in infants. It is easily accessible; anatomic reference points are easily palpated and have a constant relationship to the radial mid-shaft site; soft tissue does not affect either palpation of anatomic reference points or BMC quantitation in vivo. The peripheral location of the radius minimizes body radiation exposure. Trabecular and cortical bone can be measured separately. Extensive background studies exist on radial BMC in small infants. Most important, the radius has a relatively long zone of constant BMC. Finally, SPA for BMC in the radius has a high degree of precision and accuracy. 61 references.

  15. Improving Constraints on Proton Structure using CMS measurements

    CERN Document Server

    Ghosh, Saranya Samik

    2015-01-01

    Production of electroweak bosons, heavy quarks and jets in proton-proton collisions probes different aspects of QCD and is sensitive to the details of proton structure, expressed by parton distribution functions (PDFs). Precise measurements of the cross sections of these processes are used by the CMS experiment to demonstrate the impact of the LHC data on the PDFs and their precision. The measurements of the muon charge asymmetry in W-boson production at centre-of-mass energies of 7 and 8 TeV are used to improve the constraints on the valence-quark distributions, while the associated production of a W boson and a charm quark provides information on the s-quark distribution in the proton. Production of inclusive jets, as measured by CMS at a centre-of-mass energy of 7 TeV, provides important constraints on the gluon distribution.

  16. An improved measurement of muon antineutrino disappearance in MINOS

    CERN Document Server

    Adamson, P; Backhouse, C; Barr, G; Bishai, M; Blake, A; Bock, G J; Boehnlein, D J; Bogert, D; Cao, S V; Childress, S; Coelho, J A B; Corwin, L; Cronin-Hennessy, D; Danko, I Z; de Jong, J K; Devenish, N E; Diwan, M V; Escobar, C O; Evans, J J; Falk, E; Feldman, G J; Frohne, M V; Gallagher, H R; Gomes, R A; Goodman, M C; Gouffon, P; Graf, N; Gran, R; Grzelak, K; Habig, A; Hartnell, J; Hatcher, R; Himmel, A; Holin, A; Huang, X; Hylen, J; Irwin, G M; Isvan, Z; Jaffe, D E; James, C; Jensen, D; Kafka, T; Kasahara, S M S; Koizumi, G; Kopp, S; Kordosky, M; Kreymer, A; Lang, K; Ling, J; Litchfield, P J; Loiacono, L; Lucas, P; Mann, W A; Marshak, M L; Mathis, M; Mayer, N; Mehdiyev, R; Meier, J R; Messier, M D; Michael, D G; Miller, W H; Mishra, S R; Mitchell, J; Moore, C D; Mualem, L; Mufson, S; Musser, J; Naples, D; Nelson, J K; Newman, H B; Nichol, R J; Nowak, J A; Oliver, W P; Orchanian, M; Pahlka, R B; Paley, J; Patterson, R B; Pawloski, G; Phan-Budd, S; Plunkett, R K; Qiu, X; Radovic, A; Ratchford, J; Rebel, B; Rosenfeld, C; Rubin, H A; Sanchez, M C; Schneps, J; Schreckenberger, A; Schreiner, P; Sharma, R; Sousa, A; Strait, M; Tagg, N; Talaga, R L; Thomas, J; Thomson, M A; Tinti, G; Toner, R; Torretta, D; Tzanakos, G; Urheim, J; Vahle, P; Viren, B; Walding, J J; Weber, A; Webb, R C; White, C; Whitehead, L; Wojcicki, S G; Zwaska, R

    2012-01-01

    We report an improved measurement of muon antineutrino disappearance over a distance of 735 km using the MINOS detectors and the Fermilab Main Injector neutrino beam in a muon-antineutrino-enhanced configuration. From a total exposure of 2.95×10²⁰ protons on target, of which 42% have not been previously analyzed, we make the most precise measurement of the antineutrino "atmospheric" mass splitting, Δm² = [2.62 +0.31/−0.28 (stat.) ± 0.09 (syst.)] × 10⁻³ eV², and constrain the antineutrino atmospheric mixing angle sin²(2θ) > 0.75 (90% C.L.). These values are in agreement with those measured for muon neutrinos, removing the tension reported previously.

  17. Improved measurement of muon antineutrino disappearance in MINOS.

    Science.gov (United States)

    Adamson, P; Ayres, D S; Backhouse, C; Barr, G; Bishai, M; Blake, A; Bock, G J; Boehnlein, D J; Bogert, D; Cao, S V; Childress, S; Coelho, J A B; Corwin, L; Cronin-Hennessy, D; Danko, I Z; de Jong, J K; Devenish, N E; Diwan, M V; Escobar, C O; Evans, J J; Falk, E; Feldman, G J; Frohne, M V; Gallagher, H R; Gomes, R A; Goodman, M C; Gouffon, P; Graf, N; Gran, R; Grzelak, K; Habig, A; Hartnell, J; Hatcher, R; Himmel, A; Holin, A; Huang, X; Hylen, J; Irwin, G M; Isvan, Z; Jaffe, D E; James, C; Jensen, D; Kafka, T; Kasahara, S M S; Koizumi, G; Kopp, S; Kordosky, M; Kreymer, A; Lang, K; Ling, J; Litchfield, P J; Loiacono, L; Lucas, P; Mann, W A; Marshak, M L; Mathis, M; Mayer, N; Mehdiyev, R; Meier, J R; Messier, M D; Michael, D G; Miller, W H; Mishra, S R; Mitchell, J; Moore, C D; Mualem, L; Mufson, S; Musser, J; Naples, D; Nelson, J K; Newman, H B; Nichol, R J; Nowak, J A; Oliver, W P; Orchanian, M; Pahlka, R B; Paley, J; Patterson, R B; Pawloski, G; Phan-Budd, S; Plunkett, R K; Qiu, X; Radovic, A; Ratchford, J; Rebel, B; Rosenfeld, C; Rubin, H A; Sanchez, M C; Schneps, J; Schreckenberger, A; Schreiner, P; Sharma, R; Sousa, A; Strait, M; Tagg, N; Talaga, R L; Thomas, J; Thomson, M A; Tinti, G; Toner, R; Torretta, D; Tzanakos, G; Urheim, J; Vahle, P; Viren, B; Walding, J J; Weber, A; Webb, R C; White, C; Whitehead, L; Wojcicki, S G; Zwaska, R

    2012-05-11

    We report an improved measurement of ν̄μ disappearance over a distance of 735 km using the MINOS detectors and the Fermilab Main Injector neutrino beam in a ν̄μ-enhanced configuration. From a total exposure of 2.95×10²⁰ protons on target, of which 42% have not been previously analyzed, we make the most precise measurement of the antineutrino Δm² = [2.62 +0.31/−0.28 (stat) ± 0.09 (syst)] × 10⁻³ eV² and constrain the ν̄μ mixing angle sin²(2θ) > 0.75 (90% C.L.). These values are in agreement with the Δm² and sin²(2θ) measured for νμ, removing the tension reported in [P. Adamson et al. (MINOS), Phys. Rev. Lett. 107, 021801 (2011)].

  18. In situ measurement of heavy metals in water using portable EDXRF and APDC pre-concentration methodology

    Energy Technology Data Exchange (ETDEWEB)

    Melquiades, Fabio L. [Universidade Estadual do Centro-Oeste, Guarapuava, PR (Brazil). Dept. de Fisica], E-mail: fmelquiades@unicentro.br; Parreira, Paulo S.; Appoloni, Carlos R.; Silva, Wislley D.; Lopes, Fabio [Universidade Estadual de Londrina (UEL), PR (Brazil). Dept. de Fisica], E-mail: parreira@uel.br, E-mail: appoloni@uel.br

    2007-07-01

    With the objective of identifying and quantifying metals in water and obtaining results at the sampling place, an Energy Dispersive X-Ray Fluorescence (EDXRF) methodology with portable equipment was employed. In this work, metal concentration results are presented for water samples from two points of Londrina city. The analyses were performed in situ, measuring in natura water and samples pre-concentrated on membranes. The work consisted of the use of a portable X-ray tube to excite the samples and a Si-PIN detector with the standard data acquisition electronics to register the spectra. The samples were filtered through membranes for suspended particulate matter retention. After this, an APDC precipitation methodology was applied for sample pre-concentration, with posterior filtering through membranes. For in natura samples, total iron concentrations of 254 ± 30 mg L⁻¹ in the Capivara River and 63 ± 9 mg L⁻¹ at Igapo Lake were found. For membrane measurements, the results for particulate suspended matter in the Capivara River were, in mg L⁻¹: 31.0 ± 2.5 (Fe), 0.17 ± 0.03 (Cu) and 0.93 ± 0.08 (Pb), and for dissolved iron 0.038 ± 0.004. For Igapo Lake, only Fe was quantified: 1.66 ± 0.19 mg L⁻¹ for particulate suspended iron and 0.79 ± 0.11 mg L⁻¹ for dissolved iron. In 4 h of work in the field it was possible to filter 14 membranes and measure around 16 samples. The performance of the equipment was very good and the results are satisfactory for in situ measurements employing a portable instrument. (author)

  19. Improving measurement technology for the design of sustainable cities

    Science.gov (United States)

    Pardyjak, Eric R.; Stoll, Rob

    2017-09-01

    This review identifies and discusses measurement technology gaps that are currently preventing major science leaps from being realized in the study of urban environmental transport processes. These scientific advances are necessary to better understand the links between atmospheric transport processes in the urban environment, human activities, and potential management strategies. We propose that with various improved and targeted measurements, it will be possible to provide technically sound guidance to policy and decision makers for the design of sustainable cities. This review focuses on full-scale in situ and remotely sensed measurements of atmospheric winds, temperature, and humidity in cities and links measurements to current modeling and simulation needs. A key conclusion of this review is that there is a need for urban-specific measurement techniques including measurements of highly-resolved three-dimensional fields at sampling frequencies high enough to capture small-scale turbulence processes yet also capable of covering spatial extents large enough to simultaneously capture key features of urban heterogeneity and boundary layer processes while also supporting the validation of current and emerging modeling capabilities.

  20. An improved measurement model of binocular vision using geometrical approximation

    Science.gov (United States)

    Wang, Qiyue; Wang, Zhongyu; Yao, Zhenjian; Forrest, Jeffrey; Zhou, Weihu

    2016-12-01

    In order to improve the precision of a binocular vision measurement system, an effective binocular vision measurement method, named geometrical approximation, is proposed. This method optimizes the measurement results through a geometrical approximation operation based on the principles of optimization theory and spatial geometry. To evaluate the properties of the proposed method, both simulated and practical experiments were carried out, and the influence of image noise and focal length error on the measurement results is discussed. The results show that the measurement performance of the proposed method is good. The proposed method was also compared with bundle adjustment and the least squares method in a practical experiment. The experimental results indicate that the average error calculated using the proposed method is 0.076 mm, less than bundle adjustment's 0.085 mm and only half of the least squares method's 0.146 mm. At the same time, the proposed method enjoys a high level of computational efficiency compared to bundle adjustment: since no nonlinear iterative optimization is involved, it can be applied readily to real-time on-line measurements.
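
    The paper's geometrical approximation is not spelled out in the abstract, but the baseline it improves on can be illustrated: with noisy image points, the two back-projected camera rays do not intersect, and the classic geometric estimate of the 3D point is the midpoint of their common perpendicular. A sketch with assumed camera geometry:

    ```python
    import numpy as np

    def midpoint_triangulation(c1, d1, c2, d2):
        """Midpoint of the common perpendicular between two skew rays.

        c1, c2 : camera centres; d1, d2 : unit ray direction vectors.
        Solves for parameters s, t of the closest points p1 = c1 + s*d1
        and p2 = c2 + t*d2, then averages p1 and p2.
        """
        b = c2 - c1
        a = np.array([[d1 @ d1, -(d1 @ d2)],
                      [d1 @ d2, -(d2 @ d2)]])
        s, t = np.linalg.solve(a, np.array([b @ d1, b @ d2]))
        p1, p2 = c1 + s * d1, c2 + t * d2
        return (p1 + p2) / 2.0

    # Hypothetical stereo rig with a 120 mm baseline along the x-axis
    c1 = np.array([0.0, 0.0, 0.0]); c2 = np.array([120.0, 0.0, 0.0])
    d1 = np.array([0.1, 0.02, 1.0]); d2 = np.array([-0.1, 0.02, 1.0])
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    print(midpoint_triangulation(c1, d1, c2, d2))
    ```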

  1. Evaluation of NaCl Effect on Vibration-Delaminated Metal-Polymer Composites by Improved Micro-Raman Methodology

    Directory of Open Access Journals (Sweden)

    E. Zumelzu

    2013-01-01

    Full Text Available Polyethylene terephthalate (PET) is a polymer coating that protects the electrolytic chromium coated steel (ECCS) against aggressive electrolytes like NaCl. It is widely accepted by manufacturers that NaCl has no effect on the PET coating, which is inert. However, we showed that there are some effects at the structural level, caused by vibrations and facilitated by defects in the layers. The vibrations occurring during the transportation of food containers produce delaminations at given points of the metal-polymer interface, known as antinodes, which in turn may produce PET degradation affecting food quality. The former can be determined by electrochemical measurements, and the changes in composition or structural order can be characterized by Raman spectroscopy. The present work applied this latter technique to experimental samples of PET-coated ECCS sheets by performing analyses perpendicular and parallel to the surface, and determined that it constitutes a promising new methodology for assessing the behavior of the composite under the above conditions. The results demonstrated that the delamination areas on the PET facilitated polymer degradation by the electrolyte. Moreover, the Raman characterization evidenced the presence of multilayers and crystalline orderings, which limited its functionality as a protective coating.

  2. Measuring gross and net calcification of a reef coral under ocean acidification conditions: methodological considerations

    Directory of Open Access Journals (Sweden)

    S. Cohen

    2012-07-01

    Full Text Available Ongoing ocean acidification (OA) is rapidly altering carbonate chemistry in the oceans. The projected changes will likely have deleterious consequences for coral reefs by negatively affecting their growth. Nonetheless, the diverse calcification responses of reef-building corals to OA hinder our ability to decipher reef susceptibility to elevated pCO2. Some of the inconsistencies between studies originate in measuring net calcification (NC), which does not always consider the proportions of the "real" (gross) calcification (GC) and gross dissolution in the observed response. Here we show that microcolonies of Stylophora pistillata (entirely covered by tissue), incubated under normal (8.2) and reduced (7.6) pH conditions for 16 months, survived and added new skeletal CaCO3 despite low Ωarg (1.25) conditions. Moreover, corals maintained their NC and GC rates under reduced (7.6) pH conditions and displayed positive NC rates at the low-end (7.3) pH treatment, while bare coral skeleton underwent marked dissolution. Our findings suggest that S. pistillata may fall into the "low sensitivity" group with respect to OA and that their overlying tissue may be a key determinant in setting their tolerance to reduced pH by limiting dissolution and allowing them to calcify. This study is the first to measure GC and NC rates for a tropical scleractinian coral under OA conditions. We provide a detailed, realistic assessment of the problematic nature of previously accepted methods for measuring calcification (total alkalinity and ⁴⁵Ca).
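
    For context on the total alkalinity technique the authors re-examine: net calcification is conventionally derived from the alkalinity anomaly, since each mole of CaCO3 precipitated lowers total alkalinity by two moles. A generic sketch of that conversion follows, with invented inputs rather than the study's data.

    ```python
    def net_calcification_umol_cm2_h(ta_start_umol_kg, ta_end_umol_kg,
                                     seawater_mass_kg, hours, area_cm2):
        """Alkalinity-anomaly estimate of net calcification: each mole of
        CaCO3 precipitated removes two moles of total alkalinity (TA)."""
        delta_ta = ta_start_umol_kg - ta_end_umol_kg   # positive if TA fell
        caco3_umol = 0.5 * delta_ta * seawater_mass_kg
        return caco3_umol / (hours * area_cm2)

    # A 60 umol/kg TA drawdown in 0.5 kg of seawater over 2 h,
    # normalised to 20 cm^2 of coral surface
    print(net_calcification_umol_cm2_h(2400.0, 2340.0, 0.5, 2.0, 20.0))
    ```

    Because this yields only the net of gross calcification minus gross dissolution, it cannot by itself separate the two processes, which is precisely the limitation the study addresses.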

  3. Orientation Uncertainty of Structures Measured in Cored Boreholes: Methodology and Case Study of Swedish Crystalline Rock

    Science.gov (United States)

    Stigsson, Martin

    2016-11-01

    Many engineering applications in fractured crystalline rocks use measured orientations of structures such as rock contact and fractures, and lineated objects such as foliation and rock stress, mapped in boreholes as their foundation. Despite that these measurements are afflicted with uncertainties, very few attempts to quantify their magnitudes and effects on the inferred orientations have been reported. Only relying on the specification of tool imprecision may considerably underestimate the actual uncertainty space. The present work identifies nine sources of uncertainties, develops inference models of their magnitudes, and points out possible implications for the inference on orientation models and thereby effects on downstream models. The uncertainty analysis in this work builds on a unique data set from site investigations, performed by the Swedish Nuclear Fuel and Waste Management Co. (SKB). During these investigations, more than 70 boreholes with a maximum depth of 1 km were drilled in crystalline rock with a cumulative length of more than 34 km including almost 200,000 single fracture intercepts. The work presented, hence, relies on orientation of fractures. However, the techniques to infer the magnitude of orientation uncertainty may be applied to all types of structures and lineated objects in boreholes. The uncertainties are not solely detrimental, but can be valuable, provided that the reason for their presence is properly understood and the magnitudes correctly inferred. The main findings of this work are as follows: (1) knowledge of the orientation uncertainty is crucial in order to be able to infer correct orientation model and parameters coupled to the fracture sets; (2) it is important to perform multiple measurements to be able to infer the actual uncertainty instead of relying on the theoretical uncertainty provided by the manufacturers; (3) it is important to use the most appropriate tool for the prevailing circumstances; and (4) the single most

  4. Conceptual and methodological challenges to measuring political commitment to respond to HIV

    Directory of Open Access Journals (Sweden)

    Fox Ashley M

    2011-09-01

    Full Text Available Background: Researchers have long recognized the importance of a central government's political "commitment" in order to mount an effective response to HIV. The concept of political commitment remains ill-defined, however, and little guidance has been given on how to measure this construct and its relationship with HIV-related outcomes. Several countries have experienced declines in HIV infection rates, but conceptual difficulties arise in linking these declines to political commitment as opposed to underlying social and behavioural factors. Methods: This paper first presents a critical review of the literature on existing efforts to conceptualize and measure political commitment to respond to HIV and the linkages between political commitment and HIV-related outcomes. Based on the elements identified in this review, the paper then develops and presents a framework to assist researchers in making choices about how to assess a government's level of political commitment to respond to HIV and how to link political commitment to HIV-related outcomes. Results: The review of existing studies identifies three components of commitment (expressed, institutional and budgetary commitment) as different dimensions along which commitment can be measured. The review also identifies normative and ideological aspects of commitment and a set of variables that mediate and moderate political commitment that need to be accounted for in order to draw valid inferences about the relationship between political commitment and HIV-related outcomes. The framework summarizes a set of steps that researchers can follow in order to assess a government's level of commitment to respond to HIV and suggests ways to apply the framework to country cases. Conclusions: Whereas existing studies have adopted a limited and often ambiguous conception of political commitment, we argue that conceiving of political commitment along a greater number of dimensions will allow …

  5. An evaluation of advantages and cost measurement methodology for leasing in the health care industry.

    Science.gov (United States)

    Henry, J B; Roenfeldt, R L

    1977-01-01

    Lease financing in hospitals is growing rapidly. Many articles published on the topic of lease financing point only to the benefits that may be derived. Very few articles actually analyze the pros and cons of leasing from a financial cost measurement point of view, which includes real world parameters. This article critically evaluates two articles published in this issue which lead the reader to believe leasing for the most part is a bargain when compared to debt financing. The authors discuss some misconceptions in these articles and point out some facts viewed from a financial analyst's position.

  6. Complete methodology on generating realistic wind speed profiles based on measurements

    DEFF Research Database (Denmark)

    Gavriluta, Catalin; Spataru, Sergiu; Mosincat, Ioan;

    2012-01-01

    The wind speed represents the main exogenous signal applied to a Wind Energy Conversion System (WECS) and determines its behavior. The erratic variation of the wind speed, highly dependent on the given site and on the atmospheric conditions, makes the wind speed quite difficult to model. Moreover, wind modelling for medium and large time scales is poorly treated in the present literature. This paper presents methods for generating realistic wind speed profiles based on real measurements. The wind speed profile is divided into a low-frequency component (describing long-term variations) …
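
    A minimal version of the two-component decomposition described above can be sketched as follows: a slowly varying mean (here interpolated from assumed ten-minute averages) plus a high-frequency turbulent part generated as an AR(1) process scaled by a turbulence intensity. All parameter values and function names are illustrative, not the paper's model.

    ```python
    import numpy as np

    def synth_wind(mean_10min, dt=1.0, ti=0.12, tau=30.0, seed=0):
        """Wind speed series = low-frequency profile + AR(1) turbulence.

        mean_10min : measured 10-minute mean wind speeds (m/s)
        dt         : output sample interval (s)
        ti         : turbulence intensity (std / mean), assumed value
        tau        : correlation time of the turbulent fluctuations (s)
        """
        rng = np.random.default_rng(seed)
        n = int(len(mean_10min) * 600 / dt)
        t_means = np.arange(len(mean_10min)) * 600.0
        low = np.interp(np.arange(n) * dt, t_means, mean_10min)  # slow component
        phi = np.exp(-dt / tau)                                  # AR(1) coefficient
        eps = rng.normal(0, np.sqrt(1 - phi ** 2), n)
        g = np.zeros(n)
        for i in range(1, n):
            g[i] = phi * g[i - 1] + eps[i]                       # unit-variance AR(1)
        return np.clip(low * (1 + ti * g), 0, None)              # no negative speeds

    speeds = synth_wind(np.array([6.0, 7.5, 7.0, 8.2]))
    print(speeds[:5], speeds.mean())
    ```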

  7. Proposal of methodology and test protocol for evaluating and qualifying pH measuring devices

    Directory of Open Access Journals (Sweden)

    Niza Helena de Almeida

    2006-01-01

    Full Text Available We present a proposal for evaluating and qualifying pH measuring devices based on the requirements of relevant standards. The proposal is based on ASTM E70, NBR 7353, JIS Z 8805, BS 3145, DIN 19268, NBR ISO 17025 and other standards, as well as the results of field research carried out with professionals performing pH measurements in public health laboratories. Evaluation is performed by means of a form which records data from the measuring system. The form gives acceptable variations in the parameters being tested and allows a conclusion to be reached regarding the acceptability of the system. Using the proposed protocol allows the definition of suitable analysis criteria, while taking into account the influences to which pH measurement is subject and the need for correct results. This is particularly true when analyzing products already on the market, thus underlining the protocol's importance to the public health area.

  8. Surface strains induced by measured loads on teeth in vivo: a methodological study.

    Science.gov (United States)

    Nohl, F S; Setchell, D J

    2000-03-01

    Visual feedback enabled three subjects to apply predetermined near-axial loads to the incisal edge of an intact maxillary central incisor. In two subjects, principal strains and orientations developed on the labial surface of the intact incisor were resolved from strains recorded with a multiple element strain gauge. Load application was accurate and precise enough to allow resolution of strains induced by target loads of 10 to 50 N. Axially orientated compressive labial surface strains were induced by measured loads. The method could be used to validate bench-top stress analyses and investigate the effects of restoration on the structural integrity of teeth.

  9. A Device and Methodology for Measuring Repetitive Lifting VO2max (Oxygen Consumption Rate)

    Science.gov (United States)

    1987-08-01

    Fragments of the report's references and front matter: a skinfold-thickness body-composition study of 481 men and women aged 16 to 72 years (Br J Nutr 32:77-92); Intaranont K, Ayoub MM, …; and table-of-contents entries including figures on repetitive lifting exercise, repetitive lifting device specifications, and subject sample descriptive data.

  10. Methodology and calibration for continuous measurements of biogeochemical trace gas and O2 concentrations from a 300-m tall tower in central Siberia

    Directory of Open Access Journals (Sweden)

    E. A. Kozlova

    2009-05-01

    Full Text Available We present an integrated system for measuring atmospheric concentrations of CO2, O2, CH4, CO, and N2O in central Siberia. Our project aims to demonstrate the feasibility of establishing long-term, continuous, high-precision atmospheric measurements to elucidate greenhouse gas processes from a very remote, mid-continental boreal environment. Air is sampled from five heights on a custom-built 300-m tower. Features common to all species' measurements include the air intakes, air drying system, flushing procedures, and data processing methods. Calibration standards are shared among all five measured species by extending and optimising a proven methodology for long-term O2 calibration. Our system meets the precision and accuracy requirements specified by the European Union's "CarboEurope" and "ICOS" (Integrated Carbon Observing System) programmes for CO2, O2, and CH4, while CO and N2O require some further improvements. We found that it is not possible to achieve these high-precision measurements without skilled technical assistance on-site, primarily because of 2–3 month delays in access to data and diagnostic information. We present results on the stability of reference standards in high-pressure cylinders. We also found that some previous methods do not mitigate fractionation of O2 in a sample airstream to a satisfactory level.

  11. ISD: A New Methodological Approach for Measuring the Sustainability of the German Energy System

    Directory of Open Access Journals (Sweden)

    Holger Schlör

    2011-01-01

    Full Text Available The research community has developed three main concepts and indicator systems to measure sustainability: the capital concept, the ecological concept and the multidimensional concept. Whereas much research has been dedicated to the pros and cons of the three/four-pillar sustainability concept and to the shaping of the pillars and their indicators, research on standardized methods for aggregating the indicators into one index is lacking. However, a useful model exists: the GDP, which summarizes the different economic activities of various social actors in one index. An overall sustainability index has the advantage that the sustainability of a system can be expressed in a single number, allowing the sustainability status of a system to be communicated more effectively to the public and to politicians. Against this background, we developed the Index of Sustainable Development (ISD) to measure the sustainability of systems described by multidimensional sustainability concepts. We demonstrate that it is possible to aggregate the sustainability indicators of multidimensional sustainability concepts into one index. As an example, we chose the energy indicators within the German sustainability strategy, because of the importance of the energy sector and the good statistical database in this sector.

  13. A methodological evaluation of volumetric measurement techniques including three-dimensional imaging in breast surgery.

    Science.gov (United States)

    Hoeffelin, H; Jacquemin, D; Defaweux, V; Nizet, J L

    2014-01-01

    Breast surgery currently remains very subjective and each intervention depends on the ability and experience of the operator. To date, no objective measurement of this anatomical region can codify surgery. In this light, we wanted to compare and validate a new technique for 3D scanning (LifeViz 3D) and its clinical application. We tested the use of the 3D LifeViz system (Quantificare) to perform volumetric calculations in various settings (in situ in cadaveric dissection, of control prostheses, and in clinical patients) and we compared this system to other techniques (CT scanning and Archimedes' principle) under the same conditions. We were able to identify the benefits (feasibility, safety, portability, and low patient stress) and limitations (underestimation of the in situ volume, subjectivity of contouring, and patient selection) of the LifeViz 3D system, concluding that the results are comparable with other measurement techniques. The prospects of this technology seem promising in numerous applications in clinical practice to limit the subjectivity of breast surgery.

  14. Comparison of different soil organic matter fractionation methodologies: Evidences from ultrasensitive {sup 14}C measurements

    Energy Technology Data Exchange (ETDEWEB)

    Marzaioli, Fabio, E-mail: fabio.marzaioli@unina2.i [CIRCE, Dipartimento di Scienze Ambientali, Seconda Universita degli studi di Napoli and INNOVA, Via Vivaldi, 43, Caserta 81100 (Italy); Lubritto, Carmine; Galdo, Ilaria Del; D' Onofrio, Antonio [CIRCE, Dipartimento di Scienze Ambientali, Seconda Universita degli studi di Napoli and INNOVA, Via Vivaldi, 43, Caserta 81100 (Italy); Cotrufo, M. Francesca [CIRCE, Dipartimento di Scienze Ambientali, Seconda Universita degli studi di Napoli and INNOVA, Via Vivaldi, 43, Caserta 81100 (Italy); Department of Soil and Crop Sciences, Colorado State University, Fort Collins, Colorado (United States); Terrasi, Filippo [CIRCE, Dipartimento di Scienze Ambientali, Seconda Universita degli studi di Napoli and INNOVA, Via Vivaldi, 43, Caserta 81100 (Italy)

    2010-04-15

    Soils are studied with the aim of predicting future climatic scenarios and finding the best guidelines for managing terrestrial ecosystems to mitigate rising atmospheric CO{sub 2}. Carbon constituting soil organic matter (SOM) behaves as a cohort of different pools, each characterized by a specific C turnover time. Both naturally and anthropogenically produced {sup 14}C reach the soil through plant littering, making {sup 14}C a valid tool for tracing SOM dynamics. In this study we present a series of Accelerator Mass Spectrometry (AMS) {sup 14}C measurements on SOM samples obtained by means of different laboratory protocols used to isolate soil pools from bulk soil (fractionation protocols). The radiocarbon signature of the SOM fractions is used to identify the most effective fractionation procedure, and comparison among the {sup 14}C values measured on the SOM fractions yielded important indications for the proposal of a novel fractionation protocol. Our data highlight how particle size controls the recalcitrance of ancient SOM carbon pools.

  15. Animal Model of Asthma, Various Methods and Measured Parameters: A Methodological Review.

    Science.gov (United States)

    Kianmeher, Majid; Ghorani, Vahideh; Boskabady, Mohammad Hosein

    2016-12-01

    Asthma is a chronic inflammatory disease of the airway with extensive airway remodeling. The ethical issues associated with studies in asthmatic patients required the development of animal models of asthma. Animal models of asthma can provide valuable information on several features of asthma pathogenesis and treatment. Although these models cannot reproduce all clinical features, they are valuable for understanding the mechanisms of the disease and possible treatments. Related articles were searched in different databases from September 1994 to April 2016 using the terms "animal model of asthma", "animal sensitization" and "allergen-induced asthma in animals". Although there are several reviews on this topic, the present article reviews the induction of animal models of asthma in different animals, the various methods used for this purpose, the measured parameters and the research purposes, which will help investigators to choose the appropriate animal, methods and evaluation parameters depending on their study design.

  16. Methodological problem with comparing increases in different measures of body weight

    Directory of Open Access Journals (Sweden)

    Mannan Haider

    2011-05-01

    Full Text Available Background: A number of studies have compared proportional increases over time in waist circumference (WC) and body mass index (BMI). However, this method is flawed. Here, we explain why comparisons of WC and BMI must take into account the relationship between them. We used data from two cross-sectional US surveys (NHANES 1988-94 and 2005-06) and calculated the percentage change in the average BMI and the average WC between the two surveys, comparing the results with a regression analysis of changes in WC relative to BMI. Findings: The crude percentage change in BMI (5.8%) was marginally greater than for WC (5.1%). But these percentages cannot be directly compared, as the relationship between the measures is described by a regression equation with an intercept term that does not equal zero. The coefficient of time from the regression equation will determine whether or not WC is on average larger for a given BMI at the second compared with the first time point. Conclusion: Differences in the percentage change in WC and the percentage change in BMI cannot be usefully directly compared. Comparisons of increases in the two measures must account for the relationship between them as described by the regression equation.
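
    The argument is easy to reproduce numerically. The sketch below uses synthetic data (not NHANES) with a non-zero intercept linking WC to BMI, showing that equal relative growth in BMI need not produce equal relative growth in WC, and that the time coefficient in a pooled regression is the meaningful comparison:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic illustration: WC related to BMI with a non-zero intercept.
bmi_t1 = rng.normal(26.0, 4.0, n)
wc_t1 = 20.0 + 2.6 * bmi_t1 + rng.normal(0.0, 5.0, n)

bmi_t2 = bmi_t1 * 1.058                       # +5.8% in mean BMI
wc_t2 = 20.0 + 2.6 * bmi_t2 + rng.normal(0.0, 5.0, n)

pct_bmi = 100.0 * (bmi_t2.mean() / bmi_t1.mean() - 1.0)
pct_wc = 100.0 * (wc_t2.mean() / wc_t1.mean() - 1.0)
print(f"BMI +{pct_bmi:.1f}% but WC +{pct_wc:.1f}%")  # WC grows by less

# Pooled regression of WC on BMI with a time indicator: the time
# coefficient tells whether WC at a given BMI shifted between surveys.
t = np.concatenate([np.zeros(n), np.ones(n)])
X = np.column_stack([np.ones(2 * n), np.concatenate([bmi_t1, bmi_t2]), t])
y = np.concatenate([wc_t1, wc_t2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"time coefficient: {coef[2]:.2f} cm (near zero here by construction)")
```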

  17. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
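
    The central combination step, weighting each building type's protection factor by the fraction of the population sheltered there, is simple to express in code. The protection factors, occupancy fractions and outdoor dose below are hypothetical placeholders, not values from the report:

```python
# Illustrative sketch of the core idea: combine building protection
# factors with the fraction of people in each setting to estimate a
# population-averaged exposure. All numbers are hypothetical.
shelters = {
    # location: (protection factor, fraction of population)
    "outdoors":         (1.0,  0.05),
    "light residence":  (3.0,  0.55),
    "masonry building": (10.0, 0.30),
    "basement":         (40.0, 0.10),
}

outdoor_dose = 5.0  # Gy, hypothetical unsheltered fallout dose

avg_dose = sum(frac * outdoor_dose / pf for pf, frac in shelters.values())
print(f"population-averaged dose: {avg_dose:.2f} Gy "
      f"(vs {outdoor_dose:.1f} Gy unsheltered)")
```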

  18. Improving Oncology Quality Measurement in Accountable Care: Filling Gaps with Cross-Cutting Measures.

    Science.gov (United States)

    Valuck, Tom; Blaisdell, David; Dugan, Donna P; Westrich, Kimberly; Dubois, Robert W; Miller, Robert S; McClellan, Mark

    2017-02-01

    Payment for health care services, including oncology services, is shifting from volume-based fee-for-service to value-based accountable care. The objective of accountable care is to support providers with flexibility and resources to reform care delivery, accompanied by accountability for maintaining or improving outcomes while lowering costs. These changes depend on health care payers, systems, physicians, and patients having meaningful measures to assess care delivery and outcomes and to balance financial incentives for lowering costs while providing greater value. Gaps in accountable care measure sets may cause missed signals of problems in care and missed opportunities for improvement. Measures to balance financial incentives may be particularly important for oncology, where high cost and increasingly targeted diagnostics and therapeutics intersect with the highly complex and heterogeneous needs and preferences of cancer patients. Moreover, the concept of value in cancer care, defined as the measure of outcomes achieved per costs incurred, is rarely incorporated into performance measurement. This article analyzes gaps in oncology measures in accountable care, discusses challenging measurement issues, and offers strategies for improving oncology measurement. Discern Health analyzed gaps in accountable care measure sets for 10 cancer conditions that were selected based on incidence and prevalence; impact on cost and mortality; a diverse range of high-cost diagnostic procedures and treatment modalities (e.g., genomic tumor testing, molecularly targeted therapies, and stereotactic radiotherapy); and disparities or performance gaps in patient care. We identified gaps by comparing accountable care set measures with high-priority measurement opportunities derived from practice guidelines developed by the National Comprehensive Cancer Network and other oncology specialty societies. We found significant gaps in accountable care measure sets across all 10 conditions. For

  19. Geographic diversification of carbon risk - a methodology for assessing carbon investments using eddy correlation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Hultman, N.E. [Georgetown University, Washington, DC (United States). Intercultural Center

    2006-02-15

    In the context of the international market for greenhouse gas emissions, I examine applications of portfolio theory to investment decisions regarding biological carbon sequestration projects. Using ecosystem-scale eddy correlation carbon flux measurements, I show how to determine how much of the financial risk of carbon is diversifiable. This method allows a quantitative assessment of the potential for geographical diversification of carbon sink investments. In a case study of six ecosystems in the temperate Northern hemisphere, a significant benefit from diversification is demonstrated even among sites that seem to have broadly similar characteristics. This benefit derives in part from differences in ecosystem response to varying weather conditions and differences in ecosystem type, both of which affect the sites' covariances. By providing a quantitative common language for scientific and corporate uncertainties, the concept of carbon financial risk offers an opportunity for expanding communication between these elements essential to successful climate policy. (author)
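
    The diversification benefit follows from standard portfolio arithmetic applied to flux time series. A minimal sketch with hypothetical site data standing in for the eddy-correlation records used in the study:

```python
import numpy as np

# Hypothetical annual carbon flux anomalies (tC/ha) for three sites.
fluxes = np.array([
    [1.2, 0.8, 1.5, 0.6, 1.1],   # site A
    [0.9, 1.3, 0.7, 1.2, 1.0],   # site B
    [1.4, 0.6, 1.1, 0.9, 1.3],   # site C
])

cov = np.cov(fluxes)              # site-by-site covariance matrix
w = np.full(3, 1.0 / 3.0)         # equal-weight portfolio of sites

single_site_sd = np.sqrt(np.diag(cov)).mean()
portfolio_sd = np.sqrt(w @ cov @ w)
print(f"mean single-site sd:       {single_site_sd:.3f}")
print(f"equal-weight portfolio sd: {portfolio_sd:.3f}")
# The gap between the two is the diversifiable part of the carbon risk.
```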

  20. Measurement of marine productivity using 15N and 13C tracers: Some methodological aspects

    Indian Academy of Sciences (India)

    Naveen Gandhi; Sanjeev Kumar; S Prakash; R Ramesh; M S Sheshshayee

    2011-02-01

    Various experiments measuring new, regenerated and total productivity using 15N and 13C tracers were carried out in the Bay of Bengal (BOB) and in the Arabian Sea. Results from the 15N tracer experiments indicate that nitrate uptake can be underestimated in experiments with incubation times > 4 hours. Indirect evidence suggests that pico- and nano-phytoplankton, when dominant over microphytoplankton, can also influence f-ratios. Differences in the energy required to assimilate different nitrogen compounds determine the preferred nitrogen source during the early hours of incubation. Variation in light intensity during incubation also plays a significant role in nitrogen assimilation. Results from time-course experiments with both 15N and 13C tracers suggest that photoinhibition is significant in the BOB and the Arabian Sea around noon. A significant correlation was found between the productivity values obtained using the 15N and 13C tracers.
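
    The f-ratio referred to here is conventionally the ratio of new (nitrate-fuelled) production to total production. A minimal sketch, with hypothetical uptake rates rather than cruise data:

```python
def f_ratio(rho_no3, rho_nh4, rho_urea=0.0):
    """f-ratio = new production / total production, taking nitrate uptake
    as new and ammonium (and urea, if measured) uptake as regenerated.
    All uptake rates in the same units, e.g. mmol N m^-3 d^-1."""
    return rho_no3 / (rho_no3 + rho_nh4 + rho_urea)

# Hypothetical values: regenerated production dominates, f-ratio ~ 0.33.
print(f_ratio(0.12, 0.20, 0.04))
```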

  1. A Methodology to Integrate Magnetic Resonance and Acoustic Measurements for Reservoir Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Parra, Jorge O.; Hackert, Chris L.; Ni, Qingwen; Collier, Hughbert A.

    2000-09-22

    This report contains eight sections. Some individual subsections contain lists of references as well as figures and conclusions when appropriate. The first section includes the introduction and summary of the first-year project efforts. The next section describes the results of the project tasks: (1) implementation of theoretical relations between effect dispersion and the stochastic medium, (2) imaging analyses using core and well log data, (3) construction of dispersion and attenuation models at the core and borehole scales in poroelastic media, (4) petrophysics and a catalog of core and well log data from Siberia Ridge field, (5) acoustic/geotechnical measurements and CT imaging of core samples from Florida carbonates, and (6) development of an algorithm to predict pore size distribution from NMR core data. The last section includes a summary of accomplishments, technology transfer activities and follow-on work for Phase II.

  2. Cleaner production and methodological proposal of eco-efficiency measurement in a Mexican petrochemical complex.

    Science.gov (United States)

    Morales, M A; Herrero, V M; Martínez, S A; Rodríguez, M G; Valdivieso, E; Garcia, G; de los Angeles Elías, Maria

    2006-01-01

    Within the framework of the Petróleos Mexicanos Institutional Program for Sustainable Development, manufacturing operations in the petrochemical industry were evaluated with the purpose of reducing their ecological footprint. Thirteen cleaner production opportunities were registered in six process plants (ethylene oxide and glycols, acetaldehyde, ethylene, high density polyethylene, polypropylene switch and acrylonitrile), along with 45 recommendations for the waste water treatment plant. Morelos is the second most important petrochemical complex in the Mexican and Latin American petrochemical industry. A tool was developed to obtain eco-efficiency indicators for operating processes; as a result, potential savings were identified based on best performance, as well as the integrated distribution of Sankey diagrams. Likewise, a calculation mechanism is proposed to estimate economic savings based on the reduction of residues throughout the production process. These improvement opportunities and recommendations will yield economic and environmental benefits by minimising water use, using energy and raw materials efficiently, and reducing residues at source, thereby generating less environmental impact during the process.

  3. Trends in Child Poverty Using an Improved Measure of Poverty.

    Science.gov (United States)

    Wimer, Christopher; Nam, JaeHyun; Waldfogel, Jane; Fox, Liana

    2016-04-01

    The official measure of poverty has been used to assess trends in children's poverty rates for many decades. But because of flaws in official poverty statistics, these basic trends have the potential to be misleading. We use an augmented Current Population Survey data set that calculates an improved measure of poverty to reexamine child poverty rates between 1967 and 2012. This measure, the Anchored Supplemental Poverty Measure, is based partially on the US Census Bureau and Bureau of Labor Statistics' new Supplemental Poverty Measure. We focus on 3 age groups of children, those aged 0 to 5, 6 to 11, and 12 to 17 years. Young children have the highest poverty rates, both historically and today. However, among all age groups, long-term poverty trends have been more favorable than official statistics would suggest. This is entirely due to the effect of counting resources from government policies and programs, which have reduced poverty rates substantially for children of all ages. However, despite this progress, considerable disparities in the risk of poverty continue to exist by education level and family structure.

  4. Method for improving measurement efficiency of lateral shearing interferometry

    Science.gov (United States)

    Li, Jie; Tang, Feng; Wang, Xiangzhao; Dai, Fengzhao; Ding, Lei; Chen, Bo; Yang, Xiaoyu; Chai, Liqun

    2017-02-01

    In the present study, the computation time of wavefront reconstruction is decreased by downsampling the difference fronts. The wavefront can be reconstructed with high accuracy up to 64 Zernike terms with only 32×32 sampled pixels. Furthermore, the computational efficiency can be improved by a factor of more than 1000, improving the measurement efficiency of lateral shearing interferometry. The influence on reconstruction accuracy of the number of terms used to reconstruct the wavefront, the grid size of the test wavefront, the shear ratio, and the random noise is analyzed and compared when the difference fronts are sampled with different grid sizes. Numerical simulations and experiments show that the relative reconstruction error is <5% if the grid size of the sampled difference fronts is more than four times the radial order of the difference Zernike polynomials, given a reasonable noise level and shear ratio.
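
    The efficiency gain comes from solving the modal least-squares problem on a coarsely sampled grid. The sketch below illustrates the idea with a simple monomial basis standing in for the difference Zernike polynomials used in the paper; the grid sizes and term count are illustrative only:

```python
import numpy as np

def basis(x, y, n_terms=15):
    """Monomial basis up to the degree needed for n_terms columns;
    a stand-in for difference Zernike polynomials."""
    cols, k = [], 0
    for total in range(10):
        for i in range(total + 1):
            if k == n_terms:
                break
            cols.append((x ** (total - i)) * (y ** i))
            k += 1
    return np.column_stack(cols)

def fit(grid):
    """Least-squares modal fit on a grid x grid sampling of the
    difference front; returns the worst coefficient error."""
    ax = np.linspace(-1.0, 1.0, grid)
    x, y = np.meshgrid(ax, ax)
    A = basis(x.ravel(), y.ravel())
    truth = np.arange(1, A.shape[1] + 1, dtype=float)
    coef, *_ = np.linalg.lstsq(A, A @ truth, rcond=None)
    return np.abs(coef - truth).max()

print(fit(32))    # a coarse grid already recovers the coefficients
print(fit(256))   # a fine grid gives the same answer at far higher cost
```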

  5. Affecting Factors and Improving Measures for Converter Gas Recovery

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    To improve the currently unsatisfactory recovery and use of converter gas in Chinese steel plants, the basic approaches to raising the converter gas recovery rate were analyzed theoretically, together with the curves of converter gas component content, based on the converter gas recovery system of the Baosteel No. 2 steelmaking plant. The effects of the converter device, raw material, quantity of imbibed air, recovery restriction conditions, and oxygen blowing intensity on the converter gas recovery rate were studied. Among these, the effects of the quantity of imbibed air, the recovery restriction conditions, and the oxygen blowing intensity are remarkable. Comprehensive measures for improving converter gas recovery, covering equipment and other aspects, were put forward, and good results were achieved.

  6. Technical measures without enforcement tools: is there any sense? A methodological approach for the estimation of passive net length in small scale fisheries

    Directory of Open Access Journals (Sweden)

    A. LUCCHETTI

    2014-09-01

    Full Text Available Passive nets are currently among the most important fishing gears, largely used along the Mediterranean coasts by the small scale fisheries sector. The fishing effort exerted by this sector is strongly correlated with net dimensions; therefore, the use of passive nets is managed worldwide by defining net length and net drop. EC Reg. 1967/2006 states that the length of bottom-set and drifting nets may also be defined by considering their weight or volume; however, no practical suggestions for fisheries inspectors are yet available. Consequently, even if such technical measures are reasonable from a theoretical viewpoint, they are hardly usable as a management tool, owing to the difficulty of harbour controls. The overall objective of this paper is to provide a quick methodological approach for the gross estimation of passive net length (by net type) on the basis of net volume. The final goal is to support fisheries managers with suitable advice for enforcement and control purposes. The results obtained are important for the management of the fishing effort exerted by small scale fisheries. The methodology developed in this study should be considered a first attempt to tackle the tangled problem of net length estimation and can easily be applied in other fisheries and areas in order to improve the precision of the models developed herein.

  8. Quality improvement in neurology: dementia management quality measures.

    Science.gov (United States)

    Odenheimer, Germaine; Borson, Soo; Sanders, Amy E; Swain-Eng, Rebecca J; Kyomen, Helen H; Tierney, Samantha; Gitlin, Laura; Forciea, Mary Ann; Absher, John; Shega, Joseph; Johnson, Jerry

    2014-03-01

    Professional and advocacy organizations have long urged that dementia should be recognized and properly diagnosed. With the passage of the National Alzheimer's Project Act in 2011, an Advisory Council for Alzheimer's Research, Care, and Services was convened to advise the Department of Health and Human Services. In May 2012, the Council produced the first National Plan to address Alzheimer's disease, and prominent in its recommendations is a call for quality measures suitable for evaluating and tracking dementia care in clinical settings. Although other efforts have been made to set dementia care quality standards, such as those pioneered by RAND in its series Assessing Care of Vulnerable Elders (ACOVE), practitioners, healthcare systems, and insurers have not widely embraced implementation. This executive summary (full manuscript available at www.neurology.org) reports on a new measurement set for dementia management developed by an interdisciplinary Dementia Measures Work Group (DWG) representing the major national organizations and advocacy organizations concerned with the care of individuals with dementia. The American Academy of Neurology (AAN), the American Geriatrics Society, the American Medical Directors Association, the American Psychiatric Association, and the American Medical Association-convened Physician Consortium for Performance Improvement led this effort. The ACOVE measures and the measurement set described here apply to individuals whose dementia has already been identified and properly diagnosed. Although similar in concept to ACOVE, the DWG measurement set differs in several important ways; it includes all stages of dementia in a single measure set, calls for the use of functional staging in planning care, prompts the use of validated instruments in patient and caregiver assessment and intervention, highlights the relevance of using palliative care concepts to guide care before the advanced stages of illness, and provides evidence-based support

  9. Development of a cognitive bias methodology for measuring low mood in chimpanzees

    Directory of Open Access Journals (Sweden)

    Melissa Bateson

    2015-06-01

    Full Text Available There is an ethical and scientific need for objective, well-validated measures of low mood in captive chimpanzees. We describe the development of a novel cognitive task designed to measure 'pessimistic' bias in judgments of expectation of reward, a cognitive marker of low mood previously validated in a wide range of species, and report training and test data from three common chimpanzees (Pan troglodytes). The chimpanzees were trained on an arbitrary visual discrimination in which lifting a pale grey paper cone was associated with reinforcement with a peanut, whereas lifting a dark grey cone was associated with no reward. The discrimination was trained by sequentially presenting the two cone types until significant differences in latency to touch the cone types emerged, and was confirmed by simultaneously presenting both cone types in choice trials. Subjects were subsequently tested on their latency to touch unrewarded cones of three intermediate shades of grey not previously seen. Pessimism was indicated by the similarity between the latency to touch intermediate cones and the latency to touch the trained, unreinforced, dark grey cones. Three subjects completed training and testing, two adult males and one adult female. All subjects learnt the discrimination (107–240 trials), and retained it during five sessions of testing. There was no evidence that latencies to lift intermediate cones increased over testing, as would have occurred if subjects learnt that these were never rewarded, suggesting that the task could be used for repeated testing of individual animals. There was a significant difference between subjects in their relative latencies to touch intermediate cones (pessimism index) that emerged following the second test session, and was not changed by the addition of further data. The most dominant male subject was least pessimistic, and the female most pessimistic. We argue that the task has the potential to be used to assess

  10. Validation of 3-D Ice Accretion Measurement Methodology for Experimental Aerodynamic Simulation

    Science.gov (United States)

    Broeren, Andy P.; Addy, Harold E., Jr.; Lee, Sam; Monastero, Marianne C.

    2015-01-01

    Determining the adverse aerodynamic effects due to ice accretion often relies on dry-air wind-tunnel testing of artificial, or simulated, ice shapes. Recent developments in ice-accretion documentation methods have yielded a laser-scanning capability that can measure highly three-dimensional (3-D) features of ice accreted in icing wind tunnels. The objective of this paper was to evaluate the aerodynamic accuracy of ice-accretion simulations generated from laser-scan data. Ice-accretion tests were conducted in the NASA Icing Research Tunnel using an 18-in. chord, two-dimensional (2-D) straight wing with NACA 23012 airfoil section. For six ice-accretion cases, a 3-D laser scan was performed to document the ice geometry prior to the molding process. Aerodynamic performance testing was conducted at the University of Illinois low-speed wind tunnel at a Reynolds number of 1.8 × 10(exp 6) and a Mach number of 0.18 with an 18-in. chord NACA 23012 airfoil model that was designed to accommodate the artificial ice shapes. The ice-accretion molds were used to fabricate one set of artificial ice shapes from polyurethane castings. The laser-scan data were used to fabricate another set of artificial ice shapes using rapid prototype manufacturing such as stereolithography. The iced-airfoil results with both sets of artificial ice shapes were compared to evaluate the aerodynamic simulation accuracy of the laser-scan data. For five of the six ice-accretion cases, there was excellent agreement in the iced-airfoil aerodynamic performance between the casting and laser-scan based simulations. For example, typical differences in iced-airfoil maximum lift coefficient were less than 3 percent with corresponding differences in stall angle of approximately 1 deg or less. The aerodynamic simulation accuracy reported in this paper has demonstrated the combined accuracy of the laser-scan and rapid-prototype manufacturing approach to simulating ice accretion for a NACA 23012 airfoil. For several

  11. General Methodologies for Improving Motor Drive Precision in Order to Utilize It as an Embedded Application Sensor

    Science.gov (United States)

    Secrest, Caleb W.

    The objectives of this research are to reliably, and non-invasively, extract high quality spatial information from a limited-stroke multi-stage gear train driven by an AC machine using only the sensors necessary for normal AC machine control. In this work, the spatial information will be used to estimate the state of wear of each axis in the gear train. To extract this information, cascaded observer structures are utilized to estimate the load disturbances acting on the AC machine, and then to estimate the spatial errors which produce those disturbances. Further signal processing techniques are then utilized to observe the spatial error information in the spatial domain where the information is most relevant and to extract the systematic spatial errors that reoccur over many operating cycles. In prior art, the intrinsic spatial properties of the AC machine and position sensor feedback limited the quality of the spatial error information that could be extracted using motor drive-based estimation. Furthermore, the limited-stroke of the multi-stage gear train limits the separation of spatial content between the axes of the gear train and limits the extraction of the systematic spatial errors that reoccur over many operating cycles. In this work, methods are developed to reduce and separate machine and sensor contributions to the spatial error information being extracted, and general system design methodologies are investigated to improve the quality of spatial error information that can be extracted within the limited-stroke motion constraints of the multi-stage gear train.

  12. Good practice or positive action? Using Q methodology to identify competing views on improving gender equality in academic medicine.

    Science.gov (United States)

    Bryant, Louise D; Burkinshaw, Paula; House, Allan O; West, Robert M; Ward, Vicky

    2017-08-22

    The number of women entering medicine has increased significantly, yet women are still under-represented at senior levels in academic medicine. To support the gender equality action plan at one School of Medicine, this study sought to (1) identify the range of viewpoints held by staff on how to address gender inequality and (2) identify attitudinal barriers to change. Design: Q methodology; 50 potential interventions representing good practice or positive action, and addressing cultural, organisational and individual barriers to gender equality, were ranked by participants according to their perception of priority. Setting: the School of Medicine at the University of Leeds, UK. Participants: fifty-five staff members purposively sampled to represent gender and academic pay grade. Results: principal components analysis identified six competing viewpoints on how to address gender inequality. Four viewpoints favoured positive action interventions: (1) support careers of women with childcare commitments, (2) support progression of women into leadership roles rather than focus on women with children, (3) support careers of all women rather than just those aiming for leadership, and (4) drive change via high-level financial and strategic initiatives. Two viewpoints favoured good practice with no specific focus on women: (5) recognising merit irrespective of gender and (6) improving existing career development practice. No viewpoint was strongly associated with gender, pay grade or role; however, latent class analysis identified that female staff were more likely than male staff to prioritise the setting of equality targets. Attitudinal barriers to the setting of targets and other positive action initiatives were identified, and it was clear that not all staff supported positive action approaches. The findings and the approach have utility for those involved in gender equality work in other medical and academic institutions; however, the impact of such initiatives needs to be evaluated in the longer term.

  13. Reducing measurement scale mismatch to improve surface energy flux estimation

    Science.gov (United States)

    Iwema, Joost; Rosolem, Rafael; Rahman, Mostaquimur; Blyth, Eleanor; Wagener, Thorsten

    2016-04-01

    Soil moisture importantly controls land surface processes such as energy and water partitioning. A good understanding of these controls is needed especially when recognizing the challenges in providing accurate hyper-resolution hydrometeorological simulations at sub-kilometre scales. Soil moisture controlling factors can, however, differ at distinct scales. In addition, some parameters in land surface models are still often prescribed based on observations obtained at another scale not necessarily employed by such models (e.g., soil properties obtained from lab samples used in regional simulations). To minimize such effects, parameters can be constrained with local data from Eddy-Covariance (EC) towers (i.e., latent and sensible heat fluxes) and Point Scale (PS) soil moisture observations (e.g., TDR). However, measurement scales represented by EC and PS still differ substantially. Here we use the fact that Cosmic-Ray Neutron Sensors (CRNS) estimate soil moisture at horizontal footprint similar to that of EC fluxes to help answer the following question: Does reduced observation scale mismatch yield better soil moisture - surface fluxes representation in land surface models? To answer this question we analysed soil moisture and surface fluxes measurements from twelve COSMOS-Ameriflux sites in the USA characterized by distinct climate, soils and vegetation types. We calibrated model parameters of the Joint UK Land Environment Simulator (JULES) against PS and CRNS soil moisture data, respectively. We analysed the improvement in soil moisture estimation compared to uncalibrated model simulations and then evaluated the degree of improvement in surface fluxes before and after calibration experiments. Preliminary results suggest that a more accurate representation of soil moisture dynamics is achieved when calibrating against observed soil moisture and further improvement obtained with CRNS relative to PS. However, our results also suggest that a more accurate

  14. The measurement of the normal thorax using the Haller index methodology at multiple vertebral levels.

    Science.gov (United States)

    Archer, James E; Gardner, Adrian; Berryman, Fiona; Pynsent, Paul

    2016-10-01

    The Haller index is a ratio of thoracic width to height, measured from an axial CT image and used to describe the internal dimensions of the thoracic cage. Although the Haller index for a normal thorax has been established (Haller et al. 1987; Daunt et al. 2004), this is only at one undefined vertebral level in the thorax. What is not clear is how the Haller index describes the thorax at every vertebral level in the absence of sternal deformity, or how this is affected by age. This paper documents the shape of the thorax using the Haller index calculated from the thoracic width and height at all vertebral levels of the thorax between 8 and 18 years of age. The Haller index changes with vertebral level, with the largest ratio seen at the most cranial levels of the thorax. Increasing age alters the shape of the thorax: the Haller index at the most cranial vertebral levels increases relative to the mid thorax, which does not change, and a slight increase is seen at the more caudal vertebral levels. These data highlight that a 'one size fits all' rule for the chest width-to-depth ratio at all ages and all thoracic levels is not appropriate. The normal range for the width-to-height ratio should be based on a patient's age and vertebral level. © 2016 Anatomical Society.
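
    The index itself is a simple ratio of two distances taken on the same axial slice. A minimal sketch, with hypothetical width and height values rather than the study's measurements:

```python
def haller_index(width_mm, height_mm):
    """Haller index: internal transverse thoracic width divided by the
    anteroposterior distance from sternum to vertebral body, both
    measured on the same axial CT slice."""
    return width_mm / height_mm

# Hypothetical measurements at two vertebral levels (not study data):
print(round(haller_index(240.0, 60.0), 2))   # cranial level: large ratio
print(round(haller_index(250.0, 130.0), 2))  # mid thorax: ratio near 2
```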

  15. Measurement of three-dimensional intra-articular kinematics: methodological and interpretation problems.

    Science.gov (United States)

    Baeyens, J-P; Cattrysse, E; Van Roy, P; Clarys, J-P

    Intra-articular kinematics evaluates joint motion in terms of the configuration of the joint. Therefore data are needed concerning joint kinematics as well as joint configuration. We have developed accurate measurement methods for both in vivo and in vitro evaluation. Interpretation of the processed data is more complex than simply setting up a coordinate system based on the joint configuration. Although the description of intra-articular motion in terms of Euler-Cardan or helical angles may be complete, the therapeutic interpretation may be doubtful. Using the ulno-humeral joint during flexion-extension as an example, we found a combination of helical angles in the directions of extension/external rotation/varus. In the case of the Cardan angles, inconsistent patterns of rotation resulted from different choices of sequence order and differed from the helical angles. The finite helical axis (FHA) provides a functional representation of the joint movement, i.e. pathways of motion, whereas the sequence dependency of Euler-Cardan angles produces problems in the therapeutic interpretation of the movement. Therefore we believe that an FHA approach should be used in intra-articular kinematics research.
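
    The finite helical axis of a rigid displacement can be recovered from its rotation matrix and translation vector by a textbook construction; the sketch below is illustrative and is not the authors' processing code:

```python
import numpy as np

def finite_helical_axis(R, t):
    """Finite helical axis of a rigid displacement (R, t): rotation
    angle in degrees, unit axis direction, and translation along the
    axis. Valid away from angles near 0 and 180 degrees."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    # Axis direction from the skew-symmetric part of R.
    n = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    d = float(n @ t)                      # translation along the axis
    return np.degrees(theta), n, d

# Example: 30 degrees about z combined with a 2-unit axial translation.
c, s = np.cos(np.radians(30)), np.sin(np.radians(30))
R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
print(finite_helical_axis(R, np.array([0.0, 0.0, 2.0])))
```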

  16. Binding interaction between sorafenib and calf thymus DNA: Spectroscopic methodology, viscosity measurement and molecular docking

    Science.gov (United States)

    Shi, Jie-Hua; Chen, Jun; Wang, Jing; Zhu, Ying-Yao

    2015-02-01

    The binding interaction of sorafenib with calf thymus DNA (ct-DNA) was studied using UV-vis absorption spectroscopy, fluorescence emission spectroscopy, circular dichroism (CD), viscosity measurement and molecular docking methods. The experimental results revealed an obvious binding interaction between sorafenib and ct-DNA. The binding constant (Kb) of sorafenib with ct-DNA was 5.6 × 10^3 M^-1 at 298 K. The enthalpy and entropy changes (ΔH° and ΔS°) in the binding process were -27.66 kJ mol^-1 and -21.02 J mol^-1 K^-1, respectively, indicating that the main binding forces were van der Waals forces and hydrogen bonding. The docking results suggested that sorafenib prefers to bind in the minor groove of A-T-rich DNA, with a binding site four base pairs long. A conformational change of sorafenib in the sorafenib-DNA complex was clearly observed, and this change was closely related to the structure of the DNA, implying that the flexibility of the sorafenib molecule played an important role in the formation of the stable sorafenib-ct-DNA complex.
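
    The reported numbers are internally consistent: ΔG° computed from ΔH° − TΔS° matches ΔG° computed from −RT ln Kb. A quick check using only the values quoted in the abstract:

```python
import math

R = 8.314            # gas constant, J mol^-1 K^-1
T = 298.0            # K
Kb = 5.6e3           # binding constant, M^-1
dH = -27.66e3        # J mol^-1
dS = -21.02          # J mol^-1 K^-1

dG_thermo = dH - T * dS          # ΔG = ΔH - TΔS
dG_kb = -R * T * math.log(Kb)    # ΔG = -RT ln(Kb)
print(f"ΔG from ΔH - TΔS : {dG_thermo / 1e3:.1f} kJ/mol")
print(f"ΔG from -RT ln Kb: {dG_kb / 1e3:.1f} kJ/mol")
# Both come out near -21.4 kJ/mol, confirming consistency.
```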

  17. Latest methodological developments for the measurement of diffusion and permeation coefficients in concretes and clays

    Energy Technology Data Exchange (ETDEWEB)

    Berne, P.; Brouard, C.; Pocachard, J. [CEA, FrancCEA/LITEN/LCSN, F-38054 Grenoble cedex 9 (France); Duhart-Barone, A.; Grec, D.; Le Cocguen, A. [CEA/DSN/LECD, F-13108 Cadarache (France)

    2009-07-01

    In water-saturated media the main mode of contaminant transport is liquid transfer, and the confinement capacity of the materials is notably characterized by the effective diffusion coefficient (EDC) of tritiated water. The major problem lies in the duration of the experiments, which can exceed several years, so two accelerated methods have been explored. The first is a variation of the through-diffusion technique: a given tracer concentration, C{sub 0}, is applied on one face of a sample that has been previously impregnated at a concentration of C{sub 0}/2. The duration of the assay can then be divided by 3. The second method involves accelerated migration under the influence of an electric field and direct measurement of the current density. The results are in the same range as those of classical through-diffusion experiments, and are obtained in about one month. In non-water-saturated media, the diffusing fluid of interest is generally the gaseous phase. Two applications at different steps of the nuclear fuel cycle are presented: characterization of the migration of hydrogen in the host rock formation of a geological waste storage, and of the diffusion of tritium gas in the concrete containment structure of decommissioned UNGG nuclear power plants. In both cases the media are close to saturation and the pore water content must be precisely controlled. This paper presents a method for determining the intrinsic permeability and gas diffusion coefficients of the materials. (authors)
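
    In the steady-state through-diffusion configuration that these accelerated protocols aim to reach sooner, the effective diffusion coefficient follows from the textbook relation De = J·L/C0. A minimal sketch with hypothetical values:

```python
def effective_diffusion_coefficient(flux_ss, thickness, c_upstream):
    """Steady-state through-diffusion estimate De = J * L / C0, with the
    tracer flux J in Bq m^-2 s^-1, sample thickness L in m and upstream
    concentration C0 in Bq m^-3. A textbook relation, shown here only to
    illustrate the quantity the accelerated protocols target."""
    return flux_ss * thickness / c_upstream

# Hypothetical tritiated-water example: gives 2e-11 m^2/s.
print(effective_diffusion_coefficient(2.0e-6, 0.01, 1.0e3))
```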

  18. Improved Oceanographic Measurements with CryoSat SAR Altimetry

    Science.gov (United States)

    Cotton, David; Benveniste, Jérôme; Cipollini, Paolo; Andersen, Ole; Cancet, Mathilde; Ambrózio, Américo; Restano, Marco; Nilo Garcia, Pablo; Martin, Francisco

    2016-07-01

    The ESA CryoSat mission is the first space mission to carry a radar altimeter that can operate in Synthetic Aperture Radar "SAR" (or delay-Doppler) and interferometric SAR (SARin) modes. Studies on CryoSat data have analysed and confirmed the improved ocean measuring capability offered by SAR mode altimetry, through increased resolution and precision in sea surface height and wave height measurements, and have also added significantly to our understanding of the issues around the processing and interpretation of SAR altimeter echoes. We present work in four themes, building on work initiated in the CryoSat Plus for Oceans project (CP4O), each investigating different aspects of the opportunities offered by this new technology. The first two studies address the coastal zone, a critical region for providing a link between open-ocean and shelf sea measurements with those from coastal in-situ measurements, in particular tide gauges. Although much has been achieved in recent years through the Coastal Altimetry community, (http://www.coastalt.eu/community) there is a limit to the capabilities of pulse-limited altimetry, which often leaves an un-measured "white strip" right at the coastline. Firstly, a thorough analysis was made of the performance of "SAR" altimeter data (delay-Doppler processed) in the coastal zone. This quantified the performance, confirming the significant improvement over "conventional" pulse-limited altimetry. In the second study a processing scheme was developed with CryoSat SARin mode data to enable the retrieval of valid oceanographic measurements in coastal areas with complex topography. Thanks to further development of the algorithms, a new approach was achieved that can also be applied to SAR and conventional altimetry data (e.g., Sentinel-3, Jason series, Envisat). The third part of the project developed and evaluated improvements to the SAMOSA altimeter re-tracker that is implemented in the Sentinel-3 processing chain. The modifications to the

  19. Improved calendar time approach for measuring long-run anomalies

    Directory of Open Access Journals (Sweden)

    Anupam Dutta

    2015-12-01

    Full Text Available Although a large number of recent studies employ the buy-and-hold abnormal return (BHAR) methodology and the calendar time portfolio approach to investigate long-run anomalies, each of these methods is subject to criticism. In this paper, we show that a recently introduced calendar time methodology, known as the Standardized Calendar Time Approach (SCTA), controls well for the heteroscedasticity problem that occurs in calendar time methodology due to varying portfolio compositions. In addition, we document that SCTA has higher power than the BHAR methodology and the Fama–French three-factor model in detecting long-run abnormal stock returns. Moreover, when investigating the long-term performance of Canadian initial public offerings, we report that the market period (i.e. hot versus cold market periods) does not have any significant impact on calendar time abnormal returns based on SCTA.
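
    The abstract does not give the SCTA formulas, but the general idea of a standardized calendar-time test, down-weighting months whose event portfolios are small or volatile, can be sketched as follows. This is a generic illustration of the standardization idea, not necessarily Dutta's exact estimator:

```python
import numpy as np

def standardized_calendar_time(abnormal, members):
    """Each month's mean abnormal return is divided by its cross-sectional
    standard error, so months with few or volatile firms get less weight;
    the standardized series is then aggregated into one test statistic."""
    stats = []
    for ar_month, in_portfolio in zip(abnormal, members):
        x = ar_month[in_portfolio]          # firms in that month's portfolio
        if x.size < 2:
            continue
        se = x.std(ddof=1) / np.sqrt(x.size)
        stats.append(x.mean() / se)
    z = np.array(stats)
    return z.mean() * np.sqrt(z.size)       # ~N(0,1) under the null

# Synthetic example: 120 months, 200 firms, varying portfolio composition.
rng = np.random.default_rng(1)
ar = rng.normal(0.0, 0.08, (120, 200))
membership = [rng.random(200) < rng.uniform(0.05, 0.5) for _ in range(120)]
print(standardized_calendar_time(ar, membership))
```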

  20. Improved Measurement of Ejection Velocities From Craters Formed in Sand

    Science.gov (United States)

    Cintala, Mark J.; Byers, Terry; Cardenas, Francisco; Montes, Roland; Potter, Elliot E.

    2014-01-01

    A typical impact crater is formed by two major processes: compression of the target (essentially equivalent to a footprint in soil) and ejection of material. The Ejection-Velocity Measurement System (EVMS) in the Experimental Impact Laboratory has been used to study ejection velocities from impact craters formed in sand since the late 1990s. The original system used an early-generation Charge-Coupled Device (CCD) camera; custom-written software; and a complex, multicomponent optical system to direct laser light for illumination. Unfortunately, the electronic equipment was overtaken by age, and the software became obsolete in light of improved computer hardware.