WorldWideScience

Sample records for improvement measurement methodology

  1. THE MEASUREMENT METHODOLOGY IMPROVEMENT OF THE HORIZONTAL IRREGULARITIES IN PLAN

    Directory of Open Access Journals (Sweden)

    O. M. Patlasov

    2015-08-01

    Purpose. Across the track superstructure (TSS) there are structures for which the standard approach to deciding on their future operation is not entirely correct or acceptable. In particular, this concerns track sections that change their geometric parameters rather quickly: the radius of curvature, the angle of rotation, and the like. Examples of such TSS portions include crossovers, whose connecting part substantially changes curvature over a rather short length. Estimating the position of such a structure in plan on the basis of the existing technique (by the difference in adjacent bending arrows) is virtually impossible. It is therefore proposed to supplement and improve the methodology for assessing the position of a curve in plan based on the difference in adjacent versines. Methodology. The possible options for measuring horizontal curves in plan were analyzed. The most adequate method, which does not contradict the existing standards as regards their applicability, was determined; ease of measurement and calculation was taken into account. Findings. Qualitative and quantitative verification of the proposed and existing methods showed very good agreement of the measurement results. This gives grounds to assert that the methodology can be recommended to workers of track facilities for the assessment of horizontal irregularities in plan, not only for curves but also within the connecting part of crossovers. Originality. The existing method for evaluating the geometric position of curves in plan was improved. It does not create new regulations, and all results are evaluated against existing norms. Practical value. The proposed technique makes it possible, without creating a new regulatory framework, to attach to the existing one while expanding the boundaries of its application. This method can be used not only for ordinary curves

  2. Improving inferior vena cava filter retrieval rates with the define, measure, analyze, improve, control methodology.

    Science.gov (United States)

    Sutphin, Patrick D; Reis, Stephen P; McKune, Angie; Ravanzo, Maria; Kalva, Sanjeeva P; Pillai, Anil K

    2015-04-01

    To design a sustainable process to improve optional inferior vena cava (IVC) filter retrieval rates based on the Define, Measure, Analyze, Improve, Control (DMAIC) methodology of the Six Sigma process improvement paradigm. DMAIC, an acronym for Define, Measure, Analyze, Improve, and Control, was employed to design and implement a quality improvement project to increase IVC filter retrieval rates at a tertiary academic hospital. Retrievable IVC filters were placed in 139 patients over a 2-year period. The baseline IVC filter retrieval rate (n = 51) was reviewed through a retrospective analysis, and two strategies were devised to improve the filter retrieval rate: (a) mailing of letters to clinicians and patients for patients who had filters placed within 8 months of implementation of the project (n = 43) and (b) a prospective automated scheduling of a clinic visit at 4 weeks after filter placement for all new patients (n = 45). The effectiveness of these strategies was assessed by measuring the filter retrieval rates and estimated increase in revenue to interventional radiology. IVC filter retrieval rates increased from a baseline of 8% to 40% with the mailing of letters and to 52% with the automated scheduling of a clinic visit 4 weeks after IVC filter placement. The estimated revenue per 100 IVC filters placed increased from $2,249 to $10,518 with the mailing of letters and to $17,022 with the automated scheduling of a clinic visit. Using the DMAIC methodology, a simple and sustainable quality improvement intervention was devised that markedly improved IVC filter retrieval rates in eligible patients. Copyright © 2015 SIR. Published by Elsevier Inc. All rights reserved.

  3. Characteristic Rain Events: A Methodology for Improving the Amenity Value of Stormwater Control Measures

    DEFF Research Database (Denmark)

    Smit Andersen, Jonas; Lerer, Sara Maria; Backhaus, Antje

    2017-01-01

    Local management of rainwater using stormwater control measures (SCMs) is gaining increased attention as a sustainable alternative and supplement to traditional sewer systems. Besides offering added utility values, many SCMs also offer a great potential for added amenity values. One way of achieving amenity value is to stage the rainwater and thus bring it to the attention of the public. We present here a methodology for creating a selection of rain events that can help bridge between engineering and landscape architecture when dealing with staging of rainwater. The methodology uses quantitative and statistical methods to select Characteristic Rain Events (CREs) for a range of frequent return periods: weekly, bi-weekly, monthly, bi-monthly, and a single rarer event occurring only every 1–10 years. The methodology for selecting CREs is flexible and can be adjusted to any climatic settings…

  4. Improving process methodology for measuring plutonium burden in human urine using fission track analysis

    International Nuclear Information System (INIS)

    Krahenbuhl, M.P.; Slaughter, D.M.

    1998-01-01

    The aim of this paper is to clearly define the chemical and nuclear principles governing Fission Track Analysis (FTA) to determine environmental levels of 239Pu in urine. The paper also addresses deficiencies in FTA methodology and introduces improvements to make FTA a more reliable research tool. Our refined methodology, described herein, includes a chemically induced precipitation phase, followed by anion exchange chromatography, and employs a chemical tracer, 236Pu. We have been able to establish an inverse correlation between Pu recovery and sample volume, and our data confirm that increases in sample volume do not result in higher accuracy or lower detection limits. We conclude that in subsequent studies, samples should be limited to approximately two liters. The Pu detection limit for a sample of this volume is 2.8 μBq/l. (author)

  5. Characteristic Rain Events: A Methodology for Improving the Amenity Value of Stormwater Control Measures

    DEFF Research Database (Denmark)

    Smit Andersen, Jonas; Lerer, Sara Maria; Backhaus, Antje

    2017-01-01

    Local management of rainwater using stormwater control measures (SCMs) is gaining increased attention as a sustainable alternative and supplement to traditional sewer systems. Besides offering added utility values, many SCMs also offer a great potential for added amenity values. One way of achieving amenity value is to stage the rainwater and thus bring it to the attention of the public. We present here a methodology for creating a selection of rain events that can help bridge between engineering and landscape architecture when dealing with staging of rainwater. The methodology uses quantitative and statistical methods to select Characteristic Rain Events (CREs); the methodology for selecting CREs is flexible and can be adjusted to any climatic settings; here we show its use for Danish conditions. We illustrate with a case study how CREs can be used in combination with a simple hydrological model to visualize where, how deep and for how long water is visible in a landscape designed to manage rainwater.

  6. Characteristic Rain Events: A Methodology for Improving the Amenity Value of Stormwater Control Measures

    Directory of Open Access Journals (Sweden)

    Jonas Smit Andersen

    2017-10-01

    Local management of rainwater using stormwater control measures (SCMs) is gaining increased attention as a sustainable alternative and supplement to traditional sewer systems. Besides offering added utility values, many SCMs also offer a great potential for added amenity values. One way of achieving amenity value is to stage the rainwater and thus bring it to the attention of the public. We present here a methodology for creating a selection of rain events that can help bridge between engineering and landscape architecture when dealing with staging of rainwater. The methodology uses quantitative and statistical methods to select Characteristic Rain Events (CREs) for a range of frequent return periods: weekly, bi-weekly, monthly, bi-monthly, and a single rarer event occurring only every 1–10 years. The methodology for selecting CREs is flexible and can be adjusted to any climatic settings; here we show its use for Danish conditions. We illustrate with a case study how CREs can be used in combination with a simple hydrological model to visualize where, how deep and for how long water is visible in a landscape designed to manage rainwater.
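
    The abstract does not spell out the selection procedure itself; as a rough illustration only, the sketch below picks a characteristic event for a target return period as an empirical quantile of an event-depth series. The quantile approach, `events_per_year`, and all data are assumptions, not the authors' published algorithm.

    ```python
    import numpy as np

    # Hypothetical sketch: pick a "characteristic" rain event for a target return
    # period from an event series (depths in mm). Synthetic data, assumed method.
    rng = np.random.default_rng(42)
    event_depths = rng.gamma(shape=1.2, scale=6.0, size=2000)
    events_per_year = 100  # assumed average number of rain events per year

    def characteristic_event(depths, events_per_year, return_period_years):
        """Depth exceeded on average once per return period (empirical quantile)."""
        exceedance_prob = 1.0 / (return_period_years * events_per_year)
        return np.quantile(depths, 1.0 - exceedance_prob)

    for label, t in [("weekly", 1 / 52), ("monthly", 1 / 12), ("1-year", 1.0)]:
        print(f"{label}: ~{characteristic_event(event_depths, events_per_year, t):.1f} mm")
    ```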

  7. A Case Study of Six Sigma Define-Measure-Analyze-Improve-Control (DMAIC) Methodology in Garment Sector

    Directory of Open Access Journals (Sweden)

    Abdur Rahman

    2017-12-01

    This paper demonstrates the empirical application of Six Sigma and the Define-Measure-Analyze-Improve-Control (DMAIC) methodology to reduce product defects within a garments manufacturing organization in Bangladesh, following the DMAIC methodology to investigate defects and root causes and to provide a solution to eliminate them. The analysis from employing Six Sigma and DMAIC indicated that the broken stitch and open seam influenced the number of defective products. Design of experiments (DOE) and analysis of variance (ANOVA) techniques were combined to statistically determine the correlation of the broken stitch and open seam with defects, as well as to define their optimum values needed to eliminate the defects. Thus, a reduction of about 35% in garment defects was achieved, which helped the organization studied to reduce its defects and improve its Sigma level from 1.7 to 3.4.
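
    To picture the DOE/ANOVA step, the toy sketch below runs a one-way ANOVA testing whether a process factor shifts defect counts; the factor (thread tension), its levels, and all counts are invented, and the sketch only mirrors the kind of test described above.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical illustration of the DOE/ANOVA step: test whether a process
    # factor (here an invented "thread tension" level) shifts the defect counts.
    rng = np.random.default_rng(0)
    defects_low = rng.poisson(lam=9.0, size=12)   # defects per batch, low tension
    defects_mid = rng.poisson(lam=6.0, size=12)   # medium tension
    defects_high = rng.poisson(lam=4.0, size=12)  # high tension

    f_stat, p_value = stats.f_oneway(defects_low, defects_mid, defects_high)
    print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")
    # A small p-value suggests the factor affects broken-stitch/open-seam defects;
    # the level with the lowest mean is the candidate optimum setting.
    ```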

  8. Beam optimization: improving methodology

    International Nuclear Information System (INIS)

    Quinteiro, Guillermo F.

    2004-01-01

    Different optimization techniques commonly used in biology and food technology allow a systematic and complete analysis of response functions. In spite of the great interest in medical and nuclear physics in the problem of optimizing mixed beams, little attention has been given to sophisticated mathematical tools. Indeed, many techniques are perfectly suited to the typical problem of beam optimization. This article is intended as a guide to the use of two methods, namely Response Surface Methodology and Simplex, that are expected to speed up the optimization process and, at the same time, give more insight into the relationships among the dependent variables controlling the response.

  9. Methodology for building confidence measures

    Science.gov (United States)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements in determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
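
    The combination rule for step (i) is not given in the abstract; a noisy-OR over independent sources is one common choice and is sketched below purely as an assumed stand-in for the paper's actual formula.

    ```python
    # Hypothetical sketch of step (i), combining the reliability measures of
    # multiple sources. A noisy-OR over independent sources is assumed here.

    def combined_confidence(source_reliabilities):
        """P(at least one independent source is truthful) -- noisy-OR."""
        p_all_wrong = 1.0
        for p in source_reliabilities:
            p_all_wrong *= (1.0 - p)
        return 1.0 - p_all_wrong

    print(combined_confidence([0.6, 0.7]))       # 0.88
    print(combined_confidence([0.6, 0.7, 0.5]))  # 0.94
    ```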

  10. Radon flux measurement methodologies

    International Nuclear Information System (INIS)

    Nielson, K.K.; Rogers, V.C.

    1984-01-01

    Five methods for measuring radon fluxes are evaluated: the accumulator can, a small charcoal sampler, a large-area charcoal sampler, the "Big Louie" charcoal sampler, and the charcoal tent sampler. An experimental comparison of the five flux measurement techniques was also conducted. Excellent agreement was obtained between the measured radon fluxes and fluxes predicted from radium and emanation measurements.

  11. Methodological approach to organizational performance improvement process

    OpenAIRE

    Buble, Marin; Dulčić, Želimir; Pavić, Ivan

    2017-01-01

    Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.

  12. Methodological approach to organizational performance improvement process

    Directory of Open Access Journals (Sweden)

    Marin Buble

    2001-01-01

    Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.

  13. The impact of methodology in innovation measurement

    Energy Technology Data Exchange (ETDEWEB)

    Wilhelmsen, L.; Bugge, M.; Solberg, E.

    2016-07-01

    Innovation surveys and rankings such as the Community Innovation Survey (CIS) and the Innovation Union Scoreboard (IUS) have developed into influential diagnostic tools that are often used to categorize countries according to their innovation performance and to legitimise innovation policies. Although a number of ongoing processes are seeking to improve existing frameworks for measuring innovation, there are large methodological differences across countries in the way innovation is measured. This causes great uncertainty regarding (a) the coherence between data from innovation surveys, (b) the actual innovativeness of the economy, and (c) the validity of research based on innovation data. Against this background we explore empirically how different survey methods for measuring innovation affect reported innovation performance. The analysis is based on a statistical exercise comparing the results from three different methodological versions of the same survey for measuring innovation in the business enterprise sector in Norway. We find striking differences in reported innovation performance depending on how the surveys are carried out methodologically. The paper concludes that reported innovation performance is highly sensitive to and strongly conditioned by the methodological context. This calls for increased caution and awareness around data collection and research based on innovation data, not least in terms of the aggregation of data and cross-country comparisons. (Author)

  14. Improving Learning Outcome Using Six Sigma Methodology

    Science.gov (United States)

    Tetteh, Godson A.

    2015-01-01

    Purpose: The purpose of this research paper is to apply the Six Sigma methodology to identify the attributes of a lecturer that will help improve a student's prior knowledge of a discipline from an initial "x" per cent knowledge to a higher "y" per cent of knowledge. Design/methodology/approach: The data collection method…

  15. Performance improvement using Lean methodology: case study.

    Science.gov (United States)

    Harmelink, Stacy

    2008-01-01

    The department of radiology at St. Luke's Regional Medical Center in Sioux City, IA, implemented meaningful workflow changes to reduce patient wait times and, at the same time, improved customer and employee satisfaction scores. Lean methodology and the 7 Deadly Wastes, along with small-group interaction, were used to evaluate and change the process of a patient waiting for an exam in the radiology department. The most important key to the success of a performance improvement project is the involvement of staff.

  16. A methodology for combining multiple commercial data sources to improve measurement of the food and alcohol environment: applications of geographical information systems

    Directory of Open Access Journals (Sweden)

    Dara D. Mendez

    2014-11-01

    Commercial data sources have been increasingly used to measure and locate community resources. We describe a methodology for combining and comparing the differences in commercial data on the food and alcohol environment. We used data from two commercial databases (InfoUSA and Dun & Bradstreet) for 2003 and 2009 to obtain information on food and alcohol establishments and developed a matching process using computer algorithms and manual review, applying ArcGIS to geocode addresses and using the standard industrial classification and North American industry classification taxonomy for type of establishment and establishment name. We constructed population- and area-based density measures (e.g. grocery stores), assessed differences across data sources, and used ArcGIS to map the densities. The matching process resulted in 8,705 and 7,078 unique establishments for 2003 and 2009, respectively. More establishments were captured in the combined dataset than by relying on one data source alone, and the additional establishments captured ranged from 1,255 to 2,752 in 2009. The correlations for the density measures between the two data sources were highest for alcohol outlets (r = 0.75 and 0.79 for per capita and area, respectively) and lowest for grocery stores/supermarkets (r = 0.32 for both). This process of applying geographical information systems to combine multiple commercial data sources and develop measures of the food and alcohol environment captured more establishments than relying on one data source alone. This replicable methodology was found to be useful for understanding the food and alcohol environment when local or public data are limited.
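
    A toy sketch of the matching-and-density idea follows; the field names, the normalization rule, and the records are invented, and the actual methodology also relied on geocoding, industry classification codes, and manual review.

    ```python
    import re

    # Hypothetical sketch: match establishments from two commercial sources on a
    # normalized name + address key, keep the union, and compute a density.
    def normalize(s):
        return re.sub(r"[^a-z0-9 ]", "", s.lower()).strip()

    infousa = [{"name": "Joe's Grocery", "addr": "12 Main St"},
               {"name": "Corner Tavern", "addr": "5 Oak Ave"}]
    dnb = [{"name": "Joes Grocery", "addr": "12 Main St."},
           {"name": "Main St Deli", "addr": "40 Main St"}]

    def key(e):
        return (normalize(e["name"]), normalize(e["addr"]))

    union = {key(e): e for e in infousa}
    union.update({key(e): e for e in dnb})  # matching keys collapse; new ones are added

    population = 2500  # hypothetical tract population
    density_per_1000 = 1000 * len(union) / population
    print(len(union), "unique establishments;", round(density_per_1000, 2), "per 1,000 residents")
    ```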

  17. Methodology of quality improvement projects for the Texas Medicare population.

    Science.gov (United States)

    Pendergrass, P W; Abel, R L; Bing, M; Vaughn, R; McCauley, C

    1998-07-01

    The Texas Medical Foundation, the quality improvement organization for the state of Texas, develops local quality improvement projects for the Medicare population. These projects are developed as part of the Health Care Quality Improvement Program undertaken by the Health Care Financing Administration. The goal of a local quality improvement project is to collaborate with providers to identify and reduce the incidence of unintentional variations in the delivery of care that negatively impact outcomes. Two factors are critical to the success of a quality improvement project. First, as opposed to peer review that is based on implicit criteria, quality improvement must be based on explicit criteria. These criteria represent key steps in the delivery of care that have been shown to improve outcomes for a specific disease. Second, quality improvement must be performed in partnership with the health care community. As such, the health care community must play an integral role in the design and evaluation of a quality improvement project and in the design and implementation of the resulting quality improvement plan. Specifically, this article provides a historical perspective for the transition from peer review to quality improvement. It discusses key steps used in developing and implementing local quality improvement projects including topic selection, quality indicator development, collaborator recruitment, and measurement of performance/improvement. Two Texas Medical Foundation projects are described to highlight the current methodology and to illustrate the impact of quality improvement projects.

  18. Towards methodological improvement in the Spanish studies

    Directory of Open Access Journals (Sweden)

    Beatriz Amante García

    2012-09-01

    The European Higher Education Area (EHEA) has triggered many changes in the new degrees at Spanish universities, mainly in terms of methodology and assessment. However, to make such changes a success it is essential to have coordination within the teaching staff, as well as active methodologies in use that enhance and encourage students' participation in all the activities carried out in the classroom, above all when dealing with formative and summative evaluation, in which students become the ones responsible for their own learning process (López-Pastor, 2009; Torre, 2008). In this second issue of JOTSE we have included several teaching innovation experiences related to the above-mentioned methodological and assessment changes.

  19. How Six Sigma Methodology Improved Doctors' Performance

    Science.gov (United States)

    Zafiropoulos, George

    2015-01-01

    Six Sigma methodology was used in a District General Hospital to assess the effect of the introduction of an educational programme to limit unnecessary admissions. The performance of the doctors involved in the programme was assessed. An Ishikawa fishbone diagram and the 5 S's were initially used, and Pareto analysis of their findings was performed. The results…

  20. Knowledge Management Methodologies for Improving Safety Culture

    International Nuclear Information System (INIS)

    Rusconi, C.

    2016-01-01

    Epistemic uncertainties can affect operators' capability to prevent rare but potentially catastrophic accident sequences. Safety analysis methodologies are powerful but fragile tools if their basic assumptions are not sound and exhaustive. In particular, expert judgments and technical data can be invalidated by changes in the organizational context (e.g., maintenance planning, supply systems, etc.) or by unexpected events. The 1986 accidents, Chernobyl and the explosion of the Space Shuttle Challenger, and, two years before, the toxic release at the Bhopal chemical plant, represented the point of no return with respect to the previous vision of safety. They highlighted the undelayable need to change paradigm and to face safety issues in complex systems not only from a technical point of view but through a systemic vision able to include and integrate human and organizational aspects.

  1. Improved USGS methodology for assessing continuous petroleum resources

    Science.gov (United States)

    Charpentier, Ronald R.; Cook, Troy A.

    2010-01-01

    This report presents an improved methodology for estimating volumes of continuous (unconventional) oil and gas resources within the United States and around the world. The methodology is based on previously developed U.S. Geological Survey methodologies that rely on well-scale production data. Improvements were made primarily to how the uncertainty about estimated ultimate recoveries is incorporated in the estimates. This is particularly important when assessing areas with sparse or no production data, because the new methodology allows better use of analog data from areas with significant discovery histories.

  2. Improvement of personalized Monte Carlo-aided direct internal contamination monitoring: optimization of calculation times and measurement methodology for the establishment of activity distribution

    International Nuclear Information System (INIS)

    Farah, Jad

    2011-01-01

    To optimize the monitoring of female workers using in vivo spectrometry measurements, it is necessary to correct the typical calibration coefficients obtained with the Livermore male physical phantom. To do so, numerical calibrations based on Monte Carlo simulations combined with anthropomorphic 3D phantoms were used. Such computational calibrations require, on the one hand, the development of representative female phantoms of different sizes and morphologies and, on the other hand, rapid and reliable Monte Carlo calculations. A library of female torso models was hence developed by fitting the weight of internal organs and breasts according to body height and to relevant plastic surgery recommendations. This library was then used to perform a numerical calibration of the AREVA NC La Hague in vivo counting installation. Moreover, the morphology-induced variations of counting efficiency with energy were put into equations, and recommendations were given to correct the typical calibration coefficients for any monitored female worker as a function of body height and breast size. Meanwhile, variance reduction techniques and geometry simplification operations were considered to accelerate the simulations. Furthermore, to determine the activity mapping in the case of complex contaminations, a method that combines Monte Carlo simulations with in vivo measurements was developed. This method consists of performing several spectrometry measurements with different detector positions. Next, the contribution of each contaminated organ to the count is assessed from Monte Carlo calculations. The in vivo measurements performed at LEDI, CIEMAT and KIT have demonstrated the effectiveness of the method and highlighted the valuable contribution of Monte Carlo simulations for a more detailed analysis of spectrometry measurements. Thus, a more precise estimate of the activity distribution is given in the case of an internal contamination. (author)
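
    The described combination of position-dependent measurements with Monte Carlo-computed organ contributions can be read as unfolding a small linear system; the sketch below illustrates that reading with an assumed least-squares formulation and invented numbers.

    ```python
    import numpy as np

    # Hypothetical sketch of the activity-mapping step: counts at several detector
    # positions are modelled as E @ a, where E[i, j] is the Monte Carlo counting
    # efficiency of position i for organ j and a is the unknown activity vector.
    E = np.array([[0.012, 0.003],   # position 1: lungs, liver
                  [0.004, 0.010],   # position 2
                  [0.007, 0.006]])  # position 3
    true_a = np.array([150.0, 80.0])                   # Bq, used to fake counts
    counts = E @ true_a + np.array([0.5, -0.3, 0.2])   # small measurement noise

    a_hat, *_ = np.linalg.lstsq(E, counts, rcond=None)
    print("estimated activities (Bq):", np.round(a_hat, 1))
    ```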

  3. Stochastic Optimization of Supply Chain Risk Measures – A Methodology for Improving Supply Security of Subsidized Fuel Oil in Indonesia

    Directory of Open Access Journals (Sweden)

    Adinda Yuanita

    2015-08-01

    Monte Carlo simulation-based methods for the stochastic optimization of risk measures are required to solve complex problems in the supply security of subsidized fuel oil in Indonesia. In order to overcome constraints in the distribution of subsidized fuel in Indonesia, which has the fourth largest population in the world (more than 250,000,000 people, 66.5% of them of productive age) spread over more than 17,000 islands, with its population centered around the nation's capital only, it is necessary to have a measurable and integrated risk analysis with a monitoring system for the purpose of supply security of subsidized fuel. In consideration of this complex issue, uncertainty and probability heavily affected this research. Therefore, this research performed Monte Carlo sampling-based stochastic simulation optimization with the state-of-the-art "FIRST" parameters combined with sensitivity analysis to determine the priority of integrated risk mitigation handling, so that the implication of the new model design from this research may give faster risk mitigation time. The results of the research identified innovative ideas of risk-based audit on supply chain risk management and new FIRST (Fairness, Independence, Reliable, Sustainable, Transparent) parameters on risk measures. In addition, the integration of risk analysis confirmed the innovative level of priority on sensitivity analysis. Moreover, the findings showed that the new risk mitigation time was 60% faster than the original risk mitigation time.
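
    As a minimal illustration of the Monte Carlo sampling and sensitivity-ranking steps, the sketch below samples an invented fuel-supply model and ranks inputs by their correlation with the shortage output; the paper's FIRST parameters and actual model are not reproduced.

    ```python
    import numpy as np

    # Hypothetical sketch: sample uncertain inputs of a toy supply model and rank
    # risk drivers by correlation with the shortage output. All distributions,
    # names, and numbers are invented.
    rng = np.random.default_rng(1)
    n = 10_000
    demand = rng.normal(100, 10, n)         # daily demand (kL)
    vessel_delay = rng.exponential(2.0, n)  # resupply delay (days)
    depot_stock = rng.normal(250, 25, n)    # stock on hand (kL)

    shortage = np.maximum(demand * vessel_delay - depot_stock, 0.0)
    print("P(shortage) ~", (shortage > 0).mean())

    for name, x in [("demand", demand), ("vessel_delay", vessel_delay),
                    ("depot_stock", depot_stock)]:
        r = np.corrcoef(x, shortage)[0, 1]
        print(f"sensitivity of shortage to {name}: r = {r:+.2f}")
    ```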

  4. Stochastic Optimization of Supply Chain Risk Measures – A Methodology for Improving Supply Security of Subsidized Fuel Oil in Indonesia

    OpenAIRE

    Adinda Yuanita; Andi Noorsaman Sommeng; Anondho Wijonarko

    2015-01-01

    Monte Carlo simulation-based methods for the stochastic optimization of risk measures are required to solve complex problems in the supply security of subsidized fuel oil in Indonesia. In order to overcome constraints in the distribution of subsidized fuel in Indonesia, which has the fourth largest population in the world (more than 250,000,000 people, 66.5% of them of productive age) spread over more than 17,000 islands, with its population centered around the nation's capital only, it is necessary to have a...

  5. Lightweight methodology to improve web accessibility

    CSIR Research Space (South Africa)

    Greeff, M

    2009-10-01

    …to improve score. Colour contrast: Fujitsu ColorSelector [9] requires each colour combination to be selected manually and did not identify colour contrast problems that were highlighted by the other two tools. JuicyStudio Colour Contrast Analyser (Firefox)…, but this is not tested by AccessKeys AccessColor. However, AccessKeys AccessColor provides a link to the specific line in the code where the problem occurs; this is not provided by the JuicyStudio Colour Contrast Analyser. According to these two tools, many colour…

  6. Using Six Sigma and Lean methodologies to improve OR throughput.

    Science.gov (United States)

    Fairbanks, Catharine B

    2007-07-01

    Improving patient flow in the perioperative environment is challenging, but it has positive implications both for staff members and for the facility. One facility in Vermont improved patient throughput by incorporating Six Sigma and Lean methodologies for patients undergoing elective procedures. The results of the project were significantly improved patient flow and increased teamwork and pride among perioperative staff members. (c) AORN, Inc, 2007.

  7. [Improving inpatient pharmacotherapeutic process by Lean Six Sigma methodology].

    Science.gov (United States)

    Font Noguera, I; Fernández Megía, M J; Ferrer Riquelme, A J; Balasch I Parisi, S; Edo Solsona, M D; Poveda Andres, J L

    2013-01-01

    Lean Six Sigma methodology has been used to improve care processes, eliminate waste, reduce costs, and increase patient satisfaction. The aim was to analyse the results obtained with Lean Six Sigma methodology in the diagnosis and improvement of the inpatient pharmacotherapy process during structural and organisational changes in a 1,000-bed tertiary hospital, in a prospective observational study. The define, measure, analyse, improve and control (DMAIC) phases were deployed from March to September 2011. An initial project charter was updated as results were obtained. The study included 131 patients with treatments prescribed within 24 h after admission and with 4 drugs. Outcome measures were safety indicators (medication errors) and efficiency indicators (complaints and time delays). The proportion of patients with a medication error was reduced from 61.0% (25/41 patients) to 55.7% (39/70 patients) in four months. The percentage of errors (relative to the opportunities for error) decreased in the different phases of the process: prescription, from 5.1% (19/372 opportunities) to 3.3% (19/572 opportunities); preparation, from 2.7% (14/525 opportunities) to 1.3% (11/847 opportunities); and administration, from 4.9% (16/329 opportunities) to 3.0% (13/433 opportunities). Nursing complaints decreased from 10.0% (2,119/21,038 patients) to 5.7% (1,779/31,097 patients). The estimated economic impact was 76,800 euros saved. An improvement in the pharmacotherapeutic process and a positive economic impact were observed, as well as enhanced patient safety and efficiency of the organisation. Standardisation and professional training are future Lean Six Sigma candidate projects. Copyright © 2012 SECA. Published by Elsevier España. All rights reserved.

  8. Improving Training in Methodology Enriches the Science of Psychology

    Science.gov (United States)

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2009-01-01

    Replies to the comment "Ramifications of increased training in quantitative methodology" by Herbert Zimiles on the current authors' original article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America". The…

  9. Improving precipitation measurement

    Science.gov (United States)

    Strangeways, Ian

    2004-09-01

    Although rainfall has been measured for centuries scientifically, and in isolated brief episodes over millennia for agriculture, it is still not measured adequately even today for climatology, water resources, and other precise applications. This paper outlines the history of raingauges and their errors, and describes the field testing over 3 years of a first-guess design for an aerodynamic rain collector proposed by Folland in 1988. Although shown to have an aerodynamic advantage over a standard 5-inch gauge, the new rain collector was found to suffer from outsplash in heavy rain. To study this problem, and to derive general basic design rules for aerodynamic gauges, its performance was investigated in turbulent, real-world conditions rather than in the controlled and simplified environment of a wind tunnel or mathematical model as in the past. To do this, video records were made using thread tracers to indicate the path of the wind, giving new insight into the complex flow of natural wind around and within raingauges. A new design resulted, and 2 years of field testing have shown that the new gauge has good aerodynamic and evaporative characteristics and minimal outsplash, offering the potential for improved precipitation measurement.

  10. Development of Six Sigma methodology for CNC milling process improvements

    Science.gov (United States)

    Ismail, M. N.; Rose, A. N. M.; Mohammed, N. Z.; Rashid, M. F. F. Ab

    2017-10-01

    Quality and productivity play an important role in any organization, especially in manufacturing sectors, where they lead to higher profit and to the success of a company. This paper reports a work improvement project at Kolej Kemahiran Tinggi MARA Kuantan. It involves identifying problems in the production of the "Khufi" product and proposing an effective framework to improve the current situation. Based on observation and data collection on the work-in-progress (WIP) product, the major problem identified relates to the function of the product: the parts cannot be assembled properly because the product dimensions are out of specification. Six Sigma was used as a methodology to study and improve the problems identified. Six Sigma is a highly statistical and data-driven approach to solving complex business problems. It uses a methodical five-phase approach, define, measure, analyse, improve and control (DMAIC), to help understand the process and the variables that affect it so that the process can be optimized. Finally, the root cause of and solution to the "Khufi" production problem were identified and implemented, and the product then successfully met the fitting specification.

  11. A Dynamic Methodology for Improving the Search Experience

    Directory of Open Access Journals (Sweden)

    Marcia D. Kerchner

    2006-06-01

    In the early years of modern information retrieval, the fundamental way in which we understood and evaluated search performance was by measuring precision and recall. In recent decades, however, models of evaluation have expanded to incorporate the information-seeking task and the quality of its outcome, as well as the value of the information to the user. We have developed a systems engineering-based methodology for improving the whole search experience. The approach focuses on understanding users’ information-seeking problems, understanding who has the problems, and applying solutions that address these problems. This information is gathered through ongoing analysis of site-usage reports, satisfaction surveys, Help Desk reports, and a working relationship with the business owners.

  12. Methodology for measurement in schools and kindergartens: experiences

    International Nuclear Information System (INIS)

    Fotjikova, I.; Navratilova Rovenska, K.

    2015-01-01

    In more than 1,500 schools and preschool facilities, long-term radon measurement was carried out in the last 3 years. The negative effect of thermal retrofitting on the resulting long-term radon averages is evident. In some of the facilities, low ventilation rates and correspondingly high radon levels were found, so it was recommended to change ventilation habits. However, some of the facilities had high radon levels due to radon ingress from soil gas; technical measures should be undertaken to reduce radon exposure in this case. The paper presents the long-term experience with the two-stage measurement methodology for investigating radon levels in school and preschool facilities and its possible improvements. (authors)

  13. Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities

    International Nuclear Information System (INIS)

    Batandjieva, B.; Torres-Vidal, C.

    2002-01-01

    The International Atomic Energy Agency (IAEA) Coordinated Research Program "Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities" (ISAM) has developed an improved safety assessment methodology for near surface disposal facilities. The program has been underway for three years and has included around 75 active participants from 40 countries. It has also provided examples of application to three safety cases: vault, Radon-type, and borehole radioactive waste disposal facilities. The program has served as an excellent forum for the exchange of information and good practices on safety assessment approaches and methodologies used worldwide. It also provided an opportunity for reaching broad consensus on the safety assessment methodologies to be applied to near surface low- and intermediate-level waste repositories. The methodology has found widespread acceptance, and the need for its application to real waste disposal facilities has been clearly identified. ISAM was finalized by the end of 2000; working material documents are available, and an IAEA report will be published in 2002 summarizing the work performed during the three years of the program. The outcome of the ISAM program provides a sound basis for moving forward to a new IAEA program, which will focus on the practical application of the safety assessment methodologies for different purposes, such as licensing radioactive waste repositories, developing design concepts, upgrading existing facilities, and reassessing operating repositories. The new program will also provide an opportunity for developing guidance on the application of the methodology that will be of assistance to both safety assessors and regulators.

  14. Methodological Challenges in Measuring Child Maltreatment

    Science.gov (United States)

    Fallon, Barbara; Trocme, Nico; Fluke, John; MacLaurin, Bruce; Tonmyr, Lil; Yuan, Ying-Ying

    2010-01-01

    Objective: This article reviewed the different surveillance systems used to monitor the extent of reported child maltreatment in North America. Methods: Key measurement and definitional differences between the surveillance systems are detailed, along with their potential impact on the measurement of the rate of victimization. The infrastructure…

  15. Relative Hazard and Risk Measure Calculation Methodology

    International Nuclear Information System (INIS)

    Stenner, Robert D.; Strenge, Dennis L.; Elder, Matthew S.; Andrews, William B.; Walton, Terry L.

    2003-01-01

    The RHRM equations, as represented in the methodology and code presented in this report, are primarily a collection of key factors normally used in risk assessment that are relevant to understanding the hazards and risks associated with projected mitigation, cleanup, and risk management activities. The RHRM code has broad application potential. For example, it can be used to compare one mitigation, cleanup, or risk management activity with another, instead of comparing it only to the fixed baseline. If the appropriate source term data are available, it can be used in its non-ratio form to estimate absolute values of the associated controlling hazards and risks. These estimated values of controlling hazards and risks can then be examined to help understand which mitigation, cleanup, or risk management activities address the higher-hazard conditions and risk-reduction potential at a site. Graphics can be generated from these absolute controlling hazard and risk values to compare these high-hazard, high-risk-reduction-potential conditions graphically. If the RHRM code is used in this manner, care must be taken to specifically define and qualify (e.g., identify which factors were considered and which ones tended to drive the hazard and risk estimates) the resultant absolute controlling hazard and risk values.

  16. A methodology to measure the degree of managerial innovation

    OpenAIRE

    Ayhan, Mustafa Batuhan; Oztemel, Ercan

    2014-01-01

    Purpose: The main objective of this study is to introduce the concept of managerial innovation and to propose a quantitative methodology to measure the degree of managerial innovation capability by analyzing the evolution of the techniques used for management functions. Design/methodology/approach: The methodology mainly focuses on the different techniques used for each management function, namely: Planning, Organizing, Leading, Controlling and Coordinating. These functions are studied and the...

  17. Improvement by GQM measurement

    NARCIS (Netherlands)

    Solingen, van D.M.; Veenendaal, van E.P.W.M.; Trienekens, J.; Veenendaal, van E.

    1997-01-01

    Software development is a discipline with specific management difficulties. Collecting relevant data during development is a way to overcome these difficulties. Such data collection for software development is termed 'software measurement'. Software measurement is a powerful aid to quality

  18. A Soft Systems Methodology Perspective on Data Warehousing Education Improvement

    OpenAIRE

    R. Goede; E. Taylor

    2012-01-01

    This paper demonstrates how the soft systems methodology can be used to improve the delivery of a module in data warehousing for fourth-year information technology students. Graduates in information technology need to have academic skills but also good practical skills to meet the skills requirements of the information technology industry. In developing and improving current data warehousing education modules one has to find a balance in meeting the expectations of ...

  19. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  20. Guide for prioritizing power plant productivity improvement projects: handbook of availability improvement methodology

    International Nuclear Information System (INIS)

    1981-01-01

    As part of its program to help improve electrical power plant productivity, the Department of Energy (DOE) has developed a methodology for evaluating productivity improvement projects. This handbook presents a simplified version of this methodology called the Availability Improvement Methodology (AIM), which provides a systematic approach for prioritizing plant improvement projects. Also included in this handbook is a description of data taking requirements necessary to support the AIM methodology, benefit/cost analysis, and root cause analysis for tracing persistent power plant problems. In applying the AIM methodology, utility engineers should be mindful that replacement power costs are frequently greater for forced outages than for planned outages. Equivalent availability includes both. A cost-effective ranking of alternative plant improvement projects must discern between those projects which will reduce forced outages and those which might reduce planned outages. As is the case with any analytical procedure, engineering judgement must be exercised with respect to results of purely mathematical calculations

  1. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    Energy Technology Data Exchange (ETDEWEB)

    Tarifeño-Saldivia, Ariel, E-mail: atarifeno@cchen.cl, E-mail: atarisal@gmail.com; Pavez, Cristian; Soto, Leopoldo [Comisión Chilena de Energía Nuclear, Casilla 188-D, Santiago (Chile); Center for Research and Applications in Plasma Physics and Pulsed Power, P4, Santiago (Chile); Departamento de Ciencias Fisicas, Facultad de Ciencias Exactas, Universidad Andres Bello, Republica 220, Santiago (Chile); Mayer, Roberto E. [Instituto Balseiro and Centro Atómico Bariloche, Comisión Nacional de Energía Atómica and Universidad Nacional de Cuyo, San Carlos de Bariloche R8402AGP (Argentina)

    2014-01-15

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by standard nuclear electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
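
    The core of the estimate can be pictured as dividing the integrated burst charge by the mean single-event charge from the pulse-mode calibration; the sketch below is a simplified stand-in with invented numbers, not the paper's full statistical model.

    ```python
    import numpy as np

    # Hypothetical sketch: estimate the number of neutrons in a burst from the
    # integrated charge, given a pulse-mode calibration of the mean charge per
    # single detected event. Numbers are invented; the paper develops a full
    # statistical model, of which this is only the naive core.
    q_pulses = np.array([0.92, 1.10, 1.05, 0.98, 1.01])  # nC per single event
    q_mean, q_std = q_pulses.mean(), q_pulses.std(ddof=1)

    Q_burst = 512.0  # nC, integrated charge collected during the burst
    n_hat = Q_burst / q_mean
    # Crude uncertainty: counting statistics plus calibration spread of the mean.
    rel_unc = np.sqrt(1.0 / n_hat + (q_std / q_mean) ** 2 / len(q_pulses))
    print(f"detected events ~ {n_hat:.0f} +/- {n_hat * rel_unc:.0f}")
    ```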

  2. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    International Nuclear Information System (INIS)

    Tarifeño-Saldivia, Ariel; Pavez, Cristian; Soto, Leopoldo; Mayer, Roberto E.

    2014-01-01

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by standard nuclear electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.

  3. Improvements in measuring apparatus

    International Nuclear Information System (INIS)

    Casey, W.

    1976-01-01

    Measuring apparatus is described that is suitable for gauging the wall profiles of downwardly extending channels in nuclear reactors, but which is equally applicable to channels such as pipe bores and conduits in other types of plant. The apparatus comprises a probe carrying a measuring transducer giving an electrical output. The probe support may be moved stepwise along the channel along a track between end members. An electrical conductor is provided for transmitting the electrical output of the transducer to an indicator located remote from the probe. The probe support may consist of a cable attached at one end to a winding drum, and incorporating an electrical conductor connected to the transducer. Channel engaging means are provided on the probe that permits free upward movement of the probe when the latter is suspended by the cable and moves into gripping engagement with the channel wall when the tension in the cable is relaxed. (U.K.)

  4. Promoting Continuous Quality Improvement in the Alabama Child Health Improvement Alliance Through Q-Sort Methodology and Learning Collaboratives.

    Science.gov (United States)

    Fifolt, Matthew; Preskitt, Julie; Rucks, Andrew; Corvey, Kathryn; Benton, Elizabeth Cason

    Q-sort methodology is an underutilized tool for differentiating among multiple priority measures. The authors describe steps to identify, delimit, and sort potential health measures and use selected priority measures to establish an overall agenda for continuous quality improvement (CQI) activities within learning collaboratives. Through an iterative process, the authors vetted a list of potential child and adolescent health measures. Multiple stakeholders, including payers, direct care providers, and organizational representatives, sorted and prioritized measures using Q-methodology. Q-methodology provided the Alabama Child Health Improvement Alliance (ACHIA) with an objective and rigorous approach to system improvement. Selected priority measures were used to design learning collaboratives. An open dialogue among stakeholders about state health priorities spurred greater organizational buy-in for ACHIA and increased its credibility as a statewide provider of learning collaboratives. The integrated processes of Q-sort methodology, learning collaboratives, and CQI offer a practical yet innovative way to identify and prioritize state measures for child and adolescent health and to establish a learning agenda for targeted quality improvement activities.
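
    As a rough illustration of turning stakeholder Q-sorts into a priority list, the sketch below simply averages sort positions per measure; full Q-methodology factor-analyses the sorts, and the measures and scores here are invented.

    ```python
    from statistics import mean

    # Hypothetical sketch: aggregate stakeholder Q-sorts (positions on a
    # most-disagree..most-agree grid, e.g. -2..+2) into a priority ranking.
    # Real Q-methodology factor-analyses the sorts; this is only a summary.
    sorts = {
        "well-child visit rate": [2, 1, 2, 1],
        "asthma action plan":    [1, 2, 0, 2],
        "BMI screening":         [0, -1, 1, 0],
        "fluoride varnish":      [-1, 0, -2, -1],
    }
    ranking = sorted(sorts, key=lambda m: mean(sorts[m]), reverse=True)
    for measure in ranking:
        print(f"{mean(sorts[measure]):+.2f}  {measure}")
    ```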

  5. Improve Internal Audit Methodology in the Case Company

    OpenAIRE

    Hong Trang Nguyen, Thi

    2016-01-01

    The purpose of this study was to identify improvement areas in the internal audit methodology used by the Internal Audit team at the case company, which is the local subsidiary of a global financial group. The Internal Audit activity of the case company has recently been evaluated by the Institute of Internal Auditors. The overall quality assessment concludes that the Internal Audit activity has a charter, policies and processes that are in conformance with the Mandatory Guidance of the Intern...

  6. IMPROVING METHODOLOGY OF RISK IDENTIFICATION OF OCCUPATIONAL DANGERS

    Directory of Open Access Journals (Sweden)

    A. P. BOCHKOVSKYI

    2018-04-01

    In the paper, based on the analysis of statistical data, the correlation between the number of occupational injuries and occupational diseases in Ukraine within the last 5 years is defined. Also, using the methodology of the International Labour Organization, the correlation between the number of accident fatalities and the general number of accidents in Ukraine and EU countries (Austria, Great Britain, Germany, Denmark, Norway, Poland, Hungary, Finland, France) is defined. It is shown that in spite of the positive dynamics of the decreasing number of occupational injuries, the number of occupational diseases in Ukraine keeps increasing. A comparative analysis of the ratio of the number of accident fatalities to the total number of registered accidents showed that, on average, Ukraine exceeds the EU countries on this indicator by 100 times. It is noted that such negative indicators (in particular, the increasing number of occupational diseases) may occur because of an imperfect methodology for identifying the risks of occupational hazards. Also, it is ascertained that, based on the existing methodology, the identification process for occupational hazards is quite subjective, which reduces the objectivity of quantitative assessment. In order to eliminate the identified drawbacks, it is proposed for the first time to use a corresponding integral criterion to conduct quantitative risk assessment. To solve this problem the authors formulate and propose an algorithm for improving the methodology of analysing dangerous and harmful production effects (DHPE), which are the main causes of occupational hazards. The proposed algorithm includes four successive steps: DHPE identification, indication of their maximum allowed thresholds of concentrations (levels), identification of the sources of the identified DHPE, and estimation of the consequences of their manifestation. The improved methodology allows identifying the risks of occurrence of occupational hazards in systems

  7. Methodology for performing measurements to release material from radiological control

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1993-09-01

    This report describes the existing and proposed methodologies for performing measurements of contamination prior to releasing material for uncontrolled use at the Hanford Site. The technical basis for the proposed methodology, a modification to the existing contamination survey protocol, is also described. The modified methodology, which includes a large-area swipe followed by a statistical survey, can be used to survey material that is unlikely to be contaminated for release to controlled and uncontrolled areas. The material evaluation procedure that is used to determine the likelihood of contamination is also described

  8. Methodologies for Improved Tag Cloud Generation with Clustering

    DEFF Research Database (Denmark)

    Leginus, Martin; Dolog, Peter; Lage, Ricardo Gomes

    2012-01-01

    Tag clouds are useful means for navigation in social web systems. Usually the systems implement tag cloud generation based on tag popularity, which is not always the best method. In this paper we propose methodologies for combining clustering into tag cloud generation to improve coverage and overlap. We study several clustering algorithms to generate tag clouds. We show that by extending cloud generation based on tag popularity with clustering we slightly improve coverage. We also show that if the cloud is generated by clustering independently of the tag popularity baseline we…

  9. Improved FTA methodology and application to subsea pipeline reliability design.

    Science.gov (United States)

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, the Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the useful-life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. The resulting improvements are summarized in a comparison table.

  10. Application of Bow-tie methodology to improve patient safety.

    Science.gov (United States)

    Abdi, Zhaleh; Ravaghi, Hamid; Abbasi, Mohsen; Delgoshaei, Bahram; Esfandiari, Somayeh

    2016-05-09

    Purpose - The purpose of this paper is to apply Bow-tie methodology, a proactive risk assessment technique based on a systemic approach, for prospective analysis of the risks threatening patient safety in the intensive care unit (ICU). Design/methodology/approach - Bow-tie methodology was used to manage clinical risks threatening patient safety by a multidisciplinary team in the ICU. The Bow-tie analysis was conducted on incidents related to high-alert medications, ventilator-associated pneumonia, catheter-related blood stream infection, urinary tract infection, and unwanted extubation. Findings - In total, 48 potential adverse events were analysed. The causal factors were identified and classified into relevant categories. The number and effectiveness of existing preventive and protective barriers were examined for each potential adverse event. The adverse events were evaluated according to the risk criteria, and a set of interventions was proposed with the aim of improving the existing barriers or implementing new barriers. A number of recommendations were implemented in the ICU, considering their feasibility. Originality/value - The application of Bow-tie methodology led to practical recommendations to eliminate or control the hazards identified. It also contributed to a better understanding of the hazard prevention and protection required for safe operations in clinical settings.

  11. A methodology to measure the degree of managerial innovation

    Directory of Open Access Journals (Sweden)

    Mustafa Batuhan Ayhan

    2014-01-01

    Full Text Available Purpose: The main objective of this study is to introduce the concept of managerial innovation and to propose a quantitative methodology to measure the degree of managerial innovation capability by analyzing the evolution of the techniques used for management functions. Design/methodology/approach: The methodology focuses on the different techniques used for each of the management functions, namely Planning, Organizing, Leading, Controlling and Coordinating. These functions are studied and the different techniques used for them are listed. Since the techniques used for these management functions evolve in time due to technological and social changes, a methodology is required to measure the degree of managerial innovation capability. This competency is measured through an analysis that points out which techniques are used for each of these functions. Findings: To check the validity and applicability of this methodology, it is implemented in a manufacturing company. Based on the results of the implementation, enhancements are suggested to the company for each function to survive in the changing managerial conditions. Research limitations/implications: The primary limitation of this study is the implementation area. Although the study is implemented in just a single manufacturing company, the same methodology can be applied to measure the managerial innovation capabilities of other manufacturing companies. Moreover, the model is ready to be adapted to different sectors, although it is mainly prepared for the manufacturing sector. Originality/value: Although innovation management is widely studied, managerial innovation is a new concept, introduced to measure the capability to respond to the changes that occur in managerial functions. In brief, this methodology aims to be a pioneer in the field of managerial innovation regarding the evolution of management functions. It is therefore expected to lead to more studies inspecting the progress of management functions.

  12. QUALITY IMPROVEMENT IN MULTIRESPONSE EXPERIMENTS THROUGH ROBUST DESIGN METHODOLOGY

    Directory of Open Access Journals (Sweden)

    M. Shilpa

    2012-06-01

    Full Text Available Robust design methodology aims at reducing the variability in product performance in the presence of noise factors. Experiments involving simultaneous optimization of more than one quality characteristic are known as multiresponse experiments, which are used in the development and improvement of industrial processes and products. In this paper, robust design methodology is applied to optimize the process parameters during a particular operation of a rotary driving shaft manufacturing process. The three important quality characteristics of the shaft considered here are of type Nominal-the-best, Smaller-the-better and Fraction defective. Simultaneous optimization of these responses is carried out by identifying the control parameters and conducting the experimentation using an L9 orthogonal array.
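
    The standard Taguchi signal-to-noise ratios for the three characteristic types named above can be sketched as follows (textbook formulas; the paper's specific responses and data are not given in the abstract):

```python
import math
from statistics import mean, stdev

def sn_nominal_the_best(y):
    """S/N = 10*log10(mean^2 / variance); larger is better."""
    return 10 * math.log10(mean(y) ** 2 / stdev(y) ** 2)

def sn_smaller_the_better(y):
    """S/N = -10*log10(mean of y^2)."""
    return -10 * math.log10(mean(v * v for v in y))

def sn_fraction_defective(p):
    """Omega transform for a defect fraction 0 < p < 1."""
    return -10 * math.log10(p / (1.0 - p))

# Hypothetical replicate data for one experimental run of the L9 array.
print(sn_nominal_the_best([9.8, 10.1, 10.0]))    # high: on target, low spread
print(sn_smaller_the_better([0.12, 0.08, 0.10])) # high: responses near zero
print(sn_fraction_defective(0.02))               # high: few defectives
```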

  13. Is mindfulness research methodology improving over time? A systematic review.

    Directory of Open Access Journals (Sweden)

    Simon B Goldberg

    Full Text Available Despite an exponential growth in research on mindfulness-based interventions, the body of scientific evidence supporting these treatments has been criticized for being of poor methodological quality. The current systematic review examined the extent to which mindfulness research demonstrated increased rigor over the past 16 years regarding six methodological features that have been highlighted as areas for improvement: the use of active control conditions, larger sample sizes, longer follow-up assessment, treatment fidelity assessment, and the reporting of instructor training and of intent-to-treat (ITT) analyses. We searched PubMed, PsychInfo, Scopus, and Web of Science in addition to a publicly available repository of mindfulness studies. We included randomized clinical trials of mindfulness-based interventions for samples with a clinical disorder, or elevated symptoms of a clinical disorder, listed on the American Psychological Association's list of disorders with recognized evidence-based treatment. Independent raters screened 9,067 titles and abstracts, with 303 full-text reviews. Of these, 171 studies were included, representing 142 non-overlapping samples. Across the 142 studies published between 2000 and 2016, there was no evidence for increases in any study quality indicator, although changes were generally in the direction of improved quality. When restricting the sample to those conducted in Europe and North America (the continents with the longest history of scientific research in this area), an increase in the reporting of ITT analyses was found. When excluding an early, high-quality study, improvements were seen in sample size, treatment fidelity assessment, and reporting of ITT analyses. Taken together, the findings suggest modest adoption of the recommendations for methodological improvement voiced repeatedly in the literature. Possible explanations for this, and implications for interpreting this body of research and conducting future studies, are discussed.

  14. Distributed collaborative team effectiveness: measurement and process improvement

    Science.gov (United States)

    Wheeler, R.; Hihn, J.; Wilkinson, B.

    2002-01-01

    This paper describes a measurement methodology developed for assessing the readiness, and identifying opportunities for improving the effectiveness, of distributed collaborative design teams preparing to conduct a concurrent design session.

  15. Mechanism analysis and evaluation methodology of regenerative braking contribution to energy efficiency improvement of electrified vehicles

    International Nuclear Information System (INIS)

    Lv, Chen; Zhang, Junzhi; Li, Yutong; Yuan, Ye

    2015-01-01

    Highlights: • The energy flow of an electric vehicle with regenerative brake is analyzed. • Methodology for measuring the regen brake contribution is discussed. • Evaluation parameters of regen brake contribution are proposed. • Vehicle tests are carried out on chassis dynamometer. • Test results verify the evaluation method and parameters proposed. - Abstract: This article discusses the mechanism and evaluation methods of the contribution made by regenerative braking to an electric vehicle’s energy efficiency improvement. The energy flow of an electric vehicle considering braking energy regeneration was analyzed. Then, methodologies for measuring the contribution made by the regenerative brake to vehicle energy efficiency improvement were introduced. Based on the energy flow analyzed, two different evaluation parameters were proposed. Vehicle tests were carried out on a chassis dynamometer under typical driving cycles with three different control strategies. The experimental results showed the difference between the two proposed evaluation parameters, and demonstrated the feasibility and effectiveness of the proposed evaluation methodologies.
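
    The abstract does not define the two evaluation parameters, so the sketch below shows two commonly used candidates as an assumption: the share of braking energy recovered and the share of traction energy offset by regeneration, both computed from cycle energy totals:

```python
def regen_contribution(e_regen_kwh, e_braking_kwh, e_traction_kwh):
    """Two illustrative evaluation parameters (the paper's own pair is
    not named in the abstract):
      recovery_ratio  - share of available braking energy regenerated
      consumption_cut - share of traction energy offset by regeneration
    """
    return {
        "recovery_ratio": e_regen_kwh / e_braking_kwh,
        "consumption_cut": e_regen_kwh / e_traction_kwh,
    }

# Hypothetical chassis-dynamometer cycle totals.
print(regen_contribution(e_regen_kwh=1.2, e_braking_kwh=2.0, e_traction_kwh=6.0))
# {'recovery_ratio': 0.6, 'consumption_cut': 0.2}
```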

  16. Measuring the Quality of Publications : New Methodology and Case Study

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; van Groenendaal, W.J.H.

    2000-01-01

    In practice, it is important to evaluate the quality of research in order to make decisions on tenure, funding, and so on. This article develops a methodology using citations to measure the quality of journals, proceedings, and book publishers. (Citations are also used by the Science and Social Science Citation Indexes.)

  17. Graduate students' teaching experiences improve their methodological research skills.

    Science.gov (United States)

    Feldon, David F; Peugh, James; Timmerman, Briana E; Maher, Michelle A; Hurst, Melissa; Strickland, Denise; Gilmore, Joanna A; Stiegelmeyer, Cindy

    2011-08-19

    Science, technology, engineering, and mathematics (STEM) graduate students are often encouraged to maximize their engagement with supervised research and minimize teaching obligations. However, the process of teaching students engaged in inquiry provides practice in the application of important research skills. Using a performance rubric, we compared the quality of methodological skills demonstrated in written research proposals for two groups of early career graduate students (those with both teaching and research responsibilities and those with only research responsibilities) at the beginning and end of an academic year. After statistically controlling for preexisting differences between groups, students who both taught and conducted research demonstrated significantly greater improvement in their abilities to generate testable hypotheses and design valid experiments. These results indicate that teaching experience can contribute substantially to the improvement of essential research skills.

  18. Improvements in backscatter measurement devices

    International Nuclear Information System (INIS)

    Saunders, J.; Hay, W.D.

    1978-01-01

    Improvements in measuring the thickness of a coating on a substrate by the technique of backscattered particles are described. These improvements enable the measurements to be carried out continuously as an integral part of the coating production line and also permit measurements where the coated elements are separated from one another by a predetermined distance. The former is achieved by situating the backscatter probe and detector on the rim of the measurement wheel and rotating this wheel at a speed such that the coated element and probe are stationary relative to one another. The latter improvement is achieved by an indexing apparatus which automatically positions the probe beside a coated element. (U.K.)

  19. Some measurement possibilities for the improvement of IRI

    International Nuclear Information System (INIS)

    Serafimov, K.B.

    1984-01-01

    Some methodological assumptions behind the development of improved measurements for use in the International Reference Ionosphere (IRI) are presented. Attention is given to improving the IRI representation of electron density in the D-region by comparing data from vertical rocket soundings and absorption measurements on multifrequency ionosondes; by the application of absorption measurements for the specification of density profile structure; and by the use of combined rocket and ground-based measurements. The methodological possibilities for improving the IRI distribution of electron densities in the bottomside and topside ionosphere and for the specification of Te(h) profiles are also discussed

  20. Advanced quantitative measurement methodology in physics education research

    Science.gov (United States)

    Wang, Jing

    The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods, and the interpretation of the results they produce, should be connected to the educational context. In this connecting process, the issues of educational models are often raised. Many widely used statistical methods do not make assumptions on the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are also other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumption and parameter estimation, and are complicated mathematically. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods under physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. The dissertation includes three

  1. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    International Nuclear Information System (INIS)

    Tarifeño-Saldivia, Ariel; Pavez, Cristian; Soto, Leopoldo; Mayer, Roberto E

    2015-01-01

    This work introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from detection of the burst of neutrons. An improvement of more than one order of magnitude in the accuracy of a paraffin-wax-moderated ³He-filled tube is obtained by using this methodology with respect to previous calibration methods. (paper)
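
    A minimal sketch of the general idea, assuming a compound-Poisson model in which the burst charge is the sum of single-event charges whose mean and spread come from the pulse-mode calibration (the authors' exact statistical model is not reproduced here):

```python
import math

def burst_yield(q_total, q_mean, q_cv):
    """Estimate the number of detected neutrons in a burst from the
    integrated charge, using single-event charge statistics measured in
    pulse-mode calibration (mean q_mean, coefficient of variation q_cv).
    Returns (estimate, one-sigma uncertainty) under a compound-Poisson model."""
    n_hat = q_total / q_mean
    sigma = math.sqrt(n_hat * (1.0 + q_cv ** 2))
    return n_hat, sigma

# Hypothetical numbers: 8.4e-7 C collected, 2.1e-10 C per detected event, 30 % spread.
n, s = burst_yield(8.4e-7, 2.1e-10, 0.30)
print(f"{n:.0f} +/- {s:.0f} detected events")  # 4000 +/- 66
```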

  2. Risk importance measures in the dynamic flowgraph methodology

    International Nuclear Information System (INIS)

    Tyrväinen, T.

    2013-01-01

    This paper presents new risk importance measures applicable to a dynamic reliability analysis approach with multi-state components. Dynamic reliability analysis methods are needed because traditional methods, such as fault tree analysis, can describe a system's dynamic behaviour only in a limited manner. Dynamic flowgraph methodology (DFM) is an approach used for analysing systems with time dependencies and feedback loops. The aim of DFM is to identify the root causes of a top event, usually representing the system's failure. Components of DFM models are analysed at discrete time points and they can have multiple states. Traditional risk importance measures developed for static and binary logic are not applicable to DFM as such. Some importance measures have previously been developed for DFM, but their ability to describe how components contribute to the top event is fairly limited. The paper formulates dynamic risk importance measures that measure the importance of component states and take the time aspect of DFM into account in a logical way that supports the interpretation of results. The dynamic risk importance measures are developed as generalisations of the Fussell-Vesely importance and the risk increase factor. -- Highlights: • New risk importance measures are developed for the dynamic flowgraph methodology. • Dynamic risk importance measures are formulated for states of components. • An approach to handle failure modes of a component in DFM is presented. • Dynamic risk importance measures take failure times into account. • A component's influence on the system's reliability can be analysed in detail.
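
    For reference, the static, binary definitions being generalized can be sketched as follows; the DFM versions for component states and discrete time points are beyond this illustration. The fault tree and probabilities below are hypothetical:

```python
def top_probability(p_a, p_b, p_c):
    """Hypothetical static fault tree: TOP = (A AND B) OR C, independent basic events."""
    return 1.0 - (1.0 - p_a * p_b) * (1.0 - p_c)

base = {"p_a": 0.1, "p_b": 0.2, "p_c": 0.01}
p_top = top_probability(**base)

for event in base:
    # Fussell-Vesely: fractional risk reduction if the event never occurs.
    fv = 1.0 - top_probability(**{**base, event: 0.0}) / p_top
    # Risk increase factor (risk achievement worth): risk if the event is certain.
    rif = top_probability(**{**base, event: 1.0}) / p_top
    print(f"{event}: FV = {fv:.3f}, RIF = {rif:.2f}")
```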

  3. Improvement of laboratory turnaround time using lean methodology.

    Science.gov (United States)

    Gupta, Shradha; Kapil, Sahil; Sharma, Monica

    2018-05-14

    Purpose The purpose of this paper is to discuss the implementation of lean methodology to reduce the turnaround time (TAT) of a clinical laboratory in a super-speciality hospital. Delays in report delivery lead to delayed diagnosis, increased waiting time and decreased customer satisfaction. A reduction in TAT will lead to increased patient satisfaction, quality of care and employee satisfaction, and ultimately to increased hospital revenue. Design/methodology/approach The generic causes of increasing TAT in clinical laboratories were identified using lean tools and techniques such as value stream mapping (VSM), Gemba, Pareto Analysis and Root Cause Analysis. VSM was used as a tool to analyze the current state of the process, and further VSM was used to design the future state with suggestions for process improvements. Findings This study identified 12 major non-value-added factors for the hematology laboratory and 5 major non-value-added factors for the biochemistry lab which were acting as bottlenecks limiting throughput. A four-month research study by the authors, together with the hospital quality department and laboratory staff members, led to a reduction of the average TAT from 180 to 95 minutes in the hematology lab and from 268 to 208 minutes in the biochemistry lab. Practical implications Very few improvement initiatives in Indian healthcare are based on industrial engineering tools and techniques, which might be due to a lack of interaction between healthcare and engineering. The study provides a positive outcome in terms of improving the efficiency of services in hospitals and identifies a scope for lean in the Indian healthcare sector. Social implications Applying lean in the Indian healthcare sector offers a potential solution to the problem caused by the wide gap between lean accessibility and lean implementation. Lean helped in changing the mindset of an organization toward providing the highest quality of services with faster delivery at

  4. Lean methodology for performance improvement in the trauma discharge process.

    Science.gov (United States)

    O'Mara, Michael Shaymus; Ramaniuk, Aliaksandr; Graymire, Vickie; Rozzell, Monica; Martin, Stacey

    2014-07-01

    High-volume, complex services such as trauma and acute care surgery are at risk for inefficiency. Lean process improvement can reduce health care waste. Lean allows a structured look at processes not easily amenable to analysis. We applied lean methodology to the current state of communication and discharge planning on an urban trauma service, citing areas for improvement. A lean process mapping event was held. The process map was used to identify areas for immediate analysis and intervention, defining metrics for the stakeholders. After intervention, new performance was assessed by direct data evaluation. The process was completed with an analysis of effect and plans made for addressing future focus areas. The primary area of concern identified was interservice communication. Changes centering on a standardized morning report structure reduced the number of consult questions unanswered from 67% to 34% (p = 0.0021). Physical therapy rework was reduced from 35% to 19% (p = 0.016). Patients admitted to units not designated to the trauma service had 1.6 times longer stays. Miscommunication also exists around patient education at discharge. Lean process improvement is a viable means of health care analysis. When applied to a trauma service with 4,000 admissions annually, lean identifies areas ripe for improvement. Our inefficiencies surrounded communication and patient localization. Strategies arising from the input of all stakeholders led to real solutions for communication through a face-to-face morning report and identified areas for ongoing improvement. This focuses resource use and identifies areas for improvement of throughput in care delivery.

  5. Using Lean Six Sigma Methodology to Improve Quality of the Anesthesia Supply Chain in a Pediatric Hospital.

    Science.gov (United States)

    Roberts, Renée J; Wilson, Ashley E; Quezado, Zenaide

    2017-03-01

    Six Sigma and Lean methodologies are effective quality improvement tools in many health care settings. We applied the DMAIC methodology (define, measure, analyze, improve, control) to address deficiencies in our pediatric anesthesia supply chain. We defined supply chain problems by mapping existing processes and soliciting comments from those involved. We used daily distance walked by anesthesia technicians and number of callouts for missing supplies as measurements that we analyzed before and after implementing improvements (anesthesia cart redesign). We showed improvement in the metrics after those interventions were implemented, and those improvements were sustained and thus controlled 1 year after implementation.

  6. A symbolic methodology to improve disassembly process design.

    Science.gov (United States)

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  7. Development of a field measurement methodology for studying the thermal indoor environment in hybrid GEOTABS buildings

    DEFF Research Database (Denmark)

    Kazanci, Ongun Berk; Khovalyg, Dolaana; Olesen, Bjarne W.

    2018-01-01

    The three demonstration buildings were an office building in Luxembourg, an elderly care home in Belgium, and an elementary school in the Czech Republic. All of these buildings are equipped with hybrid GEOTABS systems; however, they vary in size and function, which requires a unique measurement methodology for studying them. These buildings already have advanced Building Management Systems (BMS); however, a more detailed measurement plan was needed for the purposes of the project to document the current performance of these systems regarding thermal indoor environment and energy performance, and to be able to document the improvements after the implementation of the MPC. This study provides the details of the developed field measurement methodology for each of these buildings to study the indoor environmental quality (IEQ) in detail. The developed measurement methodology can be applied to other...

  8. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction is given to plasma simulation using computers, together with the difficulties encountered on currently available computers. Through the use of an analyzing and measuring methodology, SARA, the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization can be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  9. THE QUALITY IMPROVEMENT OF PRIMER PACKAGING PROCESS USING SIX SIGMA METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Prima Ditahardiyani

    2008-01-01

    Full Text Available The implementation of Six Sigma has become a common theme in many organizations. This paper presents the Six Sigma methodology and its implementation in a primer packaging process for a Cranberry drink. The DMAIC (Define, Measure, Analyze, Improve and Control) approach is used to analyze and improve the primer packaging process, which had high variability and defective output. After the improvement, the results showed an increased sigma level; however, the increase is not significant and has not yet achieved world-standard quality. Therefore, the implementation of Six Sigma in the primer packaging process of the Cranberry drink still leaves room for further research.
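
    For context, a sigma level is conventionally derived from the defect rate via the normal quantile plus a 1.5-sigma shift; a minimal sketch with hypothetical packaging counts (not the paper's data):

```python
from statistics import NormalDist

def sigma_level(defects, opportunities):
    """Convert a defect rate to a (short-term) sigma level using the
    conventional 1.5-sigma shift."""
    dpmo = 1e6 * defects / opportunities
    return NormalDist().inv_cdf(1.0 - dpmo / 1e6) + 1.5

# Example: 120 defective packs out of 10,000 filled-and-sealed packs.
print(f"{sigma_level(120, 10_000):.2f} sigma")  # ~3.76
```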

  10. Methodology for interpretation of fissile mass flow measurements

    International Nuclear Information System (INIS)

    March-Leuba, J.; Mattingly, J.K.; Mullens, J.A.

    1997-01-01

    This paper describes a non-intrusive measurement technique to monitor the mass flow rate of fissile material in gaseous or liquid streams. This fissile mass flow monitoring system determines the fissile mass flow rate by relying on two independent measurements: (1) a time delay along a given length of pipe, which is inversely proportional to the fissile material flow velocity, and (2) an amplitude measurement, which is proportional to the fissile concentration (e.g., grams of ²³⁵U per unit length of pipe). The development of this flow monitor was first funded by DOE/NE in September 95, and the initial experimental demonstration by ORNL was described at the 37th INMM meeting held in July 1996. This methodology was chosen by DOE/NE for implementation in November 1996; it has been implemented in hardware/software and is ready for installation. This paper describes the methodology used to interpret the data measured by the fissile mass flow monitoring system and the models used to simulate the transport of fission fragments from the source location to the detectors.
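
    A minimal sketch of the two-measurement principle described above, assuming a hypothetical calibration constant that converts signal amplitude to linear fissile concentration:

```python
def fissile_mass_flow(delay_s, pipe_length_m, amplitude, k_g_per_m):
    """Mass flow = linear concentration x velocity.
    Velocity comes from the measured transit delay over a known pipe length;
    concentration is taken proportional to the signal amplitude via a
    calibration constant k_g_per_m (hypothetical value)."""
    velocity = pipe_length_m / delay_s          # m/s
    concentration = k_g_per_m * amplitude       # g of 235U per metre of pipe
    return concentration * velocity             # g/s

print(fissile_mass_flow(delay_s=2.0, pipe_length_m=4.0, amplitude=0.8, k_g_per_m=1.25))
# 2.0 g/s
```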

  11. Process improvement methodologies uncover unexpected gaps in stroke care.

    Science.gov (United States)

    Kuner, Anthony D; Schemmel, Andrew J; Pooler, B Dustin; Yu, John-Paul J

    2018-01-01

    Background: The diagnosis and treatment of acute stroke requires timed and coordinated effort across multiple clinical teams. Purpose: To analyze the frequency and temporal distribution of emergent stroke evaluations (ESEs) to identify potential contributory workflow factors that may delay the initiation and subsequent evaluation of emergency department stroke patients. Material and Methods: A total of 719 sentinel ESEs with concurrent neuroimaging were identified over a 22-month retrospective time period. Frequency data were tabulated and odds ratios calculated. Results: Of all ESEs, 5% occur between 01:00 and 07:00. ESEs were most frequent during the late morning and early afternoon hours (10:00-14:00). Unexpectedly, there was a statistically significant decline in the frequency of ESEs at the 14:00 time point. Conclusion: Temporal analysis of ESEs in the emergency department allowed us to identify an unexpected decrease in ESEs and, through process improvement methodologies (Lean and Six Sigma), identify potential workflow elements contributing to this observation.
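
    As a sketch of the kind of tabulation described (the counts below are hypothetical, not the study's data), an odds ratio with a Wald 95% confidence interval can be computed from a 2x2 table:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and Wald 95 % CI for a 2x2 table:
        a = exposed cases,    b = exposed non-cases
        c = unexposed cases,  d = unexposed non-cases"""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: ESEs vs other visits in the 14:00 hour and a reference hour.
print(odds_ratio(18, 450, 40, 500))  # (0.5, (0.28, 0.89))
```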

  12. Indoor radon measurements and methodologies in Latin American countries

    International Nuclear Information System (INIS)

    Canoba, A.; Lopez, F.O.; Arnaud, M.I.; Oliveira, A.A.; Neman, R.S.; Hadler, J.C.; Iunes, P.J.; Paulo, S.R.; Osorio, A.M.; Aparecido, R.; Rodriguez, C.; Moreno, V.; Vasquez, R.; Espinosa, G.; Golzarri, J.I.; Martinez, T.; Navarrete, M.; Cabrera, I.; Segovia, N.; Pena, P.; Tamez, E.; Pereyra, P.; Lopez-Herrera, M.E.; Sajo-Bohus, L.

    2001-01-01

    According to the current international guidelines concerning environmental problems, it is necessary to evaluate and know indoor radon levels, especially since most of the natural radiation dose to man comes from radon gas and its progeny. Several countries have established National Institutions and National Programs for the study of radon and its connection with lung cancer risk and public health. The aim of this work is to present the indoor radon measurements and the detection methods used for different regions of Latin America (LA) in countries such as Argentina, Brazil, Ecuador, Mexico, Peru and Venezuela. This study shows that passive radon devices based on alpha-particle nuclear track methodology (NTM) are among the most widely used in LA for long-term indoor radon measurements, with CR-39, LR-115 and Makrofol being the most commonly used detector materials. The participating institutions and the radon level measurements in the different countries are presented in this contribution.

  13. Improvement of test methodology for evaluating diesel fuel stability

    Energy Technology Data Exchange (ETDEWEB)

    Gutman, M.; Tartakovsky, L.; Kirzhner, Y.; Zvirin, Y. [Internal Combustion Engines Lab., Haifa (Israel); Luria, D. [Fuel Authority, Tel Aviv (Israel); Weiss, A.; Shuftan, M. [Israel Defence Forces, Tel Aviv (Israel)

    1995-05-01

    The storage stability of diesel fuel has been extensively investigated for many years under laboratory conditions. Although continuous efforts have been made to improve testing techniques, there does not yet exist a generally accepted correlation between laboratory methods (such as chemical analysis of the fuel) and actual diesel engine tests. A testing method was developed by the Technion Internal Combustion Engines Laboratory (TICEL) in order to address this problem. The test procedure was designed to simulate diesel engine operation under field conditions. It is based on running a laboratory-modified single-cylinder diesel engine for 50 h under cycling operating conditions. The overall rating of each test is based on individual evaluation of the deposits and residue formation in the fuel filter, nozzle body and needle, piston head, piston rings, exhaust valve, and combustion chamber (six parameters). Two methods for analyzing the test results were used: an objective one, based on measured data, and a subjective one, based on visual evaluation of these deposits by a group of experts. Only the residue level in the fuel filter was evaluated quantitatively from measured results. In order to achieve higher accuracy of the method, the test procedure was improved by introducing the measured results of nozzle fouling as an additional (seventh) objective evaluation parameter. This factor is evaluated on the basis of the change in the air flow rate through the nozzle before and after the complete engine test. Other improvements in the method include the use of the nozzle assembly photograph in the test evaluation, and the representation of all seven parameters on a continuous scale instead of the discrete scale used previously, in order to achieve higher accuracy. This paper also contains the results obtained by application of this improved fuel stability test to a diesel fuel stored for a five-year period.

  14. Measuring and improving productivity in general radiology.

    Science.gov (United States)

    Wilt, Michelle A; Miranda, Rafael; Johnson, C Daniel; Love, Peggy Sue

    2010-10-01

    The aim of this study was to determine a method of measuring productivity among general radiographers in a moderate-sized hospital and to improve and sustain productivity within that work area. The average times needed to perform the 13 most common examinations were measured. The number of examinations of each type performed was tracked and multiplied by the time allocated per procedure; this total was divided by the length of the work shift to determine productivity. Productivity measures were shared among the work group, and decisions to improve productivity (eg, whether to fill open positions) were made by group members. Average time spent per examination type was calculated (range, 10 minutes to 1 hour 16 minutes). At baseline (February 2008), group productivity was 50%. Productivity increased during the first year of monitoring and was sustained through November 2009 (productivity range, 57%-63%). Yearly savings from not filling open positions were estimated to be $174,000. Productivity in a general radiology work area can be measured. Consensus among the work group helped increase productivity and assess progress. This methodology, if widely adopted, could be standardized and used to compare productivity across departments and institutions. Copyright © 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.
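
    A minimal sketch of the productivity measure as described, with hypothetical exam counts and standard times:

```python
def shift_productivity(exam_counts, minutes_per_exam, shift_minutes):
    """Fraction of a shift accounted for by standard exam times:
    sum(count_i * standard_minutes_i) / shift length."""
    busy = sum(exam_counts[e] * minutes_per_exam[e] for e in exam_counts)
    return busy / shift_minutes

# Hypothetical shift: counts of three common exams and their standard times.
counts = {"chest": 14, "abdomen": 5, "extremity": 8}
times = {"chest": 10, "abdomen": 15, "extremity": 12}
print(f"{shift_productivity(counts, times, shift_minutes=480):.0%}")  # 65%
```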

  15. Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.

    Science.gov (United States)

    Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun

    2018-01-01

    Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) is used to identify and solve problems. The laboratory turnaround time for individual tests, total delay time in the sample reception area, and percentage of steps involving risks of medical errors and biological hazards in the overall process are measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.

  16. Systematic Review of the Application of Lean and Six Sigma Quality Improvement Methodologies in Radiology.

    Science.gov (United States)

    Amaratunga, Thelina; Dobranowski, Julian

    2016-09-01

    Preventable yet clinically significant rates of medical error remain systemic, while health care spending is at a historic high. Industry-based quality improvement (QI) methodologies show potential for utility in health care and radiology because they use an empirical approach to reduce variability and improve workflow. The aim of this review was to systematically assess the literature with regard to the use and efficacy of Lean and Six Sigma (the most popular of the industrial QI methodologies) within radiology. MEDLINE, the Allied & Complementary Medicine Database, Embase Classic + Embase, Health and Psychosocial Instruments, and the Ovid HealthStar database, alongside the Cochrane Library databases, were searched in June 2015. Empirical studies in peer-reviewed journals were included if they assessed the use of Lean, Six Sigma, or Lean Six Sigma with regard to their ability to improve a variety of quality metrics in a radiology-centered clinical setting. Of the 278 articles returned, 23 studies were suitable for inclusion. Of these, 10 assessed Six Sigma, 7 assessed Lean, and 6 assessed Lean Six Sigma. The diverse range of measured outcomes can be organized into 7 common aims: cost savings, reducing appointment wait time, reducing in-department wait time, increasing patient volume, reducing cycle time, reducing defects, and increasing staff and patient safety and satisfaction. All of the included studies demonstrated improvements across a variety of outcomes. However, there were high rates of systematic bias and imprecision as per the Grading of Recommendations Assessment, Development and Evaluation guidelines. Lean and Six Sigma QI methodologies have the potential to reduce error and costs and improve quality within radiology. However, there is a pressing need to conduct high-quality studies in order to realize the true potential of these QI methodologies in health care and radiology. Recommendations on how to improve the quality of the literature are proposed.

  17. Measuring and improving customer satisfaction with government services

    Science.gov (United States)

    Glen D. Alexander

    1995-01-01

    Two years ago, Ohio State Park developed a methodology for measuring customer satisfaction to gauge the effectiveness of our customer service. What follows is a discussion of our installation of systems to measure and improve customer satisfaction, the interpretation of the data, and the positive results we have enjoyed.

  18. Methodological aspects of EEG and Body dynamics measurements during motion.

    Directory of Open Access Journals (Sweden)

    Pedro eReis

    2014-03-01

    Full Text Available EEG involves the recording, analysis, and interpretation of voltages recorded on the human scalp originating from brain grey matter. EEG is one of the favorite methods for studying and understanding the processes that underlie behavior. This is so because EEG is relatively cheap, easy to wear, lightweight, and has high temporal resolution. In terms of behavior, this encompasses actions, such as movements, that are performed in response to the environment. However, there are methodological difficulties when recording EEG during movement, such as movement artifacts. Thus, most studies about the human brain have examined activations during static conditions. This article attempts to compile and describe relevant methodological solutions that emerged in order to measure body and brain dynamics during motion. These descriptions cover suggestions of how to avoid and reduce motion artifacts, as well as hardware, software and techniques for synchronously recording EEG, EMG, kinematics, kinetics and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps and methods for the determination of real/custom electrode positions. We conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks.

  19. Improved cook stove adoption and impact assessment: A proposed methodology

    International Nuclear Information System (INIS)

    Troncoso, Karin; Armendáriz, Cynthia; Alatorre, Silvia

    2013-01-01

    Aims: Until now, the success of improved cook stove (ICS) implementation programs has usually been measured by the number of ICS distributed. Some important research has been conducted to try to determine the effects of the use of an ICS on users' health, but these studies are expensive and time-consuming. Moreover, no evaluations show the impact of the technology on users' lives. This study seeks to contribute to filling this gap. Scope: By applying cluster analysis techniques to survey data, the most relevant variables that explain adoption and impact were identified. Using these variables, two qualitative indexes are proposed: The adoption index considers the use of the new technology, the level of satisfaction, and the conditions of the stove. The impact index considers the changes in cooking practices and life quality brought about by the ICS. Both indexes are then applied to two implementation programs. The indexes show the differences between the program results and the users' perceptions of each technology. Conclusions: The proposed indexes can be used to measure the success of an ICS implementation program in terms of the benefits perceived by the users of these technologies. -- Highlights: •Two qualitative indexes are proposed to measure the benefits perceived by ICS users. •Two implementation programs were assessed. •The approach enables determining the impact of ICS programs at a fraction of the cost. •It enables comparing the results of different implementation programs.

  20. Relative Hazard and Risk Measure Calculation Methodology Rev 1

    International Nuclear Information System (INIS)

    Stenner, Robert D.; White, Michael K.; Strenge, Dennis L.; Aaberg, Rosanne L.; Andrews, William B.

    2000-01-01

    Documentation of the methodology used to calculate relative hazard and risk measure results for the DOE complex-wide risk profiles. This methodology is used on major site risk profiles. In February 1997, the Center for Risk Excellence (CRE) was created and charged as a technical, field-based partner to the Office of Science and Risk Policy (EM-52). One of the initial charges to the CRE is to assist the sites in the development of "site risk profiles." These profiles are to be relatively short summaries (periodically updated) that present a broad perspective on the major risk-related challenges that face the respective site. The risk profiles are intended to serve as a high-level communication tool for interested internal and external parties to enhance the understanding of these risk-related challenges. The risk profiles for each site have been designed to qualitatively present the following information: (1) a brief overview of the site, (2) a brief discussion on the historical mission of the site, (3) a quote from the site manager indicating the site's commitment to risk management, (4) a listing of the site's top risk-related challenges, (5) a brief discussion and detailed table presenting the site's current risk picture, (6) a brief discussion and detailed table presenting the site's future risk reduction picture, and (7) graphic illustrations of the projected management of the relative hazards at the site. The graphic illustrations were included to provide the reader of the risk profiles with a high-level mental picture to associate with all the qualitative information presented in the risk profile. Inclusion of these graphic illustrations presented the CRE with the challenge of how to fold this high-level qualitative risk information into a system to produce a numeric result that would depict the relative change in hazard, associated with each major risk management action, so it could be presented graphically. This report presents the methodology developed.

  1. Methodological considerations for measuring glucocorticoid metabolites in feathers

    Science.gov (United States)

    Berk, Sara A.; McGettrick, Julie R.; Hansen, Warren K.; Breuner, Creagh W.

    2016-01-01

    In recent years, researchers have begun to use corticosteroid metabolites in feathers (fCORT) as a metric of stress physiology in birds. However, there remain substantial questions about how to measure fCORT most accurately. Notably, small samples contain artificially high amounts of fCORT per millimetre of feather (the small sample artefact). Furthermore, it appears that fCORT is correlated with circulating plasma corticosterone only when levels are artificially elevated by the use of corticosterone implants. Here, we used several approaches to address current methodological issues with the measurement of fCORT. First, we verified that the small sample artefact exists across species and feather types. Second, we attempted to correct for this effect by increasing the amount of methanol relative to the amount of feather during extraction. We consistently detected more fCORT per millimetre or per milligram of feather in small samples than in large samples even when we adjusted methanol:feather concentrations. We also used high-performance liquid chromatography to identify hormone metabolites present in feathers and measured the reactivity of these metabolites against the most commonly used antibody for measuring fCORT. We verified that our antibody is mainly identifying corticosterone (CORT) in feathers, but other metabolites have significant cross-reactivity. Lastly, we measured faecal glucocorticoid metabolites in house sparrows and correlated these measurements with corticosteroid metabolites deposited in concurrently grown feathers; we found no correlation between faecal glucocorticoid metabolites and fCORT. We suggest that researchers should be cautious in their interpretation of fCORT in wild birds and should seek alternative validation methods to examine species-specific relationships between environmental challenges and fCORT. PMID:27335650

  2. Measurements of Gluconeogenesis and Glycogenolysis: A Methodological Review.

    Science.gov (United States)

    Chung, Stephanie T; Chacko, Shaji K; Sunehag, Agneta L; Haymond, Morey W

    2015-12-01

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to determine gluconeogenesis is by measuring the incorporation of deuterium from the body water pool into newly formed glucose. However, several techniques using radioactive and stable-labeled isotopes have been used to quantitate the contribution and regulation of gluconeogenesis in humans. Each method has its advantages, methodological assumptions, and set of propagated errors. In this review, we examine the strengths and weaknesses of the most commonly used stable isotopes methods to measure gluconeogenesis in vivo. We discuss the advantages and limitations of each method and summarize the applicability of these measurements in understanding normal and pathophysiological conditions. © 2015 by the American Diabetes Association. Readers may use this article as long as the work is properly cited, the use is educational and not for profit, and the work is not altered.

  3. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    Science.gov (United States)

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that although narrative inquiry as a research methodology entails free conversations and personal stories, it should not be totally free and fictional, as it has to conform to recognized standards used for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  4. Measurement of Quality of Life I. A Methodological Framework

    Directory of Open Access Journals (Sweden)

    Soren Ventegodt

    2003-01-01

    Full Text Available Despite the widespread acceptance of quality of life (QOL) as the ideal guideline in healthcare and clinical research, serious conceptual and methodological problems continue to plague this area. In an attempt to remedy this situation, we propose seven criteria that a quality-of-life concept must meet to provide a sound basis for investigation by questionnaire. The seven criteria or desiderata are: (1) an explicit definition of quality of life; (2) a coherent philosophy of human life from which the definition is derived; (3) a theory that operationalizes the philosophy by specifying unambiguous, nonoverlapping, and jointly exhaustive questionnaire items; (4) response alternatives that permit a fraction-scale interpretation; (5) technical checks of reproducibility; (6) meaningfulness to investigators, respondents, and users; and (7) an overall aesthetic appeal of the questionnaire. These criteria have guided the design of a validated 5-item generic, global quality-of-life questionnaire (QOL5), and a validated 317-item generic, global quality-of-life questionnaire (SEQOL), administered to a well-documented birth cohort of 7,400 Danes born in 1959–1961, as well as to a reference sample of 2,500 Danes. Presented in outline, the underlying integrative quality-of-life (IQOL) theory is a meta-theory. To illustrate the seven criteria at work, we show the extent to which they are satisfied by one of the eight component theories. Next, two sample results of our investigation are presented: satisfaction with one's sex life has the expected covariation with one's quality of life, and so does mother's smoking during pregnancy, albeit to a much smaller extent. It is concluded that the methodological framework presented has proved helpful in designing a questionnaire that is capable of yielding acceptably valid and reliable measurements of global and generic quality of life.

  5. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Xiao-Ying; Yao, Juan; He, Hua; Glantz, Clifford S.; Booth, Alexander E.

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  6. Application of Taguchi methodology to improve the functional quality of a mechanical device

    International Nuclear Information System (INIS)

    Regeai, Awatef Omar

    2005-01-01

    Manufacturing and quality control are recognized branches of engineering management. Special attention has been paid to improving the tools and methods used to improve product quality and to find solutions for any obstacles and/or problems during the production process. Taguchi methodology is one of the most powerful techniques for improving product and manufacturing process quality at low cost. It is a strategic and practical method that aims to assist managers and industrial engineers in tackling manufacturing quality problems in a systematic and structured manner. The potential benefit of Taguchi methodology lies in its ease of use, its emphasis on reducing variability to give more economical products, and hence its accessibility to the engineering fraternity for solving real-life quality problems. This study applies Taguchi methodology to improve the functional quality of a locally made chain gear by a proposed heat treatment process. The hardness of steel is generally a function not of its composition alone, but rather of its heat treatment. The study investigates the effects of various heat treatment parameters, including ramp rate of heating, normalizing holding time, normalizing temperature, annealing holding time, annealing temperature, hardening holding time, hardening temperature, quenching media, tempering temperature and tempering holding time, upon the hardness, which is a measure of resistance to plastic deformation. Both analysis of means (ANOM) and signal-to-noise (S/N) ratio analysis were carried out to determine the optimal condition of the process. A significant improvement of the functional quality characteristic (hardness) by more than 32% was obtained. The Scanning Electron Microscopy technique was used in this study to obtain visual evidence of the quality and continuous improvement of the heat-treated samples. (author)

  7. Developing a lean measurement system to enhance process improvement

    Directory of Open Access Journals (Sweden)

    Lewis P.

    2013-01-01

    Full Text Available A key ingredient to underpin process improvement is a robust, reliable, repeatable measurement system. Process improvement activity needs to be supported by accurate and precise data, because effective decision making within process improvement activity demands the use of “hard” data. One of the oldest and most established process improvement methods is Deming’s Plan-Do-Check-Act (PDCA) cycle, which is reliant on the check phase, a measurement activity where data is gathered and evaluated. Recent expansions of the PDCA cycle, such as the Six Sigma Define-Measure-Analyse-Improve-Control (DMAIC) methodology, place significant importance upon measurement. The DMAIC cycle incorporates the regimented requirement for the inclusion of measurement system analysis (MSA) into the breakthrough strategy. The call for MSA within the DMAIC cycle is to provide the improvement activity with a robust measurement system that will ensure a pertinent level of data during any validation process. The Lean methodology is heavily centred on the removal of the seven Mudas (wastes) from a manufacturing process: defects, overproduction, transportation, waiting, inventory, motion and processing. The application of Lean, particularly within the manufacturing industry, has led to a perception that measurement is a waste within a manufacturing process because measurement processes identify defective products. The metrologists’ pursuit of measurement excellence could be construed as a hindrance by the “cost down” demands emanating from the same organisation’s lean policy. So what possible benefits does enforcing the regimes of the lean and quality philosophies upon the measurement process have, and how does this ultimately enhance the process improvement activity? The key fundamental to embed within any process improvement is the removal of waste. The process improvement techniques embedded within lean and quality concepts are extremely powerful practices in the
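
    As a hedged illustration of what an MSA can contribute, the sketch below decomposes measurement variation into repeatability, reproducibility and part-to-part components in a deliberately simplified way (not the full AIAG gauge R&R ANOVA); the data are hypothetical:

```python
from statistics import mean, pvariance

def gauge_rr(data):
    """Very simplified measurement system analysis sketch:
    repeatability  - variance within operator/part cells
    reproducibility - variance between operator means
    part variation  - variance between part means
    data[operator][part] -> list of repeated readings."""
    cells = [trials for parts in data.values() for trials in parts.values()]
    repeatability = mean(pvariance(c) for c in cells)
    op_means = [mean(t for trials in parts.values() for t in trials)
                for parts in data.values()]
    reproducibility = pvariance(op_means)
    part_ids = next(iter(data.values())).keys()
    part_means = [mean(t for op in data.values() for t in op[p]) for p in part_ids]
    part_var = pvariance(part_means)
    grr = repeatability + reproducibility
    total = grr + part_var
    return {"%GRR": 100 * (grr / total) ** 0.5}

data = {
    "op1": {"p1": [10.1, 10.0], "p2": [12.0, 12.2]},
    "op2": {"p1": [10.3, 10.2], "p2": [12.3, 12.1]},
}
print(gauge_rr(data))  # ~10.8 %GRR: measurement noise is small vs part spread
```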

  8. Methodological Improvements in Combining TMS and Functional MRI

    OpenAIRE

    Moisa, Marius

    2011-01-01

    Since 1997, when Bohning and colleagues demonstrated for the first time the feasibility of interleaving transcranial magnetic stimulation (TMS) with blood oxygenation level dependent functional magnetic resonance imaging (BOLD fMRI), this combination has become a very promising technique for studying brain connectivity. However, the implementation of a reliable setup for interleaved TMS/fMRI is still technically challenging. In this thesis, I intended to further explore and develop methodological i...

  9. Establishment of Assessment Methodology Improvement of IAEA INPRO Proliferation Resistance

    International Nuclear Information System (INIS)

    Park, J. H.; Yang, M. S.; Song, K. C.; Ko, W. I.; Kim, H. D.; Kim, Y. I.; Rhee, B. W.; Kim, H. T.

    2008-03-01

    For the development of an assessment methodology for the acquisition and diversion pathways of nuclear material, the PR assessment methodology developed by the GEN-IV PR and PP group was reviewed with respect to the acquisition and diversion pathways of nuclear material. We proposed the research areas needed to develop a model of the acquisition and diversion pathways of nuclear material, including misuse of fuel cycle facilities, and one of the IAEA INPRO CRPs that aims to develop such a model. From the present study, a preliminary model of the acquisition and diversion pathways of nuclear material was obtained. For a preliminary evaluation of the DUPIC system using the acquisition/diversion pathway methodology and a review of the pyro-processing system characteristics, the research direction and work procedure were established to develop the assessment methodology of User Requirement 4 of INPRO PR by: 1) selecting the possible pathways to acquire and divert the nuclear material of the DUPIC system, 2) analyzing the selected pathways, and 3) developing the assessment methodology for the robustness and multiplicity of an INS. In addition, the PR characteristics and the process/material flow of the pyro-processing system were studied on a preliminary basis. To establish an R&D direction for an INS and to support international cooperative research, the collaborative research project titled 'Acquisition and Diversion Pathway Analysis of Proliferation Resistance', one of the activities of the IAEA INPRO, was proposed, since the Korean Government had decided to actively support the IAEA INPRO. In order to review and clarify the Terms of Reference (TOR) of the Korean Proposed Collaborative Project (ROK1), two INPRO Consultancy Meetings were held. Their results were presented at two INPRO Steering Committees, and the finalized TOR of the Korean Proposal was submitted to the 12th INPRO Steering Committee Meeting, held Dec. 3-5, 2007. Four participants including the USA, Canada, China and the European Community (EC) have decided

  11. Measuring service line competitive position. A systematic methodology for hospitals.

    Science.gov (United States)

    Studnicki, J

    1991-01-01

    To mount a broad effort aimed at improving their competitive position for some service or group of services, hospitals have begun to pursue product line management techniques. A few hospitals have even reorganized completely under the product line framework. The benefits include focusing accountability for operations and results, facilitating coordination between departments and functions, stimulating market segmentation, and promoting rigorous examination of new and existing programs. As part of its strategic planning process, a suburban Baltimore hospital developed a product line management methodology with six basic steps: (1) define the service lines (which they did by grouping all existing diagnosis-related groups into 35 service lines), (2) determine the contribution of each service line to total inpatient volume, (3) determine trends in service line volumes (by comparing data over time), (4) derive a useful comparison group (competing hospitals or groups of hospitals with comparable size, scope of services, payer mix, and financial status), (5) review multiple time frames, and (6) summarize the long- and short-term performance of the hospital's service lines to focus further analysis. This type of systematic and disciplined analysis can become part of a permanent strategic intelligence program. When hospitals have such a program in place, their market research, planning, budgeting, and operations will be tied together in a true management decision support system.

  12. Adaptability of laser diffraction measurement technique in soil physics methodology

    Science.gov (United States)

    Barna, Gyöngyi; Szabó, József; Rajkai, Kálmán; Bakacsi, Zsófia; Koós, Sándor; László, Péter; Hauk, Gabriella; Makó, András

    2016-04-01

    There are intentions all around the world to harmonize soils' particle size distribution (PSD) data obtained by laser diffractometer measurements (LDM) with those of the sedimentation techniques (pipette or hydrometer methods). Unfortunately, depending on the applied methodology (e.g. type of pre-treatment, kind of dispersant, etc.), PSDs from the sedimentation methods (which follow different standards) are dissimilar and can hardly be harmonized with each other. A need therefore arose to build a database containing PSD values measured both by the pipette method according to the Hungarian standard (MSZ-08.0205: 1978) and by LDM according to a widespread, widely used procedure. In this publication the first results of the statistical analysis of the new and growing PSD database are presented: 204 soil samples measured with the pipette method and LDM (Malvern Mastersizer 2000, HydroG dispersion unit) were compared. Applying the usual size limits to the LDM data, the clay fraction was highly underestimated and the silt fraction overestimated compared to the pipette method; consequently, soil texture classes determined from the LDM measurements differ significantly from the pipette results. Following earlier surveys, and in order to optimize the fit between the two datasets, the clay/silt boundary used for LDM was changed. With the modified size limits, the clay and silt fractions from the two methods agreed more closely: extending the upper size limit of the clay fraction from 0.002 to 0.0066 mm (thereby shifting the lower limit of the silt fraction) makes the pipette method and LDM more readily comparable. With the modified limit, higher correlations were also found between clay content and water vapor adsorption and specific surface area, and the texture classes were less dissimilar. The difference between the results of the two kinds of PSD measurement methods could be further reduced knowing other routinely analyzed soil parameters
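
    The boundary shift described above amounts to re-reading the cumulative PSD curve at a different diameter. The sketch below, with an invented cumulative laser-diffraction curve and an assumed 0.05 mm silt/sand limit, shows how moving the clay/silt boundary from 0.002 mm to 0.0066 mm changes the reported fractions.

```python
# Sketch of the clay/silt boundary shift described above. The cumulative
# laser-diffraction PSD below is invented; the 0.05 mm silt/sand limit is
# an assumption, not taken from the study.
import numpy as np

diam_mm = np.array([0.0005, 0.001, 0.002, 0.0066, 0.02, 0.05, 0.063, 2.0])
cum_pct = np.array([8.0,    12.0,  17.0,  31.0,   55.0, 78.0, 83.0,  100.0])

def fractions(clay_limit_mm, silt_limit_mm=0.05):
    """Clay/silt/sand percentages read off the cumulative curve
    (linear interpolation in log-diameter space)."""
    clay = np.interp(np.log10(clay_limit_mm), np.log10(diam_mm), cum_pct)
    silt_top = np.interp(np.log10(silt_limit_mm), np.log10(diam_mm), cum_pct)
    return clay, silt_top - clay, 100.0 - silt_top

for limit in (0.002, 0.0066):   # standard vs. modified clay/silt boundary
    clay, silt, sand = fractions(limit)
    print(f"limit {limit} mm: clay {clay:.1f}%, silt {silt:.1f}%, sand {sand:.1f}%")
```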

  13. Personal dosimetry service of TECNATOM: measurement system and methodology of calibration

    International Nuclear Information System (INIS)

    Marchena, Paloma; Bravo, Borja

    2008-01-01

    Full text: The implementation of a new integrated and practical working tool called ALEDIN within the Personal Dosimetry Service (PDS) of TECNATOM has harmonized the methodology for counting acquisition, detector calibration and data analysis using a friendly Windows (registered mark) environment. Knowledge of this methodology, given that it is the final product of an R and D project, will help users and the Regulatory Body better understand the measurement of internal activity in individuals, allowing more precise error identification and correction and improving the whole internal dosimetry process. The development and implementation of a new calibration system for the whole body counters using NaI (Tl) detectors, together with a new humanoid anthropometric phantom of the BOMAB type with a uniform radioactive source distribution, allow better energy and activity calibration for different counting geometries, covering a wide range of gamma spectra from low energies below 100 keV up to about 2000 keV. This new calibration methodology implied the development of an improved system for the determination of isotopic activity. The new system has been integrated in a Windows (registered mark) environment, applicable to counting acquisition and data analysis in the whole body counters (WBC), in cross connection with the INDAC software, which allows interpretation of the measured activity as committed effective dose following all the new ICRP recommendations and dosimetric models for internal dose and bioassay measurements. (author)

  14. Speciated arsenic in air: measurement methodology and risk assessment considerations.

    Science.gov (United States)

    Lewis, Ari S; Reid, Kim R; Pollock, Margaret C; Campleman, Sharan L

    2012-01-01

    Accurate measurement of arsenic (As) in air is critical to providing a more robust understanding of arsenic exposures and associated human health risks. Although there is extensive information available on total arsenic in air, less is known on the relative contribution of each arsenic species. To address this data gap, the authors conducted an in-depth review of available information on speciated arsenic in air. The evaluation included the type of species measured and the relative abundance, as well as an analysis of the limitations of current analytical methods. Despite inherent differences in the procedures, most techniques effectively separated arsenic species in the air samples. Common analytical techniques such as inductively coupled plasma mass spectrometry (ICP-MS) and/or hydride generation (HG)- or graphite furnace (GF)-atomic absorption spectrometry (AAS) were used for arsenic measurement in the extracts, and provided some of the most sensitive detection limits. The current analysis demonstrated that, despite limited comparability among studies due to differences in seasonal factors, study duration, sample collection methods, and analytical methods, research conducted to date is adequate to show that arsenic in air is mainly in the inorganic form. Reported average concentrations of As(III) and As(V) ranged up to 7.4 and 10.4 ng/m3, respectively, with As(V) being more prevalent than As(III) in most studies. Concentrations of the organic methylated arsenic compounds are negligible (in the pg/m3 range). However, because of the variability in study methods and measurement methodology, the authors were unable to determine the variation in arsenic composition as a function of source or particulate matter (PM) fraction. In this work, the authors also consider the implications of arsenic speciation in air for potential exposure and risks. The authors conclude that it is important to synchronize sample collection, preparation, and analytical techniques in order to generate

  15. The Assessment Methodology RADAR – A Theoretical Approach of a Methodology for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence

    Directory of Open Access Journals (Sweden)

    Cristina Raluca Popescu

    2015-05-01

    Full Text Available In the paper “The Assessment Methodology RADAR – A Theoretical Approach of a Methodology for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence” the authors present the basic features of the assessment methodology RADAR that is designed to coordinate the efforts to improve the organizational processes in order to achieve excellence.

  16. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    Directory of Open Access Journals (Sweden)

    Johnston Marie

    2007-03-01

    Full Text Available Abstract Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure was developed would be a useful additional criterion in assessing its value. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of (i) its theoretical framework (was there a definition of what was being assessed, and was it part of a theoretical model?) and (ii) its methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format, and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest), and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method; the DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of theoretical framework for both clinician report and patient self-report measures, and drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological

  17. Aqueduct: a methodology to measure and communicate global water risks

    Science.gov (United States)

    Gassert, Francis; Reig, Paul

    2013-04-01

    The Aqueduct Water Risk Atlas (Aqueduct) is a publicly available, global database and interactive tool that maps indicators of water related risks for decision makers worldwide. Aqueduct makes use of the latest geo-statistical modeling techniques to compute a composite index and translate the most recently available hydrological data into practical information on water related risks for companies, investors, and governments alike. Twelve global indicators are grouped into a Water Risk Framework designed in response to the growing concerns from private sector actors around water scarcity, water quality, climate change, and increasing demand for freshwater. The Aqueduct framework organizes indicators into three categories of risk that bring together multiple dimensions of water related risk into comprehensive aggregated scores and includes indicators of water stress, variability in supply, storage, flood, drought, groundwater, water quality and social conflict, addressing both spatial and temporal variation in water hazards. Indicators are selected based on relevance to water users, availability and robustness of global data sources, and expert consultation, and are collected from existing datasets or derived from a Global Land Data Assimilation System (GLDAS) based integrated water balance model. Indicators are normalized using a threshold approach, and composite scores are computed using a linear aggregation scheme that allows for dynamic weighting to capture users' unique exposure to water hazards. By providing consistent scores across the globe, the Aqueduct Water Risk Atlas enables rapid comparison across diverse aspects of water risk. Companies can use this information to prioritize actions, investors to leverage financial interest to improve water management, and governments to engage with the private sector to seek solutions for more equitable and sustainable water governance. The Aqueduct Water Risk Atlas enables practical applications of scientific data
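
    The "threshold approach" to normalisation and the "linear aggregation scheme with dynamic weighting" mentioned above fit in a few lines of code. In the sketch below, the indicator names, thresholds, raw values and weights are all invented; they are not the actual Aqueduct inputs.

```python
# Illustrative composite water-risk score: threshold normalisation of raw
# indicators onto a 0-5 scale, then weighted linear aggregation. All numbers
# are invented; they are not the real Aqueduct thresholds or weights.

# (lower, upper) raw-value thresholds mapping each indicator onto 0..5
thresholds = {"water_stress": (0.1, 0.8), "drought": (0.0, 1.0), "flood": (0.0, 50.0)}
raw = {"water_stress": 0.55, "drought": 0.3, "flood": 12.0}
weights = {"water_stress": 0.5, "drought": 0.3, "flood": 0.2}  # user-tunable

def normalise(value, lower, upper):
    """Clamp into [lower, upper], then rescale linearly onto 0..5."""
    clipped = min(max(value, lower), upper)
    return 5.0 * (clipped - lower) / (upper - lower)

scores = {name: normalise(raw[name], *thresholds[name]) for name in raw}
composite = sum(weights[k] * scores[k] for k in scores) / sum(weights.values())
print(scores)
print(f"composite risk = {composite:.2f}")
```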

  18. Improved Methodology for Benefit Estimation of Preservation Projects

    Science.gov (United States)

    2018-04-01

    This research report presents an improved process for evaluating the benefits and economic tradeoffs associated with a variety of highway preservation projects. It includes a summary of results from a comprehensive phone survey concerning the use and...

  19. Measuring and improving infrastructure performance

    National Research Council Canada - National Science Library

    Committee on Measuring and Improving Infrastructure Performance, National Research Council

    .... Developing a framework for guiding attempts at measuring the performance of infrastructure systems and grappling with the concept of defining good performance are the major themes of this book...

  20. THE INTRODUCTION OF THE METHODOLOGY TO IMPROVE ROAD SAFETY

    Directory of Open Access Journals (Sweden)

    D. V. Kapsky

    2013-01-01

    Full Text Available Recommendations are given for improving road safety and traffic quality at signal-controlled junctions (crossings) with respect to individual parameters of traffic light control: improving traffic light control by optimizing the length of the transition interval in the signal cycle; increasing awareness and giving drivers early warning of an upcoming signal change; separating vehicle and pedestrian flows; and addressing road conditions, transportation planning and technical means of traffic management. Recommendations are also given for the use of speed humps in settlements, etc.

  1. Methodology for Measuring the Complexity of Enterprise Information Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-07-01

    Full Text Available The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from a few exceptions, do not attempt to simplify it. This article presents the author's own methodology for managing complexity, which can be used to complement any other methodology and which helps limit the growth of complexity. It introduces its own definition and metric of complexity, defined as the sum of the entities of the individual UML models of the given system, selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology to manage information system complexity and to verify it in practice on a real-life SAP implementation project.

  2. Methodologies for Improving Flight Project Information Capture, Storage, and Dissemination

    Science.gov (United States)

    Equils, Douglas J.

    2011-01-01

    This paper will discuss the drawbacks and risks of the current documentation paradigm, how Document QuickStart improves on that process, and ultimately how this streamlined approach will reduce risk and costs for the next generation of Flight Projects at JPL.

  3. New Teaching Techniques to Improve Critical Thinking. The Diaprove Methodology

    Science.gov (United States)

    Saiz, Carlos; Rivas, Silvia F.

    2016-01-01

    The objective of this research is to ascertain whether new instructional techniques can improve critical thinking. To achieve this goal, two different instruction techniques (ARDESOS--group 1--and DIAPROVE--group 2--) were studied and a pre-post assessment of critical thinking in various dimensions such as argumentation, inductive reasoning,…

  4. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  5. Design, implementation and licensing of improvement measures

    International Nuclear Information System (INIS)

    Rousset, P.

    2000-01-01

    Modifications are designed to contribute to improving the level of safety and the level of performance of the plants. The successful implementation of a modification depends on the quality of the design, particularly how well the modification fits into the existing design, on the consistency of the licensing approach, and on the good management of the project constraints and interfaces. Over the last ten years, FRAMATOME has accumulated a large and varied experience in these activities, as shown in the few examples briefly described in this presentation. Within the Mochovce completion project, FRAMATOME has been in charge of the basic design or implementation of about twenty safety measures. This involved establishing efficient partnerships with design organizations and suppliers in the Slovak Republic, Czech Republic, and Russia. A similar approach is used in the frame of the Kozloduy 5-6 modernization program, where the basic engineering contract has been under way since 1998 and the main contract is under discussion. From the very beginning of the TACIS program, FRAMATOME has been involved in projects dealing with the design of reactor safety systems. For example, extensive work aimed at developing a methodology for accident analysis in VVER reactors was initiated in 1993, in collaboration with Siemens, the Kurchatov Institute and OKB Gidropress. This methodology was recently used successfully in a new project whose objective was to evaluate possible modifications of the reactor protection signals of some Russian reactors. Within the PHARE program, a complete analysis of the primary-to-secondary leakage risk was conducted for the Paks nuclear power plant. This involved the writing and validation of the relevant emergency procedure, specification of the leak detection system, study of an improved design of the collector cover, and recommendation of some system modifications. A last example is the study of the modifications of main steam lines performed in the frame

  6. Measuring Instruments Control Methodology Performance for Analog Electronics Remote Labs

    Directory of Open Access Journals (Sweden)

    Unai Hernandez-Jayo

    2012-12-01

    Full Text Available This paper presents the work that has been developed in parallel to the VISIR project. Its objective is to present the results of the validation processes carried out to check the control methodology, which has been developed with the aim of being independent of the laboratories' instruments.

  7. Scanner image methodology (SIM) to measure dimensions of leaves ...

    African Journals Online (AJOL)

    A scanner image methodology was used to determine plant dimensions, such as leaf area, length and width. The values obtained using SIM were compared with those recorded by the LI-COR leaf area meter. Bias, linearity, reproducibility and repeatability (R&R) were evaluated for SIM. Different groups of leaves were ...

  8. How Much Can Non-industry Standard Measurement Methodologies Benefit Methane Reduction Programs?

    Science.gov (United States)

    Risk, D. A.; O'Connell, L.; Atherton, E.

    2017-12-01

    In recent years, energy sector methane emissions have been recorded in large part by applying modern non-industry-standard techniques. Industry may lack the regulatory flexibility to use such techniques, or in some cases may not understand the possible associated economic advantage. As progressive jurisdictions move from estimation towards routine measurement, the research community should provide guidance to help regulators and companies measure more effectively, and economically if possible. In this study, we outline a modelling experiment in which we explore the integration of non-industry-standard measurement techniques as part of a generalized compliance measurement program. The study was not intended to be exhaustive, or to recommend particular combinations, but instead to explore the inter-relationships between methodology, development type, and compliance practice. We first defined the role, applicable scale, detection limits, working distances, and approximate deployment cost of several measurement methodologies. We then considered a variety of development types differing mainly in footprint, density, and emissions "profile". Using a Monte Carlo approach, we evaluated the effect of these various factors on the cost and confidence of the compliance measurement program. We found that when added individually, some of the research techniques were indeed able to deliver an improvement in cost and/or confidence when used alongside industry-standard Optical Gas Imaging. When applied in combination, the ideal fraction of each measurement technique depended on development type, emission profile, and whether confidence or cost was more important. Results suggest that measurement cost and confidence could be improved if energy companies exploited a wider range of measurement techniques, and in a manner tailored to each development. In the short-term, combining clear scientific guidance with economic information could benefit immediate mitigation efforts over
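
    As a rough illustration of the kind of Monte Carlo experiment described, the sketch below trades off a cheap screening technique with a coarse detection limit against a costlier survey with a fine one. Detection limits, per-site costs and the lognormal leak-size distribution are all invented; the real study's parameters are not reproduced here.

```python
# Toy Monte Carlo comparison of leak-detection programs, in the spirit of the
# experiment described above. Detection limits, per-site costs and the
# lognormal leak-size distribution are all invented for illustration.
import random

def run_program(screening_fraction, n_sites=200, n_trials=1000):
    """Mean program cost and mean fraction of leaking sites detected."""
    rng = random.Random(42)              # fixed seed -> comparable programs
    total_cost = total_detected = 0.0
    for _ in range(n_trials):
        for _ in range(n_sites):
            leak = rng.lognormvariate(0.0, 1.5)      # leak rate, kg/h
            if rng.random() < screening_fraction:
                total_cost += 50.0                   # cheap vehicle screening
                total_detected += leak >= 2.0        # coarse detection limit
            else:
                total_cost += 400.0                  # detailed OGI-style survey
                total_detected += leak >= 0.05       # fine detection limit
    return total_cost / n_trials, total_detected / (n_trials * n_sites)

for frac in (0.0, 0.5, 1.0):
    cost, detected = run_program(frac)
    print(f"screening fraction {frac:.0%}: cost ${cost:,.0f}, detected {detected:.0%}")
```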

  9. Process improvement by cycle time reduction through Lean Methodology

    Science.gov (United States)

    Siva, R.; patan, Mahamed naveed khan; lakshmi pavan kumar, Mane; Purusothaman, M.; pitchai, S. Antony; Jegathish, Y.

    2017-05-01

    In the present world, every customer needs their products on time and with good quality, and every industry is striving to satisfy its customers' requirements. An aviation concern is trying to accomplish continuous improvement in all its projects. In this project, the maintenance service for the customer is analyzed. The maintenance part service is split into four levels: three are performed in service shops, while the fourth is the customer's privilege to change parts in their aircraft engines at their own location. An enhancement of electronic initial provisioning (eIP) is carried out for the fourth level. Customers request service shops to meet their requirements through a Recommended Spare Parts List (RSPL) generated by eIP. Completing this RSPL for one customer takes 61.5 hours of cycle time, which is very high. By mapping the current-state value stream map (VSM) and the takt time, a future-state improvement can be made to reduce the cycle time using Lean tools such as Poka-Yoke, Jidoka and 5S, and by eliminating Muda (waste).
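
    The 61.5 h figure quoted above can be put against a takt time, the comparison at the heart of a value-stream-mapping exercise. In the sketch below, only the 61.5 h cycle time comes from the abstract; the demand and available working hours are invented.

```python
# Takt time vs. cycle time, the basic comparison behind the VSM exercise
# described above. Demand and working hours are invented assumptions; the
# 61.5 h cycle time is the figure quoted in the abstract.
weekly_demand_rspls = 2                  # customer RSPL requests per week (assumed)
available_hours_per_week = 40.0          # one analyst, one shift (assumed)

takt_time_h = available_hours_per_week / weekly_demand_rspls
cycle_time_h = 61.5                      # current state, from the abstract

print(f"takt time  : {takt_time_h:.1f} h per RSPL")
print(f"cycle time : {cycle_time_h:.1f} h per RSPL")
print(f"shortfall  : process must shed {cycle_time_h - takt_time_h:.1f} h "
      f"of non-value-added time to meet demand")
```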

  10. Intravascular volume in cirrhosis. Reassessment using improved methodology

    International Nuclear Information System (INIS)

    Rector, W.G. Jr.; Ibarra, F.

    1988-01-01

    Previous studies of blood volume (BV) in cirrhosis have variously not adjusted BV properly for body size; determined plasma volume from the dilution of labeled albumin 10-20 min postinjection, when some extravascular redistribution has already occurred; and/or not used the correct whole body-to-peripheral hematocrit ratio (0.82) in calculating whole BV from plasma volume and the peripheral hematocrit. We measured BV with attention to these considerations in 19 patients with cirrhosis and reexamined the determinants of vascular volume and the relationship between vascular volume and sodium retention. BV was calculated as plasma volume (determined from the extrapolated plasma activity of intravenously injected [131I]albumin at time 0) divided by (1 - 0.82 x peripheral hematocrit). The result was expressed per kilogram dry body weight, determined by subtracting the mass of ascites (measured by isotope dilution; 1 liter = 1 kg) from the actual body weight of nonedematous patients. Measured and expressed in this way, BV correlated strongly with esophageal variceal size (r = 0.87, P less than 0.05), although not with net portal, right atrial, inferior vena caval, or arterial pressure, and was significantly greater in patients with sodium retention than in patients without sodium retention. The principal modifier of vascular volume in cirrhosis is vascular capacity, which is probably mainly determined by the extent of the portasystemic collateral circulation. The greater vascular volume in patients with sodium retention compared to patients without supports the overflow theory of ascites formation.
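
    A worked numeric version of the calculation just described, assuming (as the correction above does) that the whole-body hematocrit equals 0.82 times the peripheral value; every patient number below is invented.

```python
# Worked example of the blood-volume calculation described above.
# All patient values are invented for illustration.
injected_activity_kBq = 370.0        # [131I]albumin dose
plasma_kBq_per_L = 123.0             # plasma activity extrapolated to time zero
peripheral_hct = 0.42
actual_weight_kg = 78.0
ascites_kg = 6.0                     # isotope dilution, 1 L = 1 kg

plasma_volume_L = injected_activity_kBq / plasma_kBq_per_L
whole_body_hct = 0.82 * peripheral_hct          # whole-body/peripheral ratio
blood_volume_L = plasma_volume_L / (1.0 - whole_body_hct)
dry_weight_kg = actual_weight_kg - ascites_kg   # non-edematous weight

print(f"BV = {blood_volume_L:.2f} L "
      f"({blood_volume_L / dry_weight_kg * 1000:.0f} mL/kg dry weight)")
```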

  11. Residency Training: Quality improvement projects in neurology residency and fellowship: applying DMAIC methodology.

    Science.gov (United States)

    Kassardjian, Charles D; Williamson, Michelle L; van Buskirk, Dorothy J; Ernste, Floranne C; Hunderfund, Andrea N Leep

    2015-07-14

    Teaching quality improvement (QI) is a priority for residency and fellowship training programs. However, many medical trainees have had little exposure to QI methods. The purpose of this study is to review a rigorous and simple QI methodology (define, measure, analyze, improve, and control [DMAIC]) and demonstrate its use in a fellow-driven QI project aimed at reducing the number of delayed and canceled muscle biopsies at our institution. DMAIC was utilized. The project aim was to reduce the number of delayed muscle biopsies to 10% or less within 24 months. Baseline data were collected for 12 months. These data were analyzed to identify root causes for muscle biopsy delays and cancellations. Interventions were developed to address the most common root causes. Performance was then remeasured for 9 months. Baseline data were collected on 97 of 120 muscle biopsies during 2013. Twenty biopsies (20.6%) were delayed. The most common causes were scheduling too many tests on the same day and lack of fasting. Interventions aimed at patient education and biopsy scheduling were implemented. The effect was to reduce the number of delayed biopsies to 6.6% (6/91) over the next 9 months. Familiarity with QI methodologies such as DMAIC is helpful to ensure valid results and conclusions. Utilizing DMAIC, we were able to implement simple changes and significantly reduce the number of delayed muscle biopsies at our institution. © 2015 American Academy of Neurology.

  12. Improved capacitive melting curve measurements

    International Nuclear Information System (INIS)

    Sebedash, Alexander; Tuoriniemi, Juha; Pentti, Elias; Salmela, Anssi

    2009-01-01

    Sensitivity of the capacitive method for determining the melting pressure of helium can be enhanced by loading the empty side of the capacitor with helium at a pressure nearly equal to that to be measured and by using a relatively thin, flexible membrane in between. In this way one can achieve nanobar resolution at the level of 30 bar, which is two orders of magnitude better than that of the best gauges with vacuum reference. This extends the applicability of melting curve thermometry to lower temperatures and would allow the detection of tiny anomalies in the melting pressure, which must be associated with any phenomena contributing to the entropy of the liquid or solid phases. We demonstrated this principle in measurements of the crystallization pressure of isotopic helium mixtures at millikelvin temperatures by using partly solid pure ⁴He as the reference substance, providing the best possible universal reference pressure. The achieved sensitivity was good enough for melting curve thermometry on mixtures down to 100 μK. A similar system can be used on pure isotopes by virtue of a blocked capillary giving a stable reference condition with liquid slightly below the melting pressure in the reference volume. This was tested with pure ⁴He at temperatures 0.08-0.3 K. To avoid spurious heating effects, one must carefully choose and arrange any dielectric materials close to the active capacitor. We observed some 100 pW loading at moderate excitation voltages.

  13. Improvement of axial power distribution synthesis methodology in CPC

    International Nuclear Information System (INIS)

    Kim, H. H.; Gee, S. G.; Kim, Y. B.; In, W. K.

    2003-01-01

    The capability of axial power distribution synthesis in the CPC plays an important role in determining the DNBR and LPD trips initiated by the CPC. The axial power distribution is synthesized using a cubic spline function based on the three excore detector signals. The axial power distributions are categorized into 8 function sets, and each set is stored as pre-calculated values in the CPC to save calculation time. In this study, additional function sets, real break-point function sets and a polynomial function are suggested to evaluate the possibility of improving the synthesis capability of the CPC. In addition, the RMS errors are compared and evaluated for each synthesis method. As a result, it was confirmed that the function sets currently stored in the CPC are not optimal. The analysis showed that the RMS error could be reduced by selecting the proper function sets suggested in this study.
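
    To make the synthesis step concrete, here is a sketch of a three-point axial shape reconstruction with a natural cubic spline, normalised to unit core-average power. The detector positions, signals and 20-node mesh are invented, and this illustrates the general technique, not the actual CPC function-set scheme (scipy assumed available).

```python
# Illustrative three-point axial shape synthesis: a natural cubic spline
# through bottom/middle/top excore signals, normalised to unit average.
# Detector values and mesh are invented; this is not the actual CPC scheme.
import numpy as np
from scipy.interpolate import CubicSpline

core_height = 1.0                       # normalised axial coordinate
z_det = np.array([1/6, 3/6, 5/6])       # effective detector centres (assumed)
d = np.array([0.9, 1.2, 0.8])           # bottom / middle / top signals (assumed)

spline = CubicSpline(z_det, d, bc_type="natural")
z = np.linspace(0.0, core_height, 20)   # 20-node axial mesh
shape = np.clip(spline(z), 0.0, None)   # no negative extrapolated power
shape /= shape.mean()                   # normalise to unit core-average power

print(f"synthesised axial peaking factor Fz = {shape.max():.3f}")
```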

  14. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has recently been developed and adopted for small-break LOCA EEQ. This methodology is now extended to the M/E release analysis for containment design for large-break LOCA and the main steam line break (MSLB) accident, and is named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K, with an enhanced M/E model and a LOCA long-term model, to CONTEMPT4/MOD5. The KIMERA methodology is applied here to the MSLB M/E release analysis to evaluate its validity for MSLB in containment design. The results are compared with the OPR 1000 FSAR

  15. Quality Improvement Methodologies Increase Autologous Blood Product Administration

    Science.gov (United States)

    Hodge, Ashley B.; Preston, Thomas J.; Fitch, Jill A.; Harrison, Sheilah K.; Hersey, Diane K.; Nicol, Kathleen K.; Naguib, Aymen N.; McConnell, Patrick I.; Galantowicz, Mark

    2014-01-01

    Abstract: Whole blood from the heart–lung (bypass) machine may be processed through a cell salvaging device (i.e., cell saver [CS]) and subsequently administered to the patient during cardiac surgery. It was determined at our institution that CS volume was being discarded. A multidisciplinary team consisting of anesthesiologists, perfusionists, intensive care physicians, quality improvement (QI) professionals, and bedside nurses met to determine the challenges surrounding autologous blood delivery in its entirety. A review of cardiac surgery patients’ charts (n = 21) was conducted for analysis of CS waste. After identification of practices that were leading to CS waste, interventions were designed and implemented. Fishbone diagram, key driver diagram, Plan–Do–Study–Act (PDSA) cycles, and data collection forms were used throughout this QI process to track and guide progress regarding CS waste. Of patients under 6 kg (n = 5), 80% had wasted CS blood before interventions, whereas those patients larger than 36 kg (n = 8) had 25% wasted CS before interventions. Seventy-five percent of patients under 6 kg who had wasted CS blood received packed red blood cell transfusions in the cardiothoracic intensive care unit within 24 hours of their operation. After data collection and didactic education sessions (PDSA Cycle I), CS blood volume waste was reduced to 5% in all patients. Identification and analysis of the root cause followed by implementation of education, training, and management of change (PDSA Cycle II) resulted in successful use of 100% of all CS blood volume. PMID:24783313

  16. Changing methodology for measuring airborne radioactive discharges from nuclear facilities

    International Nuclear Information System (INIS)

    Glissmeyer, J.A.; Ligotke, M.W.

    1995-05-01

    The US Environmental Protection Agency (USEPA) requires that measurements of airborne radioactive discharges from nuclear facilities be performed following outdated methods contained in the American National Standards Institute (ANSI) N13.1-1969 Guide to Sampling Airborne Radioactive Materials in Nuclear Facilities. Improved methods are being introduced via two paths. First, the ANSI standard is being revised, and second, EPA's equivalency granting process is being used to implement new technology on a case-by-case or broad basis. The ANSI standard is being revised by a working group under the auspices of the Health Physics Society Standards Committee. The revised standard includes updated methods based on current technology and a performance-based approach to design. The performance-based standard will present new challenges, especially in the area of performance validation. Progress in revising the standard is discussed. The US Department of Energy recently received approval from the USEPA for an alternate approach to complying with air-sampling regulations. The alternate approach is similar to the revised ANSI standard. New design tools include new types of sample extraction probes and a model for estimating line losses for particles and radioiodine. Wind tunnel tests are being performed on various sample extraction probes for use at small stacks. The data show that single-point sampling probes are superior to ANSI N13.1-1969 style multiple-point sample extraction probes

  17. Proposition of Improved Methodology in Creep Life Extrapolation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Woo Gon; Park, Jae Young; Jang, Jin Sung [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    To design SFRs for a 60-year operation, it is desirable to have experimental creep-rupture data for Gr. 91 steel close to 20 y, or at least rupture lives significantly longer than 10⁵ h. This requirement arises from the fact that, for creep design, extrapolation by a factor of 3 in time is considered appropriate. However, obtaining experimental data close to 20 y would be expensive and would also take considerable time. Therefore, reliable creep life extrapolation techniques become necessary for a safe design life of 60 y. In addition, it is appropriate to obtain experimental long-term creep-rupture data in the range 10⁵-2×10⁵ h to improve the reliability of the extrapolation. In the present investigation, a new function of hyperbolic sine ('sinh') form for the master curve in time-temperature parameter (TTP) methods was proposed to accurately extrapolate the long-term creep rupture stress of Gr. 91 steel. The constant values used for each parametric equation were optimized on the basis of the creep rupture data. Average stress values predicted for up to 60 y were evaluated and compared with those of the French nuclear design code RCC-MRx. The results showed that the master curve of the 'sinh' form found wider acceptance, with good flexibility in the low stress ranges beyond the experimental data. It was clarified that the 'sinh' function is reasonable for creep life extrapolation compared with the polynomial forms that have been used conventionally until now.
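
    A sketch of the kind of fit the abstract proposes: a sinh-form master curve over a Larson-Miller time-temperature parameter, then numerical inversion to extrapolate a long-life rupture stress. The rupture data, the Larson-Miller constant, the exact functional form and the initial guesses are all invented stand-ins, not the paper's fitted values (scipy assumed available).

```python
# Sketch of a sinh-form master-curve fit over a Larson-Miller time-temperature
# parameter, with numerical inversion to extrapolate a long-life rupture
# stress. All data, constants and the functional form are invented stand-ins.
import numpy as np
from scipy.optimize import curve_fit

# invented creep-rupture data: temperature (K), rupture time (h), stress (MPa)
T  = np.array([823.0, 823.0, 873.0, 873.0, 923.0, 923.0])
tr = np.array([2.0e3, 6.0e4, 1.0e3, 3.0e4, 5.0e2, 1.0e4])
s  = np.array([160.0, 110.0, 120.0,  80.0,  90.0,  55.0])

C = 20.0                                  # Larson-Miller constant (assumed)
lmp = T * (C + np.log10(tr))              # time-temperature parameter

def master(sigma, a, b, c):
    """Sinh-form master curve: LMP = a + b*ln(sinh(c*sigma))."""
    return a + b * np.log(np.sinh(c * sigma))

popt, _ = curve_fit(master, s, lmp, p0=(25000.0, -2000.0, 0.01),
                    bounds=([0.0, -np.inf, 1e-4], [np.inf, 0.0, 0.1]))

# invert numerically: the stress whose predicted LMP matches a 60-y life at 823 K
target = 823.0 * (C + np.log10(60 * 365.25 * 24))
grid = np.linspace(20.0, 200.0, 2000)
sigma_60y = grid[np.argmin(np.abs(master(grid, *popt) - target))]
print(f"extrapolated 60-y rupture stress at 823 K ~ {sigma_60y:.0f} MPa")
```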

  19. Achieving Improvement Through Nursing Home Quality Measurement

    OpenAIRE

    Harris, Yael; Clauser, Steven B.

    2002-01-01

    CMS has initiated the Nursing Home Quality Initiative (NHQI) to improve the quality of nursing home care. Central to the NHQI is the public reporting of nursing home quality measures that serve as the basis for the Initiative's communication and quality improvement program. This article provides an overview of the NHQI, focusing on the role of nursing home quality measures in achieving improvements in nursing home care. We also describe the evolution of quality measurement in nursing homes, a...

  20. Methodology development for availability improvement of standby equipment

    International Nuclear Information System (INIS)

    Shin, Sung Min; Jeon, In Seop; Kang, Hyun Gook

    2014-01-01

    The core damage frequency (CDF) of operating pressurized nuclear plants and of those under construction ranges on the order of 10⁻⁵ and 10⁻⁶ per year, respectively. The target CDF for new NPP designs has been set at 10⁻⁷. In this context, although various systems are currently being studied, improving the availability of standby equipment will be more efficient than adding further safety systems; this holds in every aspect, such as management and cost efficiency. Here, soundness affects equipment unavailability, and soundness degrades because of aging; nevertheless, some studies have not considered aging when calculating unavailability. Standby equipment ages because of two important factors: standby stress, which accumulates over time, and test stress, which accumulates with the number of tests (or operations). Both factors should be considered together, yet some studies have considered only standby stress or only test stress, while a few previous studies have considered both. Besides equipment soundness related to aging, processes such as bypassing the equipment during testing can also affect unavailability, because the equipment cannot immediately perform its original function during such periods. However, few studies deal with the above factors as a whole. This study investigated a general approach to calculating the unavailability of standby equipment that considers aging caused by standby and test stresses as well as the bypass process. Based on this general approach, we propose two maintenance strategies that aim to reduce standby equipment unavailability. In section 2, the general approach is presented. As one of the strategies, the changing test interval method (CIM) is introduced in section 3, and its effectiveness is analyzed. The online monitoring method (OMM) is investigated in section 4 as another method to reduce equipment unavailability. In section 5, a combination of these two methods is analyzed. A general
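
    A minimal numeric sketch of the kind of model such an approach implies: a per-demand failure probability, a standby failure rate that grows with accumulated standby hours and with the number of tests, and a bypass window during each test. All rates, durations and the functional form below are invented for illustration, not taken from the study.

```python
# Minimal numeric sketch of standby-equipment unavailability with aging from
# standby stress and test stress, plus a bypass term during testing.
# All rates and durations are invented for illustration.
def mean_unavailability(test_interval_h, horizon_h,
                        rho=1e-3,              # per-demand failure probability
                        lam0=1e-6,             # base standby failure rate (/h)
                        standby_aging=1e-6,    # relative rate growth per standby hour
                        test_aging=5e-4,       # relative rate growth per test
                        bypass_h=2.0):         # hours bypassed per test
    n_tests = int(horizon_h // test_interval_h)
    total_downtime = 0.0
    for k in range(n_tests):
        t_standby = k * test_interval_h
        lam = lam0 * (1.0 + standby_aging * t_standby + test_aging * k)
        # average fractional downtime over this interval:
        #   demand failures + undetected standby failures + bypass window
        frac = rho + lam * test_interval_h / 2.0 + bypass_h / test_interval_h
        total_downtime += frac * test_interval_h
    return total_downtime / horizon_h

for T in (730.0, 2190.0, 4380.0):    # roughly monthly, quarterly, semi-annual
    u = mean_unavailability(T, horizon_h=87600.0)   # 10-year horizon
    print(f"test interval {T:5.0f} h -> mean unavailability {u:.2e}")
```

    Short intervals pay a bypass and test-aging penalty; long intervals pay an undetected-failure penalty, which is exactly the trade-off a changing-test-interval strategy would exploit.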

  1. A Multi-Methodology for improving Adelaide's Groundwater Management

    Science.gov (United States)

    Batelaan, Okke; Banks, Eddie; Batlle-Aguilar, Jordi; Breciani, Etienne; Cook, Peter; Cranswick, Roger; Smith, Stan; Turnadge, Chris; Partington, Daniel; Post, Vincent; Pool Ramirez, Maria; Werner, Adrian; Xie, Yueqing; Yang, Yuting

    2015-04-01

    Groundwater is a strategic and vital resource in South Australia playing a crucial role in sustaining a healthy environment, as well as supporting industries and economic development. In the Adelaide metropolitan region ten different aquifer units have been identified, extending to more than 500 m below sea level. Although salinity within most of these aquifers is variable, water suitable for commercial, irrigation and/or potable use is predominantly found in the deeper Tertiary aquifers. Groundwater currently contributes only 9000 ML/yr of Adelaide's total water consumption of 216,000 ML, while in the Northern Adelaide Plains 17000 ML/yr is used. However, major industries, market gardeners, golf courses, and local councils are highly dependent on this resource. Despite recent rapid expansion in managed aquifer recharge, and the potential for increased extraction of groundwater, particularly for the commercial and irrigation supplies, little is known about the sources and ages of Adelaide's groundwater. The aim of this study is therefore to provide a robust conceptualisation of Adelaide's groundwater system. The study focuses on three important knowledge gaps: 1. Does groundwater flow from the Adelaide Hills into the sedimentary aquifers on the plains? 2. What is the potential for encroachment of seawater if groundwater extraction increases? 3. How isolated are the different aquifers, or does water leak from one to the other? A multi-tool approach has been used to improve the conceptual understanding of groundwater flow processes; including the installation of new groundwater monitoring wells from the hills to the coast, an extensive groundwater sampling campaign of new and existing groundwater wells for chemistry and environmental tracers analysis, and development of a regional scale numerical model rigorously tested under different scenario conditions. The model allows quantification of otherwise hardly quantifiable quantities such as flow across fault zones and

  2. VAR Methodology Used for Exchange Risk Measurement and Prevention

    Directory of Open Access Journals (Sweden)

    Florentina Balu

    2006-05-01

    Full Text Available In this article we discuss one of the modern risk measuring techniques, Value-at-Risk (VaR). Currently, central banks in major money centers, under the auspices of the BIS Basle Committee, adopt the VaR system to evaluate the market risk of their supervised banks. Bank regulators ask all commercial banks to report VaRs with their internal models. Value at risk (VaR) is a powerful tool for assessing market risk, but it also imposes a challenge. Its power is its generality. Unlike market risk metrics such as the Greeks, duration and convexity, or beta, which are applicable to only certain asset categories or certain sources of market risk, VaR is general: it is based on the probability distribution of a portfolio’s market value. VaR calculates the maximum loss expected (the worst-case scenario) on an investment over a given time period and at a specified degree of confidence. There are three methods by which VaR can be calculated: historical simulation, the variance-covariance method and Monte Carlo simulation. The variance-covariance method is the easiest because only two factors need to be estimated: average return and standard deviation. However, it assumes that returns are well-behaved according to the symmetrical normal curve and that historical patterns will repeat in the future. Historical simulation improves on the accuracy of the VaR calculation but requires more computational data; it also assumes that “past is prologue”. Monte Carlo simulation is complex, but it has the advantage of allowing users to tailor ideas about future patterns that depart from historical ones.
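
    The two simpler methods contrasted above fit in a few lines; the sketch below computes both a parametric (variance-covariance) and a historical-simulation one-day VaR. The return series, portfolio value and confidence level are invented.

```python
# Variance-covariance vs. historical-simulation VaR, the two simpler methods
# contrasted above. The return series and portfolio value are invented.
from statistics import NormalDist, mean, stdev

daily_returns = [0.004, -0.012, 0.007, -0.003, 0.010, -0.021, 0.002,
                 -0.006, 0.013, -0.009, 0.001, -0.015, 0.008, -0.002]
portfolio_value = 1_000_000.0
confidence = 0.95

# Variance-covariance: assumes normally distributed returns.
z = NormalDist().inv_cdf(1.0 - confidence)          # ~ -1.645 at 95 %
var_parametric = -(mean(daily_returns) + z * stdev(daily_returns)) * portfolio_value

# Historical simulation: crude empirical loss quantile from the sample itself.
losses = sorted(-r * portfolio_value for r in daily_returns)
var_historical = losses[int(confidence * len(losses))]

print(f"1-day 95% VaR (parametric) : {var_parametric:,.0f}")
print(f"1-day 95% VaR (historical) : {var_historical:,.0f}")
```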

  4. Measure a carbon impact methodology in line with a 2 degree scenario

    International Nuclear Information System (INIS)

    Coeslier, Manuel; Finidori, Esther; Smia, Ladislas

    2015-11-01

    the portfolio level while addressing instances of double-counting. By adopting a life-cycle vision that accounts for both induced and avoided emissions, the CIA methodology is a reliable tool for measuring the contribution of investments to the issues surrounding the energy transition. Once the diagnostics are made public, financial players will face increasing pressure to improve their carbon performance. Accordingly, in the long term this measure can influence greater action in low-carbon investment strategies. (authors)

  5. THE ASSESSMENT METHODOLOGIES PTELR, ADRI AND CAE – THREE METHODOLOGIES FOR COORDINATING THE EFFORTS TO IMPROVE THE ORGANIZATIONAL PROCESSES TO ACHIEVE EXCELLENCE

    OpenAIRE

    Cristina Raluca POPESCU; Gheorghe N. POPESCU

    2015-01-01

    In the paper “The Assessment Methodologies PTELR, ADRI and CAE – Three Methodologies for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence” the authors present the basic features of the assessment methodologies PTELR, ADRI and CAE that are designed to coordinate the efforts to improve the organizational processes in order to achieve excellence. In the first part of the paper (the introduction of the paper), the authors present the general background concer...

  6. Does Deregulation of Quality Standards in Telecomunications Improve Social Welfare? A Methodological Note

    OpenAIRE

    Felipe Morandé

    1990-01-01

    One of the main reasons behind the big difference observed in the per capita number of telephones between developed and developing countries is the high capital cost (capital being a scarce resource in LDCs) of expanding telecommunications infrastructure. A reasonable question to raise in this context is the extent to which that high capital cost of investment could be diminished if international quali...

  7. Using Quality Tools and Methodologies to Improve a Hospital's Quality Position.

    Science.gov (United States)

    Branco, Daniel; Wicks, Angela M; Visich, John K

    2017-01-01

    The authors identify the quality tools and methodologies most frequently used by quality-positioned hospitals versus nonquality hospitals. Northeastern U.S. hospitals in both groups received a brief, 12-question survey. The authors found that 93.75% of the quality hospitals and 81.25% of the nonquality hospitals used some form of process improvement methodologies. However, there were significant differences between the groups regarding the impact of quality improvement initiatives on patients. The findings indicate that in quality hospitals the use of quality improvement initiatives had a significantly greater positive impact on patient satisfaction and patient outcomes when compared to nonquality hospitals.

  8. Proposed methodology for estimating the impact of highway improvements on urban air pollution.

    Science.gov (United States)

    1971-01-01

    The aim of this methodology is to indicate the expected change in ambient air quality in the vicinity of a highway improvement and in the total background level of urban air pollution resulting from the highway improvement. Both the jurisdiction in w...

  9. New methodology of measurement the unsteady thermal cooling of objects

    Science.gov (United States)

    Winczek, Jerzy

    2018-04-01

    The problems of measuring unsteady thermal turbulent flow affect many domains, such as heat energy and manufacturing technologies, among others. The study focuses on the analysis of the current state of the problem; an overview of design solutions and methods for measuring non-stationary thermal phenomena; the presentation and choice of an adequate cylinder design; and the development of a method to measure and calculate the basic values that characterize heat exchange on the model surface.

  10. The 5S methodology as a tool for improving the organisation

    OpenAIRE

    J. Michalska; D. Szewieczek

    2007-01-01

    Purpose: The aim of this paper is to present the 5S methodology and the way of implementing it in a company.Design/methodology/approach: Within the frame of our own research, the 5S rules were analysed and implemented in the production process.Findings: On the basis of our own research it can be stated that introducing the 5S rules brings great changes in the company, for example: process improvement through cost reduction, increasing of effectivene...

  11. Covariance methodology applied to 35S disintegration rate measurements by the CIEMAT/NIST method

    International Nuclear Information System (INIS)

    Koskinas, M.F.; Nascimento, T.S.; Yamazaki, I.M.; Dias, M.S.

    2014-01-01

    The Nuclear Metrology Laboratory (LMN) at IPEN is carrying out measurements in an LSC (Liquid Scintillation Counting) system, applying the CIEMAT/NIST method. In this context, ³⁵S is an important radionuclide for medical applications, and it is difficult to standardize by other primary methods due to its low beta-ray energy. The CIEMAT/NIST method is a standard technique used by most metrology laboratories to improve accuracy and speed up beta-emitter standardization. The focus of the present work was to apply the covariance methodology to determine the overall uncertainty in the ³⁵S disintegration rate. All partial uncertainties involved in the measurements were considered, taking into account all possible correlations between each pair of them. - Highlights: ► ³⁵S disintegration rate measured in a liquid scintillation system using the CIEMAT/NIST method. ► Covariance methodology applied to the overall uncertainty in the ³⁵S disintegration rate. ► Monte Carlo simulation was applied to determine ³⁵S activity in the 4πβ(PC)-γ coincidence system
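
    For a scalar result, the covariance methodology reduces to u² = JᵀCJ, with J the vector of partial derivatives and C the covariance matrix of the inputs. The toy example below propagates two correlated inputs (counts and detection efficiency, with live time treated as exact) into an activity uncertainty; every number is invented and the model is a generic counting equation, not the CIEMAT/NIST formalism itself.

```python
# Toy version of the covariance methodology described above: propagate two
# correlated input uncertainties into the disintegration-rate uncertainty
# via u^2 = J^T C J. All numbers are invented.
import numpy as np

# D = counts / (efficiency * live_time); live time treated as exact here
counts, eff, live_t = 1.250e5, 0.823, 600.0     # invented measured values
D = counts / (eff * live_t)

J = np.array([1.0 / (eff * live_t),             # dD/dcounts
              -counts / (eff**2 * live_t)])     # dD/defficiency

u_counts, u_eff, corr = 360.0, 0.004, 0.3       # uncertainties + correlation
C = np.array([[u_counts**2,             corr * u_counts * u_eff],
              [corr * u_counts * u_eff, u_eff**2]])

u_D = float(np.sqrt(J @ C @ J))
print(f"D = {D:.2f} +/- {u_D:.2f} Bq  ({100 * u_D / D:.2f} %)")
```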

  12. Improving the Understanding of Research Methodology and Self-Regulated Learning Through Blog Project

    OpenAIRE

    Retnawati, Heri

    2017-01-01

    This classroom action research seeks to improve self-regulated learning (SRL) and understanding of research methodology at the graduate school level. Nineteen graduate school students were involved. Using project-based learning (PjBL), students were assigned to create online blogs as the main project. The blog was intended to represent their understanding of research methodology through writing reviews of research articles and submitting a research proposal. The classroom action research was based ...

  13. Presentation of a methodology for measuring social acceptance of three hydrogen storage technologies and preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Noirot, I.; Bigay, C. N.

    2005-07-01

    Technologies (MASIT). This methodology takes into account the following points of view: technical, economic, environmental, social, and industrial/technological risk. MASIT is the methodology chosen to assess the hydrogen storage technologies developed during the StorHy project. Within the methodology, each point of view is defined by several criteria selected with car manufacturers and experts of each domain; each criterion is then quantified with the contribution of all partners involved in the project. While the technical, economic and environmental criteria are quite objective (easy to define and measure), the social dimension is subjective and also highly variable, as it depends on perception and measurement at the individual human level. The methodological work therefore consists in improving the MASIT methodology from the social point of view. This methodology is applicable to the comparison of any other technologies, and it has been implemented here to compare the storage technologies developed in the StorHy project for each application selected in the study (light vehicles, fleet vehicles, buses). (Author)

  14. Measurement of testosterone in human sexuality research: methodological considerations.

    Science.gov (United States)

    van Anders, Sari M; Goldey, Katherine L; Bell, Sarah N

    2014-02-01

    Testosterone (T) and other androgens are incorporated into an increasingly wide array of human sexuality research, but there are a number of issues that can affect or confound research outcomes. This review addresses various methodological issues relevant to research design in human studies with T; unaddressed, these issues may introduce unwanted noise, error, or conceptual barriers to interpreting results. Topics covered are (1) social and demographic factors (gender and sex; sexual orientations and sexual diversity; social/familial connections and processes; social location variables), (2) biological rhythms (diurnal variation; seasonality; menstrual cycles; aging and menopause), (3) sample collection, handling, and storage (saliva vs. blood; sialogogues, saliva, and tubes; sampling frequency, timing, and context; shipping samples), (4) health, medical issues, and the body (hormonal contraceptives; medications and nicotine; health conditions and stress; body composition, weight, and exercise), and (5) incorporating multiple hormones. Detailing a comprehensive set of important issues and relevant empirical evidence, this review provides a starting point for best practices in human sexuality research with T and other androgens that may be especially useful for those new to hormone research.

  15. Measurement methodology of natural radioactivity in the thermal establishments

    International Nuclear Information System (INIS)

    Ameon, R.; Robe, M.C.

    2004-11-01

    Thermal baths have been identified as an activity liable to expose workers to ionizing radiation from natural sources of radon and radon-220. New regulation obliges these facilities to carry out radioactivity measurements. The principal exposure pathways are inhalation of radon and its daughters, exposure to gamma radiation, and ingestion of radioelements in thermal waters. I.R.S.N. proposes two methods for measuring natural radioactivity in application of the regulation on the protection of persons and workers. Some principles for reducing exposure to radon are recalled. (N.C.)

  16. Measuring collections effort improves cash performance.

    Science.gov (United States)

    Shutts, Joe

    2009-09-01

    Having a satisfied work force can lead to an improved collections effort. Hiring the right people and training them ensures employee engagement. Measuring collections effort and offering incentives is key to revenue cycle success.

  17. Measurement and verification of low income energy efficiency programs in Brazil: Methodological challenges

    Energy Technology Data Exchange (ETDEWEB)

    Martino Jannuzzi, Gilberto De; Rodrigues da Silva, Ana Lucia; Melo, Conrado Augustus de; Paccola, Jose Angelo; Dourado Maia Gomes, Rodolfo (State Univ. of Campinas, International Energy Initiative (Brazil))

    2009-07-01

    Electric utilities in Brazil are investing about 80 million dollars annually in low-income energy efficiency programs, about half of their total compulsory investments in end-use efficiency programs under current regulation. Since 2007 the regulator has enforced the need to provide evaluation plans for the programs delivered. This paper presents the measurement and verification (M&V) methodology that has been developed to accommodate the characteristics of lighting and refrigerator programs that have been introduced in Brazilian urban and peri-urban slums. A combination of household surveys, end-use measurements and metering at the transformer and grid levels was performed before and after program implementation. The methodology has to accommodate the dynamics, housing, electrical wiring and connections of the population, as well as their ability to pay for electricity and to participate in the program. Results obtained in slums in Rio de Janeiro are presented. Impacts of the programs were evaluated in energy terms for households and utilities. Feedback from the evaluations also permitted improvement in the design of new programs for low-income households.
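    A toy sketch of the basic before/after M&V arithmetic underlying such an evaluation; the household consumption figures are invented for illustration.

```python
# Hypothetical before/after savings estimate for a lighting/refrigerator
# retrofit programme; values are illustrative only.

baseline_kwh = [85.0, 92.0, 110.0, 74.0]   # monthly use before, per household
reporting_kwh = [61.0, 70.0, 88.0, 60.0]   # monthly use after retrofit

savings = [b - r for b, r in zip(baseline_kwh, reporting_kwh)]
mean_saving = sum(savings) / len(savings)
pct = 100.0 * sum(savings) / sum(baseline_kwh)
print(f"mean saving: {mean_saving:.1f} kWh/month per household ({pct:.1f}%)")
```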

  18. Laser scattering methodology for measuring particulates in the air

    Directory of Open Access Journals (Sweden)

    Carlo Giglioni

    2009-03-01

    Full Text Available A description is given of the laser scattering method for measuring PM10, PM2.5 and PM1 dusts in confined environments (museums, libraries, archives, art galleries, etc.). Such equipment presents many advantages, in comparison with that currently in use, not only from an analytic but also from a functional point of view.

  19. Sensible organizations: technology and methodology for automatically measuring organizational behavior.

    Science.gov (United States)

    Olguin Olguin, Daniel; Waber, Benjamin N; Kim, Taemie; Mohan, Akshay; Ara, Koji; Pentland, Alex

    2009-02-01

    We present the design, implementation, and deployment of a wearable computing platform for measuring and analyzing human behavior in organizational settings. We propose the use of wearable electronic badges capable of automatically measuring the amount of face-to-face interaction, conversational time, physical proximity to other people, and physical activity levels in order to capture individual and collective patterns of behavior. Our goal is to be able to understand how patterns of behavior shape individuals and organizations. By using on-body sensors in large groups of people for extended periods of time in naturalistic settings, we have been able to identify, measure, and quantify social interactions, group behavior, and organizational dynamics. We deployed this wearable computing platform in a group of 22 employees working in a real organization over a period of one month. Using these automatic measurements, we were able to predict employees' self-assessments of job satisfaction and their own perceptions of group interaction quality by combining data collected with our platform and e-mail communication data. In particular, the total amount of communication was predictive of both of these assessments, and betweenness in the social network exhibited a high negative correlation with group interaction satisfaction. We also found that physical proximity and e-mail exchange had a negative correlation of r = -0.55 (p < 0.01), which has far-reaching implications for past and future research on social networks.
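    The following sketch, assuming hypothetical interaction edges and satisfaction scores, illustrates the type of analysis reported: computing betweenness centrality from an interaction network and correlating it with a satisfaction measure.

```python
import networkx as nx
import numpy as np

# Build a small interaction graph (edges and scores are invented).
G = nx.Graph()
G.add_edges_from([("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"),
                  ("d", "e"), ("e", "f"), ("c", "e")])

betweenness = nx.betweenness_centrality(G)

# Hypothetical self-reported group-interaction satisfaction per employee.
satisfaction = {"a": 4.1, "b": 4.3, "c": 2.8, "d": 3.2, "e": 3.0, "f": 4.0}

nodes = sorted(G.nodes)
r = np.corrcoef([betweenness[n] for n in nodes],
                [satisfaction[n] for n in nodes])[0, 1]
print(f"Pearson r between betweenness and satisfaction: {r:.2f}")
```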

  20. Methodological Issues in Measures of Imitative Reaction Times

    Science.gov (United States)

    Aicken, Michael D.; Wilson, Andrew D.; Williams, Justin H. G.; Mon-Williams, Mark

    2007-01-01

    Ideomotor (IM) theory suggests that observing someone else perform an action activates an internal motor representation of that behaviour within the observer. Evidence supporting the case for an ideomotor theory of imitation has come from studies that show imitative responses to be faster than the same behavioural measures performed in response to…

  1. Investigation of an Error Theory for Conjoint Measurement Methodology.

    Science.gov (United States)

    1983-05-01

    (Nygren, 1982; Srinivasan and Shocker, 1973a, 1973b; Ullrich and Cummins, 1973; Takane, Young, and de Leeuw, 1980; Young, 1972) ... procedures as a diagnostic tool. Specifically, they used the computed STRESS value and a measure of fit they called PRECAP that could be obtained

  2. Flux Measurements in Trees: Methodological Approach and Application to Vineyards

    Directory of Open Access Journals (Sweden)

    Francesca De Lorenzi

    2008-03-01

    Full Text Available In this paper a review of two sap flow methods for measuring transpiration in vineyards is presented. The objective of this work is to examine the potential of detecting transpiration in trees in response to environmental stresses, particularly high concentrations of ozone (O3) in the troposphere. The methods described are the stem heat balance and the thermal dissipation probe; advantages and disadvantages of each method are detailed. Applications of both techniques are shown in two large commercial vineyards in Southern Italy (Apulia and Sicily), subject to a semi-arid climate. Sap flow techniques make it possible to measure transpiration at the plant scale, and an upscaling procedure is necessary to calculate transpiration at the whole-stand level. Here a general technique to link the value of transpiration at the plant level to the canopy value is presented, based on experimental relationships between transpiration and biometric characteristics of the trees. In both vineyards, transpiration measured by sap flow methods compares well with evapotranspiration measured by micrometeorological techniques at the canopy scale. Moreover, the soil evaporation component has been quantified. In conclusion, comments about the suitability of sap flow methods for studying the interactions between trees and ozone are given.
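    A minimal sketch of the upscaling step described above, assuming a simple linear relationship between per-plant sap flow and leaf area; all measurements and coefficients are illustrative.

```python
import numpy as np

# Per-vine measurements (invented values).
leaf_area = np.array([2.1, 3.4, 1.8, 2.9])   # m^2 of leaf per sampled vine
sap_flow = np.array([1.9, 3.1, 1.6, 2.7])    # kg/day measured per vine

# Fit transpiration per unit leaf area (regression through the origin).
k = (leaf_area @ sap_flow) / (leaf_area @ leaf_area)   # kg/day per m^2 leaf

# Scale to the stand using an assumed leaf area index.
stand_leaf_area_index = 1.5   # m^2 leaf per m^2 ground (assumed)
stand_transpiration = k * stand_leaf_area_index  # kg/day/m^2 ground ~= mm/day
print(f"stand transpiration: {stand_transpiration:.2f} mm/day")
```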

  3. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    Science.gov (United States)

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
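    A minimal sketch of the control-chart analysis described above, using an individuals (XmR) chart with limits at plus/minus 2.66 times the mean moving range; the observations are invented, not the study's data.

```python
import numpy as np

# Illustrative ED length-of-stay observations in time order (minutes).
x = np.array([182, 175, 190, 168, 171, 160, 155, 149, 152, 147])

moving_range = np.abs(np.diff(x))
center = x.mean()
sigma_hat = moving_range.mean()          # average moving range (mR-bar)
ucl = center + 2.66 * sigma_hat          # upper control limit
lcl = center - 2.66 * sigma_hat          # lower control limit

# Flag points outside the control limits, preserving the time sequence.
for i, v in enumerate(x):
    flag = " <- signal" if (v > ucl or v < lcl) else ""
    print(f"obs {i+1:2d}: {v} (CL={center:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}){flag}")
```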

  4. New scoring methodology improves the sensitivity of the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) in clinical trials.

    Science.gov (United States)

    Verma, Nishant; Beretvas, S Natasha; Pascual, Belen; Masdeu, Joseph C; Markey, Mia K

    2015-11-12

    As currently used, the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) has low sensitivity for measuring Alzheimer's disease progression in clinical trials. A major reason behind the low sensitivity is its sub-optimal scoring methodology, which can be improved to obtain better sensitivity. Using item response theory, we developed a new scoring methodology (ADAS-CogIRT) for the ADAS-Cog, which addresses several major limitations of the current scoring methodology. The sensitivity of the ADAS-CogIRT methodology was evaluated using clinical trial simulations as well as a negative clinical trial, which had shown an evidence of a treatment effect. The ADAS-Cog was found to measure impairment in three cognitive domains of memory, language, and praxis. The ADAS-CogIRT methodology required significantly fewer patients and shorter trial durations as compared to the current scoring methodology when both were evaluated in simulated clinical trials. When validated on data from a real clinical trial, the ADAS-CogIRT methodology had higher sensitivity than the current scoring methodology in detecting the treatment effect. The proposed scoring methodology significantly improves the sensitivity of the ADAS-Cog in measuring progression of cognitive impairment in clinical trials focused in the mild-to-moderate Alzheimer's disease stage. This provides a boost to the efficiency of clinical trials requiring fewer patients and shorter durations for investigating disease-modifying treatments.
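    To illustrate the difference between simple sum scoring and IRT-based scoring, the sketch below estimates a latent trait from item responses under a two-parameter logistic model; the item parameters and responses are hypothetical, not the ADAS-CogIRT calibration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical 2PL item parameters and a patient's item responses.
discrimination = np.array([1.2, 0.8, 1.5, 1.0])  # a_i
difficulty = np.array([-0.5, 0.0, 0.7, 1.2])     # b_i
responses = np.array([1, 1, 0, 0])               # 1 = intact, 0 = impaired

def neg_log_likelihood(theta):
    # P(response = 1) under the two-parameter logistic model.
    p = 1.0 / (1.0 + np.exp(-discrimination * (theta - difficulty)))
    return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

theta_hat = minimize_scalar(neg_log_likelihood, bounds=(-4, 4), method="bounded").x
print(f"estimated latent trait: {theta_hat:.2f}")
```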

  5. Methodological proposal for the definition of improvement strategies in logistics of SME

    Directory of Open Access Journals (Sweden)

    Yeimy Liseth Becerra

    2014-12-01

    Full Text Available A methodological proposal for defining improvement strategies in the logistics of SMEs is presented as a means of fulfilling a specific objective of the project "Methodological design on storage logistics, acquisition, and appropriation of information and communication systems for Colombian SMEs, bakery subsector", currently run by the SEPRO research group of the Universidad Nacional de Colombia and supported by Colciencias. The work corresponds to the completion of the last stage of the base project and aims to fulfil the corresponding objective raised in the research project being developed by the SEPRO group. To do this, a review was made of the methodology used during the execution of the base project, as well as of the state of the art of techniques used in similar research for the evaluation and definition of improvement strategies in SME logistics. The techniques reviewed were compared, and a proposed methodology was configured consisting of the techniques that offered the greatest advantages for the development of the research.

  6. Technological measures to improve automotive product quality

    OpenAIRE

    Gladkov, V.; Kruglov, S.

    2010-01-01

    The paper examines the basic technological measures aimed at improving product quality in the automotive industry. While paying due attention to solving organizational and technological problems, including the development of certification systems for production processes, it is also necessary to improve the technical standards of specific technologies, equipment and materials, as they largely determine product quality. Special emphasis is given to the importance of improving the production of auto...

  7. The relevance of segments reports – measurement methodology

    Directory of Open Access Journals (Sweden)

    Tomasz Zimnicki

    2017-09-01

    Full Text Available The segment report is one of the areas of financial statements, and it obliges a company to provide information about the economic situation in each of its activity areas. The article evaluates the change of segment reporting standards from IAS14R to IFRS8 in the context of relevance. It presents the construction of a measure which allows the relevance of segment disclosures to be determined. The created measure was used to study periodical reports published by companies listed on the main market of the Warsaw Stock Exchange from three reporting periods – 2008, 2009 and 2013. Based on the research results, it was found that the change of segment reporting standards from IAS14R to IFRS8 in the context of relevance was legitimate.

  8. Improving hospital discharge time: a successful implementation of Six Sigma methodology.

    Science.gov (United States)

    El-Eid, Ghada R; Kaddoum, Roland; Tamim, Hani; Hitti, Eveline A

    2015-03-01

    Delays in discharging patients can impact hospital and emergency department (ED) throughput. The discharge process is complex and involves setting-specific challenges that limit the generalizability of solutions. The aim of this study was to assess the effectiveness of using Six Sigma methods to improve the patient discharge process. This is a quantitative pre- and post-intervention study, set in a 386-bed tertiary care hospital, with a series of Six Sigma driven interventions over a 10-month period. The primary outcome was discharge time (time from discharge order to patient leaving the room). Secondary outcome measures included the percentage of patients whose discharge order was written before noon, the percentage of patients leaving the room by noon, hospital length of stay (LOS), and LOS of admitted ED patients. Discharge time decreased by 22.7% from 2.2 hours during the pre-intervention period to 1.7 hours post-intervention (P < 0.001). Six Sigma methodology can be an effective change management tool to improve discharge time. The focus of institutions aspiring to tackle delays in the discharge process should be on adopting the core principles of Six Sigma rather than specific interventions that may be institution-specific.

  9. Clinical audit, a valuable tool to improve quality of care: General methodology and applications in nephrology

    Science.gov (United States)

    Esposito, Pasquale; Dal Canton, Antonio

    2014-01-01

    Evaluation and improvement of the quality of care provided to patients are of crucial importance in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment and clinical audit. The clinical audit consists of measuring a clinical outcome or a process against well-defined standards, set on the principles of evidence-based medicine, in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, overall the present evidence is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principles of clinical audit, focusing on experiences performed in nephrology settings. PMID:25374819

  10. Development of an Improved Methodology to Assess Potential Unconventional Gas Resources

    International Nuclear Information System (INIS)

    Salazar, Jesus; McVay, Duane A.; Lee, W. John

    2010-01-01

    Considering the important role played today by unconventional gas resources in North America and their enormous potential for the future around the world, it is vital to both policy makers and industry that the volumes of these resources and the impact of technology on these resources be assessed. To provide for optimal decision making regarding energy policy, research funding, and resource development, it is necessary to reliably quantify the uncertainty in these resource assessments. Since the 1970s, studies to assess potential unconventional gas resources have been conducted by various private and governmental agencies, the most rigorous of which was by the United States Geological Survey (USGS). The USGS employed a cell-based, probabilistic methodology which used analytical equations to calculate distributions of the resources assessed. USGS assessments have generally produced distributions for potential unconventional gas resources that, in our judgment, are unrealistically narrow for what are essentially undiscovered, untested resources. In this article, we present an improved methodology to assess potential unconventional gas resources. Our methodology is a stochastic approach that includes Monte Carlo simulation and correlation between input variables. Application of the improved methodology to the Uinta-Piceance province of Utah and Colorado with USGS data validates the means and standard deviations of resource distributions produced by the USGS methodology, but reveals that these distributions are not right skewed, as expected for a natural resource. Our investigation indicates that the unrealistic shape and width of the gas resource distributions are caused by the use of narrow triangular input parameter distributions. The stochastic methodology proposed here is more versatile and robust than the USGS analytic methodology. Adoption of the methodology, along with a careful examination and revision of input distributions, should allow a more realistic
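    A sketch of the kind of stochastic approach described, drawing correlated lognormal inputs through a Gaussian copula (Cholesky factor of the correlation matrix) before forming the resource product; the distributions and correlation value are illustrative, not an actual assessment.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Correlation between the two input variables (assumed value).
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])
z = rng.standard_normal((n, 2)) @ np.linalg.cholesky(corr).T

# Transform correlated normals into right-skewed lognormal inputs.
area = np.exp(np.log(500.0) + 0.4 * z[:, 0])          # productive area, km^2
yield_per_km2 = np.exp(np.log(2.0) + 0.6 * z[:, 1])   # Bcf per km^2

resource = area * yield_per_km2   # total gas resource, Bcf

# Exceedance convention: P10 is the high estimate, P90 the low estimate.
p10, p50, p90 = np.percentile(resource, [90, 50, 10])
print(f"mean={resource.mean():.0f} Bcf, P90={p90:.0f}, P50={p50:.0f}, P10={p10:.0f}")
```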

  11. Validation of SEACAB Methodology with Frascati (FNG) Photon Dose Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tore, C.; Ortego, P.; Rodriguez Rivada, A.

    2014-07-01

    In the operation of the International Thermonuclear Experimental Reactor (ITER), correct estimation of the gamma dose rate produced by the structural materials after shutdown is one of the important safety parameters for hands-on maintenance. SEACAB, a rigorous two-step (R2S) computational method, has been developed for the calculation of residual dose in 3-D geometry with the use of the MCNP5 code and the ACAB (ACtivation ABacus) inventory code. The method is very efficient in its hardware requirements, being essentially modular. Starting from a single MCNP5 run, it permits a progressive improvement in the spatial detail of the material layers for the activation calculation and obtains separate source distributions for the isotopes contributing to the photon dose. (Author)

  12. Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies

    International Nuclear Information System (INIS)

    Budnitz, R.J.; Lambert, H.E.; Hill, E.E.

    1987-08-01

    The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and its use is demonstrated through analysis of the Zion-1 and LaSalle-2 reactors as case studies. This demonstration analysis is intended to show that the methodology can be applied in actual cases; the numerical values of core-damage frequency are not realistic. The analysis relies on SSMRP-based methodologies and databases. For both Zion-1 and LaSalle-2, assuming that loss of offsite power (LOSP) occurs after a large earthquake and that there are no operator recovery actions, the analysis finds a very large number of combinations (Boolean minimal cut sets) involving chatter of three or four relays and/or pressure-switch contacts. The analysis finds that the number of min-cut-set combinations is so large that there is a very high likelihood (of the order of unity) that at least one combination will occur after earthquake-caused LOSP. This conclusion depends in detail on the fragility curves and response assumptions used for chatter. Core-damage frequencies are calculated, but they are probably pessimistic because assuming zero credit for operator recovery is pessimistic. The project has also developed an improved PRA methodology for quantifying operator error under high-stress conditions such as after a large earthquake. Single-operator and multiple-operator error rates are developed, and a case study involving an 8-step procedure (establishing feed-and-bleed in a PWR after an earthquake-initiated accident) is used to demonstrate the methodology
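    A toy illustration of how a top-event probability can be bounded from Boolean minimal cut sets such as those found in the relay-chatter analysis; the cut sets and chatter probabilities below are invented.

```python
# Min-cut upper bound on the top-event probability:
#   P(top) <= 1 - prod_k (1 - P(cut set k)),
# assuming independent component failures within each cut set.

p_chatter = {"R1": 0.05, "R2": 0.04, "R3": 0.08, "PS1": 0.03, "PS2": 0.06}

# Each minimal cut set is a combination of relays / pressure-switch
# contacts whose joint chatter fails the system (hypothetical sets).
cut_sets = [("R1", "R2", "R3"), ("R1", "PS1", "PS2"), ("R2", "R3", "PS1", "PS2")]

p_top = 1.0
for cs in cut_sets:
    p_cs = 1.0
    for comp in cs:
        p_cs *= p_chatter[comp]
    p_top *= (1.0 - p_cs)
p_top = 1.0 - p_top

print(f"system failure probability (min-cut upper bound): {p_top:.3e}")
```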

  13. Measuring covariation in RNA alignments: Physical realism improves information measures

    DEFF Research Database (Denmark)

    Lindgreen, Stinus; Gardner, Paul Phillip; Krogh, Anders

    2006-01-01

    Motivation: The importance of non-coding RNAs is becoming increasingly evident, and often the function of these molecules depends on the structure. It is common to use alignments of related RNA sequences to deduce the consensus secondary structure by detecting patterns of co-evolution. A central part of such an analysis is to measure covariation between two positions in an alignment. Here, we rank various measures ranging from simple mutual information to more advanced covariation measures. Results: Mutual information is still used for secondary structure prediction, but the results of this study indicate which measures are useful. Incorporating more structural information by considering e.g. indels and stacking improves accuracy, suggesting that physically realistic measures yield improved predictions. This can be used to improve both current and future programs for secondary structure prediction.
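    A minimal implementation of the baseline covariation measure discussed here, mutual information between two alignment columns; the toy alignment is illustrative.

```python
from collections import Counter
from math import log2

# MI(x, y) = sum_{a,b} f(a,b) * log2( f(a,b) / (f(a) * f(b)) ),
# computed from observed column frequencies.

def mutual_information(col_x, col_y):
    n = len(col_x)
    fx, fy = Counter(col_x), Counter(col_y)
    fxy = Counter(zip(col_x, col_y))
    mi = 0.0
    for (a, b), count in fxy.items():
        pab = count / n
        mi += pab * log2(pab / ((fx[a] / n) * (fy[b] / n)))
    return mi

# Two columns from a toy RNA alignment; compensatory changes (G-C <-> A-U)
# preserve base pairing and yield high mutual information.
col_i = list("GGGAAUU")
col_j = list("CCCUUAA")
print(f"MI = {mutual_information(col_i, col_j):.2f} bits")
```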

  14. Spectral reflectance measurement methodologies for TUZ Golu field campaign

    CSIR Research Space (South Africa)

    Boucher, Y

    2011-07-01

    Full Text Available panel. However, it is possible to take this into account in the uncertainty budget. 2.2. Instrumentation and sampling area: All of the teams except INPE used an ASD FieldSpec spectroradiometer. In this case, the user has to choose the aperture of the objective and the ASD configuration (the number of elementary spectra averaged to get one measurement, here typically 10, and the number of dark-current acquisitions, here typically 25). The spectroradiometer must also be optimized from time to time...

  15. Improved method of measurement for outer leak

    International Nuclear Information System (INIS)

    Xu Guang

    2012-01-01

    A pneumatic pipeline is installed for airborne radioactivity measurement equipment; air tightness and the outer leak rate are essential characteristics to be tested under both national criteria and ISO standards. An improved practical method for measuring the outer air leak rate, based on engineering experience, is available for the equipment acceptance and testing procedure. (authors)

  16. Improving Students' Understanding of Quantum Measurement

    International Nuclear Information System (INIS)

    Zhu Guangtian; Singh, Chandralekha

    2010-01-01

    We describe the difficulties advanced undergraduate and graduate students have with quantum measurement. To reduce these difficulties, we have developed research-based learning tools such as the Quantum Interactive Learning Tutorial (QuILT) and peer instruction tools. A preliminary evaluation shows that these learning tools are effective in improving students' understanding of concepts related to quantum measurement.

  17. Performance Evaluation and Measurement of the Organization in Strategic Analysis and Control: Methodological Aspects

    OpenAIRE

    Živan Ristić; Neđo Balaban

    2006-01-01

    Information acquired through measurement and evaluation is a necessary condition for good decision-making in strategic management. This work deals with: (a) methodological aspects of evaluation (kinds of evaluation, metaevaluation) and measurement (the supposition of isomorphism in measurement, kinds and levels of measurement, errors in measurement and the basic characteristics of measurement); (b) evaluation and measurement of the potential and accomplishments of the organization in the Kaplan-Norton perspect...

  18. A Scoping Review of Clinical Practice Improvement Methodology Use in Rehabilitation

    Directory of Open Access Journals (Sweden)

    Marie-Eve Lamontagne

    2016-01-01

    Full Text Available Context The Clinical Practice Improvement (CPI approach is a methodological and quality improvement approach that has emerged and is gaining in popularity. However, there is no systematic description of its use or the determinants of its practice in rehabilitation settings. Method We performed a scoping review of the use of CPI methodology in rehabilitation settings. Results A total of 103 articles were reviewed. We found evidence of 13 initiatives involving CPI with six different populations. A total of 335 citations of determinants were found, with 68.7% related to CPI itself. Little information was found about what type of external and internal environment, individual characteristics and implementation process might facilitate or hinder the use of CPI. Conclusion Given the growing popularity of this methodological approach, CPI initiatives would gain from increasing knowledge of the determinants of its success and incorporating them in future implementation.

  19. Methodological issues in measures of imitative reaction times.

    Science.gov (United States)

    Aicken, Michael D; Wilson, Andrew D; Williams, Justin H G; Mon-Williams, Mark

    2007-04-01

    Ideomotor (IM) theory suggests that observing someone else perform an action activates an internal motor representation of that behaviour within the observer. Evidence supporting the case for an ideomotor theory of imitation has come from studies that show imitative responses to be faster than the same behavioural measures performed in response to spatial cues. In an attempt to replicate these findings, we manipulated the salience of the visual cue and found that we could reverse the advantage of the imitative cue over the spatial cue. We suggest that participants utilised a simple visuomotor mechanism to perform all aspects of this task, with performance being driven by the relative visual salience of the stimuli. Imitation is a more complex motor skill that would constitute an inefficient strategy for rapid performance.

  20. [Measuring nursing care times--methodologic and documentation problems].

    Science.gov (United States)

    Bartholomeyczik, S; Hunstein, D

    2001-08-01

    The time needed for nursing care is an important measurement as a basis for financing care. In Germany, the Long Term Care Insurance (LTCI) reimburses nursing care depending on the time family caregivers need to complete selected activities. The LTCI recommends certain time ranges for these activities, which are wholly compensatory, as a basis for assessment. The purpose is to enhance assessment justice and comparability. Using the example of a German research project that investigated the duration of these activities and the reasons for differences, questions are raised about some definition and interpretation problems. There are definition problems, since caring activities, especially in private households, are almost never performed as clearly defined modules. Moreover, different activities are often performed simultaneously. The most important question, however, is what exactly time figures can say about the essentials of nursing care.

  1. Methodological NMR imaging developments to measure cerebral perfusion

    International Nuclear Information System (INIS)

    Pannetier, N.

    2010-12-01

    This work focuses on acquisition techniques and physiological models that allow characterization of cerebral perfusion by MRI. The arterial input function (AIF), on which many models are based, is measured by an optical imaging technique at the carotid artery in rats. The reproducibility and repeatability of the AIF are discussed and a model function is proposed. We then compare two techniques for measuring the vessel size index (VSI) in rats bearing a glioma. The reference technique, using a USPIO contrast agent (CA), is compared with the dynamic approach that estimates this parameter during the passage of a bolus of Gd. The latter technique has the advantage of being usable clinically. The results obtained at 4.7 T by both approaches are similar, and the use of VSI in clinical protocols at high field is strongly encouraged. The mechanisms involved (R1 and R2* relaxivities) were then studied using a multi-gradient-echo approach. A multi-echo spiral sequence is developed and a method that allows refocusing between each echo is presented. This sequence is used to characterize the impact of R1 effects during the passage of two successive injections of Gd. Finally, we developed a tool for simulating the NMR signal on a 2D geometry, taking into account the permeability of the BBB and CA diffusion in the interstitial space. At short TE, the effect of diffusion on the signal is negligible. In contrast, the effects of diffusion and permeability may be separated at long echo times. Finally, we show that during extravasation of the CA, the local magnetic field homogenization due to the decrease of the magnetic susceptibility difference at vascular interfaces is quickly balanced by the perturbations induced by the increase of the magnetic susceptibility difference at the cellular interfaces in the extravascular compartment. (author)
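    As an illustration of AIF modelling of the sort discussed, the sketch below fits a gamma-variate curve, a form commonly used for arterial input functions, to synthetic bolus data; it is not necessarily the model function proposed in this work.

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, t0, alpha, beta):
    # Zero before bolus arrival t0, then a rise-and-decay shape.
    dt = np.clip(t - t0, 0.0, None)
    return A * dt**alpha * np.exp(-dt / beta)

# Synthetic noisy AIF measurement (all parameters invented).
t = np.linspace(0, 60, 120)                        # s
true = gamma_variate(t, 5.0, 8.0, 2.0, 4.0)
rng = np.random.default_rng(0)
signal = true + rng.normal(0.0, 0.5, t.size)

popt, _ = curve_fit(gamma_variate, t, signal,
                    p0=[1.0, 5.0, 1.5, 3.0],
                    bounds=([0, 0, 0.1, 0.1], [50, 30, 5, 20]))
print("fitted A, t0, alpha, beta:", np.round(popt, 2))
```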

  2. Building an integrated methodology of learning that can optimally support improvements in healthcare.

    Science.gov (United States)

    Lynn, Joanne

    2011-04-01

    The methods for healthcare reform are strikingly underdeveloped, with much reliance on political power. A methodology that combined methods from sources such as clinical trials, experience-based wisdom, and improvement science could be among the aims of the upcoming work in the USA on comparative effectiveness and on the agenda of the Center for Medicare and Medicaid Innovation in the Centers for Medicare and Medicaid Services. Those working in quality improvement have an unusual opportunity to generate substantial input into these processes through professional organisations such as the Academy for Healthcare Improvement and dominant leadership organisations such as the Institute for Healthcare Improvement.

  3. Measurement of plasma adenosine concentration: methodological and physiological considerations

    International Nuclear Information System (INIS)

    Gewirtz, H.; Brown, P.; Most, A.S.

    1987-01-01

    This study tested the hypothesis that measurements of plasma adenosine concentration made on samples of blood obtained in dipyridamole and EHNA (i.e., stopping solution) may be falsely elevated as a result of ongoing in vitro production and accumulation of adenosine during sample processing. Studies were performed with samples of anticoagulated blood obtained from anesthetized domestic swine. The adenosine concentration of ultrafiltered plasma was determined by HPLC. The following parameters were evaluated: (i) rate of clearance of [3H]adenosine added to plasma, (ii) endogenous adenosine concentration of matched blood samples obtained in stopping solution alone, stopping solution plus EDTA, and perchloric acid (PCA), (iii) plasma and erythrocyte endogenous adenosine concentration in nonhemolyzed samples, and (iv) plasma adenosine concentration of samples hemolyzed in the presence of stopping solution alone or stopping solution plus EDTA. We observed that (i) greater than or equal to 95% of [3H]adenosine added to plasma is removed from it by the formed elements of the blood in less than 20 s, (ii) the plasma adenosine concentration of samples obtained in stopping solution alone is generally 10-fold greater than that of matched samples obtained in stopping solution plus EDTA, (iii) deliberate mechanical hemolysis of blood samples obtained in stopping solution alone resulted in substantial augmentation of plasma adenosine levels in comparison with matched nonhemolyzed specimens; addition of EDTA to the stopping solution prevented this, and (iv) the adenosine content of blood samples obtained in PCA agreed closely with the sum of the plasma and erythrocyte adenosine content of samples obtained in stopping solution plus EDTA

  4. Methodologies for Measuring Judicial Performance: The Problem of Bias

    Directory of Open Access Journals (Sweden)

    Jennifer Elek

    2014-12-01

    Full Text Available Concerns about gender and racial bias in the survey-based evaluations of judicial performance common in the United States have persisted for decades. Consistent with a large body of basic research in the psychological sciences, recent studies confirm that the results from these JPE surveys are systematically biased against women and minority judges. In this paper, we explain the insidious manner in which performance evaluations may be biased, describe some techniques that may help to reduce expressions of bias in judicial performance evaluation surveys, and discuss the potential problem such biases may pose in other common methods of performance evaluation used in the United States and elsewhere. We conclude by highlighting the potential adverse consequences of judicial performance evaluation programs that rely on biased measurements. Download from SSRN: http://ssrn.com/abstract=2533937

  5. Improving the Measurement of Earnings Dynamics

    DEFF Research Database (Denmark)

    Daly, Moira K.; Hryshko, Dmytro; Manovskii, Iourii

    The stochastic process for earnings is the key element of incomplete-markets models in modern quantitative macroeconomics. We show that a simple modification of the canonical process used in the literature leads to a dramatic improvement in the measurement of earnings dynamics in administrative... Accounting for these effects enables more accurate analysis using quantitative models with permanent and transitory earnings risk, and improves empirical estimates of consumption insurance against permanent earnings shocks...

  6. Measures for energy efficiency improvement of buildings

    Directory of Open Access Journals (Sweden)

    Vukadinović Ana V.

    2015-01-01

    Full Text Available The increase in energy consumption in buildings creates the need to propose energy efficiency improvement measures. Urban planning in accordance with micro-location conditions can lead to a reduction of energy consumption in buildings through passive solar design. While satisfying the thermal comfort required by the purpose of the space, energy efficiency can be achieved by optimizing architectural and construction parameters such as the shape of the building, the envelope structure and the percentage of glazing. The improvement of the proposed measures, including the use of renewable energy sources, can meet the requirements of Directive 2010/31/EU on 'nearly zero energy buildings'.

  7. An Improved Setpoint Determination Methodology for the Plant Protection System Considering Beyond Design Basis Events

    International Nuclear Information System (INIS)

    Lee, C.J.; Baik, K.I.; Baek, S.M.; Park, K.-M.; Lee, S.J.

    2013-06-01

    According to the nuclear regulations and industry standards, the trip setpoint and allowable value for the plant protection system have been determined by considering design basis events. In order to improve the safety of a nuclear power plant, an attempt has been made to develop an improved setpoint determination methodology for the plant protection system trip parameter considering not only a design basis event but also a beyond design basis event. The results of a quantitative evaluation performed for the Advanced Power Reactor 1400 nuclear power plant in Korea are presented herein. The results confirmed that the proposed methodology is able to improve the nuclear power plant's safety by determining more reasonable setpoints that can cover beyond design basis events. (authors)

  8. Integration of an iterative methodology for exergoeconomic improvement of thermal systems with a process simulator

    International Nuclear Information System (INIS)

    Vieira, Leonardo S.; Donatelli, Joao L.; Cruz, Manuel E.

    2004-01-01

    In this paper, we present the development and automated implementation of an iterative methodology for exergoeconomic improvement of thermal systems integrated with a process simulator, so as to be applicable to real, complex plants. The methodology combines recent available exergoeconomic techniques with new qualitative and quantitative criteria for the following tasks: (i) identification of decision variables that affect system total cost and exergetic efficiency; (ii) hierarchical classification of components; (iii) identification of predominant terms in the component total cost; and (iv) choice of main decision variables in the iterative process. To show the strengths and potential advantages of the proposed methodology, it is here applied to the benchmark CGAM cogeneration system. The results obtained are presented and discussed in detail and are compared to those reached using a mathematical optimization procedure

  9. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    Science.gov (United States)

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  10. Methodological possibilities for using the electron and ion energy balance in thermospheric complex measurements

    International Nuclear Information System (INIS)

    Serafimov, K.B.; Serafimova, M.K.

    1991-01-01

    A combination of ground-based measurements for the determination of basic thermospheric characteristics is proposed. An expression for the energy transport between components of space plasma is also derived and discussed within the framework of the presented methodology, which can be divided into the following major sections: 1) application of ionosondes, absorption measurements, and TEC measurements using Faraday rotation or the differential Doppler effect; 2) ground-based airglow measurements; 3) airglow and plasma satellite measurements. 9 refs

  11. The IAEA research project on improvement of safety assessment methodologies for near surface disposal facilities

    International Nuclear Information System (INIS)

    Torres-Vidal, C.; Graham, D.; Batandjieva, B.

    2002-01-01

    The International Atomic Energy Agency (IAEA) Research Coordinated Project on Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities (ISAM) was launched in November 1997 and it has been underway for three years. The ISAM project was developed to provide a critical evaluation of the approaches and tools used in long-term safety assessment of near surface repositories. It resulted in the development of a harmonised approach and illustrated its application by way of three test cases - vault, borehole and Radon (a particular range of repository designs developed within the former Soviet Union) type repositories. As a consequence, the ISAM project had over 70 active participants and attracted considerable interest involving around 700 experts from 72 Member States. The methodology developed, the test cases, the main lessons learnt and the conclusions have been documented and will be published in the form of an IAEA TECDOC. This paper presents the work of the IAEA on improvement of safety assessment methodologies for near surface waste disposal facilities and the application of these methodologies for different purposes in the individual stages of the repository development. The paper introduces the main objectives, activities and outcome of the ISAM project and summarizes the work performed by the six working groups within the ISAM programme, i.e. Scenario Generation and Justification, Modelling, Confidence Building, Vault, Radon Type Facility and Borehole test cases. (author)

  12. Improvement of Ultrasonic Distance Measuring System

    Directory of Open Access Journals (Sweden)

    Jiang Yu

    2018-01-01

    Full Text Available This paper mainly introduces an ultrasonic distance measuring system with the AT89C51 single-chip microcontroller as its core component. The paper expounds the principle of the ultrasonic sensor and ultrasonic ranging, the hardware circuit and software program, and the results of experiment and analysis. The hardware circuit is based on the microcontroller, and the software is written in an advanced microcontroller programming language. The amplitude of the received signal and the time of ultrasonic propagation are regulated by closed-loop control [1,2]. The double closed-loop control of amplitude and time improves the measuring accuracy of the instrument. The experimental results show that this greatly improves the measurement accuracy of the system.
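    The core time-of-flight calculation behind such a system is compact; a sketch with a temperature-corrected speed of sound follows (the correction coefficients are standard approximations, not taken from this paper).

```python
# Time-of-flight ranging: the echo travels to the target and back, so
#   d = c * t / 2,
# with the speed of sound c corrected for air temperature.

def sound_speed(temp_c: float) -> float:
    """Approximate speed of sound in air (m/s) at a given temperature."""
    return 331.3 + 0.606 * temp_c

def distance_m(echo_time_s: float, temp_c: float = 20.0) -> float:
    return sound_speed(temp_c) * echo_time_s / 2.0

# Example: a 5.8 ms round-trip echo at 25 degrees C gives about 1.00 m.
print(f"{distance_m(5.8e-3, 25.0):.3f} m")
```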

  13. Measuring a hospital's ability to improve.

    Science.gov (United States)

    Meurer, Steven J; Counte, Michael A; Rubio, Doris M; Arrington, Barbara

    2004-01-01

    The aim of this study was to test whether a recently developed measure of Continuous Quality Improvement (CQI) implementation can provide health care researchers and administrators with a tool to assist in understanding and developing an appropriate structure for improvement efforts in hospitals. Two hundred respondents from 40 Missouri hospitals completed a 28-item survey addressing 8 domains of CQI. Overall, hospital scores showed low implementation of a structure that supports improvement efforts. All survey domains showed acceptable psychometric results. Leadership proved to be the most important domain of CQI because it differentiated well between all levels of the scale. Because of the tool's ease of administration and analysis, and its reliability, validity, and level-differentiation results, the researchers recommend its widespread use to understand and develop a hospital's organizational structure to support improvement activities.

  14. THE ASSESSMENT METHODOLOGIES PTELR, ADRI AND CAE – THREE METHODOLOGIES FOR COORDINATING THE EFFORTS TO IMPROVE THE ORGANIZATIONAL PROCESSES TO ACHIEVE EXCELLENCE

    Directory of Open Access Journals (Sweden)

    Cristina Raluca POPESCU

    2015-07-01

    Full Text Available In the paper “The Assessment Methodologies PTELR, ADRI and CAE – Three Methodologies for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence” the authors present the basic features of the assessment methodologies PTELR, ADRI and CAE, which are designed to coordinate the efforts to improve organizational processes in order to achieve excellence. In the first part of the paper (the introduction), the authors present the general background concerning the performance of management business processes and the importance of achieving excellence and of correctly assessing/evaluating it. Aspects such as quality, quality control, quality assurance, performance and excellence are brought into discussion in the context generated by globalization, new technologies and new business models. Moreover, aspects regarding the methods employed to ensure quality, maintain it and improve it continuously, as well as total quality management, are also main pillars of this research. In the body of the paper, the authors describe the characteristics of the assessment methodologies PTELR, ADRI and CAE from a theoretical point of view.

  15. A Study on the Improvement of the INPRO Proliferation Resistance Assessment Methodology

    International Nuclear Information System (INIS)

    Ko, Won Il; Chang, Hong Lae

    2010-07-01

    Within the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO), a methodology for evaluating proliferation resistance (INPRO PR methodology) has been developed. However, User Requirement (UR) 4 regarding multiplicity and robustness of barriers against proliferation ('innovative nuclear energy systems should incorporate multiple proliferation resistance features and measures') remains to be developed. Because the development of a methodology for evaluating User Requirement 4 requires an acquisition/diversion pathway analysis, a systematic approach was developed for the identification and analysis of pathways for the acquisition of weapons-useable nuclear material. This approach was applied to the DUPIC fuel cycle which identified several proliferation target materials and plausible acquisition/diversion pathways. Based on these results, proliferation strategies that a proliferant State could adopt for undeclared removal of nuclear material from the DUPIC fuel cycle have been developed based on the objectives of the proliferation of the State, the quality and quantity of the target material, the time required to acquire the material for the proliferation, and the technical and financial capabilities of the potential proliferant State. The diversion pathway for fresh DUPIC fuel was analyzed using the INPRO User Requirements 1, 2 and 3, and based on these results an assessment procedure and metrics for evaluating the multiplicity and robustness of proliferation barriers has been developed. In conclusion, the multiplicity and robustness of proliferation barriers is not a function of the number of barriers, or of their individual characteristics but is an integrated function of the whole. The robustness of proliferation barriers is measured by determining whether the safeguards goals can be met. The harmonization of INPRO PR methodology with the GIF PR and PP methodology was also considered. It was suggested that, as also confirmed by IAEA

  16. Radionuclide measurements, via different methodologies, as tool for geophysical studies on Mt. Etna

    Energy Technology Data Exchange (ETDEWEB)

    Morelli, D., E-mail: daniela.morelli@ct.infn.it [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Istituto Nazionale di Fisica Nucleare- Sezione di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Imme, G. [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Istituto Nazionale di Fisica Nucleare- Sezione di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Altamore, I.; Cammisa, S. [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Giammanco, S. [Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Catania, Piazza Roma, 2, I-95123 Catania (Italy); La Delfa, S. [Dipartimento di Scienze Geologiche, Universita di Catania, Corso Italia,57 I-95127 Catania (Italy); Mangano, G. [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Neri, M. [Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Catania, Piazza Roma, 2, I-95123 Catania (Italy); Patane, G. [Dipartimento di Scienze Geologiche, Universita di Catania, Corso Italia,57 I-95127 Catania (Italy)

    2011-10-01

    Natural radioactivity measurements represent an interesting tool for studying geodynamical events and soil geophysical characteristics. In this direction, we have carried out in recent years several radionuclide monitoring campaigns in both the volcanic and tectonic areas of eastern Sicily. In particular, we report in-soil radon investigations in a tectonic area, including both laboratory and in-site measurements, applying three different methodologies based on both active and passive detection systems. The active detection devices consisted of solid-state silicon detectors equipped in portable systems for short-time measurements and for long-time monitoring. The passive technique consisted of solid-state nuclear track detectors (SSNTD), CR-39 type, and allowed integrated measurements. The performances of the three methodologies were compared according to different kinds of monitoring. In general, the results obtained with the three methodologies are in agreement with each other and reflect the tectonic settings of the investigated area.

  17. Radionuclide measurements, via different methodologies, as tool for geophysical studies on Mt. Etna

    International Nuclear Information System (INIS)

    Morelli, D.; Imme, G.; Altamore, I.; Cammisa, S.; Giammanco, S.; La Delfa, S.; Mangano, G.; Neri, M.; Patane, G.

    2011-01-01

    Natural radioactivity measurements represent an interesting tool for studying geodynamical events and soil geophysical characteristics. In this direction, we have carried out in recent years several radionuclide monitoring campaigns in both the volcanic and tectonic areas of eastern Sicily. In particular, we report in-soil radon investigations in a tectonic area, including both laboratory and in-site measurements, applying three different methodologies based on both active and passive detection systems. The active detection devices consisted of solid-state silicon detectors equipped in portable systems for short-time measurements and for long-time monitoring. The passive technique consisted of solid-state nuclear track detectors (SSNTD), CR-39 type, and allowed integrated measurements. The performances of the three methodologies were compared according to different kinds of monitoring. In general, the results obtained with the three methodologies are in agreement with each other and reflect the tectonic settings of the investigated area.

  18. [Needs assessment to improve the applicability and methodological quality of a German S3 guideline].

    Science.gov (United States)

    Burckhardt, Marion; Hoffmann, Cristina; Nink-Grebe, Brigitte; Sänger, Sylvia

    2018-04-01

    pressure ulcers. They also proposed that plastic surgical procedures, several specific wound products and complementary measures should be included. The guideline is of high methodological quality with respect to the systematic synthesis and the formal expert recommendations. From both the stakeholders' and reviewers' perspectives, the guideline should be more in line with what guideline users regarded as key issues. The recommendations should be more action-oriented. Implementation concepts should be provided to teach, implement and evaluate the guideline in healthcare facilities. The updating process should also follow current standards for guideline development, for systematic reviews and for managing conflicts of interest. The guideline is of high methodological quality but currently difficult to implement in clinical practice. The structured evaluation not only reflects the potential for improvement but also provides a transparent theoretical framework for experts and scientific medical societies involved in the guideline updating process. Although some valuable insights were gained from the stakeholders' perspective, representativeness is limited by the low response rate. Copyright © 2018. Published by Elsevier GmbH.

  19. Methodology of clinical measures of healthcare quality delivered to patients with cardiovascular diseases

    Directory of Open Access Journals (Sweden)

    Posnenkova O.M.

    2014-03-01

    Full Text Available The article presents the results of implementing the methodology proposed by the American College of Cardiology and the American Heart Association (ACC/AHA) for the development of Russian clinical quality measures for patients with arterial hypertension, coronary heart disease and chronic heart failure. The created quality measures cover the key elements of medical care that directly influence clinical treatment outcomes.

  20. Quick Green Scan: A Methodology for Improving Green Performance in Terms of Manufacturing Processes

    Directory of Open Access Journals (Sweden)

    Aldona Kluczek

    2017-01-01

    Full Text Available The heating sector has begun implementing technologies and practices to tackle the environmental and socio-economic problems caused by its production processes. The purpose of this paper is to develop a methodology, the "Quick-Green-Scan", that caters to decision-makers' need for a quick assessment to improve green manufacturing performance in companies that produce heating devices. The study uses a structured approach that integrates Life Cycle Assessment-based indicators, a framework and linguistic scales (fuzzy numbers) to evaluate the extent of greening of the enterprise. The evaluation criteria and indicators are closely related to the current state of technology, which can be improved. The proposed methodology has been created to answer the question of whether a company acts on the opportunity to be green and whether these actions contribute towards greening, maintain the status quo or move away from a green outcome. Results show that applying the proposed process improvements helps move the facility towards being a green enterprise. Moreover, the methodology, being particularly quick and simple, is a practical tool for benchmarking, not only in the heating industry, but also proves useful in providing comparisons of facility performance in other manufacturing sectors.

  1. Quality improvement in neurology: AAN Parkinson disease quality measures

    Science.gov (United States)

    Cheng, E.M.; Tonn, S.; Swain-Eng, R.; Factor, S.A.; Weiner, W.J.; Bever, C.T.

    2010-01-01

    Background: Measuring the quality of health care is a fundamental step toward improving health care and is increasingly used in pay-for-performance initiatives and maintenance of certification requirements. Measure development to date has focused on primary care and common conditions such as diabetes; thus, the number of measures that apply to neurologic care is limited. The American Academy of Neurology (AAN) identified the need for neurologists to develop measures of neurologic care and to establish a process to accomplish this. Objective: To adapt and test the feasibility of a process for independent development by the AAN of measures for neurologic conditions for national measurement programs. Methods: A process that has been used nationally for measure development was adapted for use by the AAN. Topics for measure development are chosen based upon national priorities, available evidence base from a systematic literature search, gaps in care, and the potential impact for quality improvement. A panel composed of subject matter and measure development methodology experts oversees the development of the measures. Recommendation statements and their corresponding level of evidence are reviewed and considered for development into draft candidate measures. The candidate measures are refined by the expert panel during a 30-day public comment period and by review by the American Medical Association for Current Procedural Terminology (CPT) II codes. All final AAN measures are approved by the AAN Board of Directors. Results: Parkinson disease (PD) was chosen for measure development. A review of the medical literature identified 258 relevant recommendation statements. A 28-member panel approved 10 quality measures for PD that included full specifications and CPT II codes. Conclusion: The AAN has adapted a measure development process that is suitable for national measurement programs and has demonstrated its capability to independently develop quality measures.

  2. The Methodology of Doppler-Derived Central Blood Flow Measurements in Newborn Infants

    Directory of Open Access Journals (Sweden)

    Koert A. de Waal

    2012-01-01

    Full Text Available Central blood flow (CBF) measurements are measurements in and around the heart. They incorporate cardiac output, but also measurements of cardiac input and assessment of intra- and extracardiac shunts. CBF can be measured in the central circulation as right or left ventricular output (RVO or LVO) and/or as cardiac input measured at the superior vena cava (SVC) flow. Assessment of shunts incorporates evaluation of the ductus arteriosus and the foramen ovale. This paper describes the methodology of CBF measurements in newborn infants. It provides a brief overview of the evolution of Doppler ultrasound blood flow measurements, the basic principles of Doppler ultrasound, and an overview of the methodologies used in the literature. A general guide for interpretation and normal values with suggested cutoffs for CBF are provided for clinical use.
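
    The core quantity here can be illustrated with the standard Doppler output relation: stroke volume is the velocity-time integral (VTI) multiplied by the outflow-tract cross-sectional area, scaled by heart rate and body weight. The following sketch assumes this textbook formula and hypothetical neonatal values; it is not code from the paper.

        import math

        def ventricular_output(vti_cm, vessel_diameter_cm, heart_rate_bpm, weight_kg):
            """Estimate ventricular output (mL/kg/min) from Doppler measurements.

            vti_cm: velocity-time integral of the Doppler trace (cm)
            vessel_diameter_cm: diameter of the outflow tract (cm)
            """
            csa = math.pi * (vessel_diameter_cm / 2.0) ** 2  # cross-sectional area, cm^2
            stroke_volume_ml = vti_cm * csa                  # 1 cm^3 = 1 mL
            return stroke_volume_ml * heart_rate_bpm / weight_kg

        # Hypothetical example: VTI 10 cm, aortic diameter 0.8 cm, HR 150 bpm, 3 kg neonate
        print(round(ventricular_output(10, 0.8, 150, 3.0)))  # ~251 mL/kg/min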

  3. THE ASSESSMENT METHODOLOGY PDCA/PDSA – A METHODOLOGY FOR COORDINATING THE EFFORTS TO IMPROVE THE ORGANIZATIONAL PROCESSES TO ACHIEVE EXCELLENCE

    Directory of Open Access Journals (Sweden)

    Cristina Raluca POPESCU

    2015-07-01

    Full Text Available In the paper "The Assessment Methodology PDCA/PDSA – A Methodology for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence" the authors present the basic features of the assessment methodology PDCA/PDSA, which is designed to coordinate the efforts to improve organizational processes in order to achieve excellence. In the first part of the paper (the introduction), the authors present the general background concerning the performance of management business processes and the importance of achieving excellence and, furthermore, of correctly assessing and evaluating it. In the second part of the paper (the assessment methodology PDCA/PDSA as a methodology for coordinating the efforts to improve the organizational processes to achieve excellence), the authors describe the characteristics of the assessment methodology PDCA/PDSA from a theoretical point of view. In the current state of the global economy, global performance includes economic, social and environmental issues, while effectiveness and efficiency acquire new dimensions, both quantitative and qualitative. Performance needs to adopt a more holistic view of the interdependence of internal and external parameters, quantitative and qualitative, technical and human, physical and financial, thus leading to what we call today overall performance.

  4. A general improved methodology to forecasting future oil production: Application to the UK and Norway

    International Nuclear Information System (INIS)

    Fiévet, L.; Forró, Z.; Cauwels, P.; Sornette, D.

    2015-01-01

    We present a new Monte-Carlo methodology to forecast the crude oil production of Norway and the U.K. based on a two-step process: (i) the nonlinear extrapolation of the current/past performances of individual oil fields and (ii) a stochastic model of the frequency of future oil field discoveries. Compared with the standard methodology, which tends to underestimate remaining oil reserves, our method gives a better description of future oil production, as validated by our back-tests starting in 2008. Specifically, we predict remaining reserves extractable until 2030 to be 5.7 ± 0.3 billion barrels for Norway and 3.0 ± 0.3 billion barrels for the UK, which are respectively 45% and 66% above the predictions using an extrapolation of aggregate production. - Highlights: • Two-step methodology to forecast a country's oil production. • Nonlinear extrapolation of the performance of individual fields. • Stochastic model of the frequency of future discoveries. • Back-test of the methodology starting in 2008. • Improvement upon standard extrapolation of aggregate production
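
    A minimal numerical sketch of the two-step scheme, assuming exponential decline for known fields, Poisson-distributed discovery counts and lognormal sizes for new fields; every rate and parameter below is an illustrative placeholder, not the paper's calibration.

        import numpy as np

        rng = np.random.default_rng(42)

        def simulate_country_production(field_profiles, years, discovery_rate,
                                        new_field_size, n_runs=1000):
            """Two-step Monte-Carlo forecast: (i) extrapolate existing fields,
            (ii) add stochastically discovered future fields."""
            totals = np.zeros((n_runs, years))
            for run in range(n_runs):
                # (i) decline of known fields (exponential decline assumed here)
                for q0, decline in field_profiles:
                    totals[run] += q0 * np.exp(-decline * np.arange(years))
                # (ii) Poisson-distributed number of future discoveries
                n_new = rng.poisson(discovery_rate * years)
                for start in rng.integers(0, years, n_new):
                    q0 = rng.lognormal(*new_field_size)  # (mean, sigma) of log size
                    t = np.arange(years - start)
                    totals[run, start:] += q0 * np.exp(-0.15 * t)  # assumed decline
            return totals  # shape (n_runs, years); summarize with percentiles

        fields = [(120.0, 0.10), (80.0, 0.12), (40.0, 0.08)]  # (initial rate, decline/yr)
        runs = simulate_country_production(fields, years=15, discovery_rate=0.5,
                                           new_field_size=(2.0, 0.8))
        print(np.percentile(runs.sum(axis=1), [5, 50, 95]))  # cumulative production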

  5. Quality measurement and improvement in liver transplantation.

    Science.gov (United States)

    Mathur, Amit K; Talwalkar, Jayant

    2018-06-01

    There is growing interest in the quality of health care delivery in liver transplantation. Multiple stakeholders, including patients, transplant providers and their hospitals, payers, and regulatory bodies have an interest in measuring and monitoring quality in the liver transplant process, and understanding differences in quality across centres. This article aims to provide an overview of quality measurement and regulatory issues in liver transplantation performed within the United States. We review how broader definitions of health care quality should be applied to liver transplant care models. We outline the status quo including the current regulatory agencies, public reporting mechanisms, and requirements around quality assurance and performance improvement (QAPI) activities. Additionally, we further discuss unintended consequences and opportunities for growth in quality measurement. Quality measurement and the integration of quality improvement strategies into liver transplant programmes hold significant promise, but multiple challenges to successful implementation must be addressed to optimise value. Copyright © 2018 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.

  6. Improvement of the Assignment Methodology of the Approach Embankment Design to Highway Structures in Difficult Conditions

    Science.gov (United States)

    Chistyy, Y.; Kuzakhmetova, E.; Fazilova, Z.; Tsukanova, O.

    2018-03-01

    Design issues at the junction of bridges and overpasses with the approach embankment are studied. The reasons for the formation of deformations in the road structure are indicated. Measures to ensure the stability and to accelerate the settlement of a weak subgrade beneath the approach embankment are listed. The necessity of taking into account the man-made impact of the approach embankment on subgrade behavior is proved. Modern stabilizing agents to improve the properties of the soils used in the embankment and the subgrade are suggested. A clarified methodology for determining the active zone of compression in the subgrade under the load from the weight of the embankment is described. As an additional condition to the existing methodology for establishing the lower bound of the active zone of compression, it is proposed to take into account the accuracy of the soil compressibility evaluation when determining settlement.

  7. The Researching on Evaluation of Automatic Voltage Control Based on Improved Zoning Methodology

    Science.gov (United States)

    Xiao-jun, ZHU; Ang, FU; Guang-de, DONG; Rui-miao, WANG; De-fen, ZHU

    2018-03-01

    Given the increasing size and structural complexity of modern power systems, hierarchically structured automatic voltage control (AVC) has become a research focus. In this paper, a reduced control model is built and an adaptive reduced control model is investigated to improve the voltage control effect. The theories of HCSD, HCVS, SKC and FCM are introduced, and the effect of different zoning methodologies on coordinated voltage regulation is also investigated. A generic framework for evaluating the performance of coordinated voltage regulation is built. Finally, the IEEE-96 system is used to divide the network. The 2383-bus Polish system is used to verify that the selection of a zoning methodology affects not only coordinated voltage regulation operation but also its robustness to erroneous data, and a comprehensive generic framework for evaluating its performance is proposed. The New England 39-bus network is used to verify the adaptive reduced control models' performance.

  8. Measurement of the porosity of amorphous materials by gamma ray transmission methodology

    International Nuclear Information System (INIS)

    Pottker, Walmir Eno; Appoloni, Carlos Roberto

    2000-01-01

    This work presents measurements of the total porosity of TRe soil, Berea sandstone rock and porous ceramic samples. For the determination of the total porosity, the Archimedes method (conventional) and the gamma-ray transmission methodology were employed. The porosity measurement using the gamma methodology has a significant advantage with respect to the conventional method, since the determination is fast and non-destructive and it also supplies results with a finer characterization, at small scales, of the heterogeneity of the porosity. The conventional methodology presents good results only for homogeneous samples. The experimental setup for the gamma-ray transmission technique consisted of a 241Am source (59.53 keV), a NaI(Tl) scintillation detector, collimators, an XYZ micrometric table and standard gamma spectrometry electronics connected to a multichannel analyser. (author)
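
    The gamma transmission determination rests on the Beer-Lambert attenuation law, I = I0·exp(-μm·ρ·x): the measured attenuation yields the bulk density, from which the total porosity follows. A short sketch under hypothetical count rates and a nominal mass attenuation coefficient (neither taken from the paper):

        import math

        def bulk_density_from_transmission(I, I0, mu_mass_cm2_g, thickness_cm):
            """Bulk density from gamma-ray attenuation (Beer-Lambert law):
            I = I0 * exp(-mu_mass * rho * x)  =>  rho = ln(I0/I) / (mu_mass * x)."""
            return math.log(I0 / I) / (mu_mass_cm2_g * thickness_cm)

        def total_porosity(rho_bulk, rho_grain):
            """Total porosity as the volume fraction not occupied by solids."""
            return 1.0 - rho_bulk / rho_grain

        # Hypothetical counts for a 2 cm thick sample at the 59.53 keV 241Am line
        rho_b = bulk_density_from_transmission(I=27400, I0=52000,
                                               mu_mass_cm2_g=0.20, thickness_cm=2.0)
        print(round(total_porosity(rho_b, rho_grain=2.65), 3))  # ~0.40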

  9. Innovative Methodologies for thermal Energy Release Measurement: case of La Solfatara volcano (Italy)

    Science.gov (United States)

    Marfe`, Barbara; Avino, Rosario; Belviso, Pasquale; Caliro, Stefano; Carandente, Antonio; Marotta, Enrica; Peluso, Rosario

    2015-04-01

    This work is devoted to improving knowledge of the parameters that control the heat flux anomalies associated with the diffuse degassing processes of volcanic and hydrothermal areas. The methodologies currently used to measure heat flux (i.e. CO2 flux or temperature gradient) are either inefficient or of limited effectiveness, and are unable to detect short- to medium-term (days to months) variation trends in the heat flux. A new method, based on the use of thermal imaging cameras, has been applied to estimate the heat flux and its time variations. This approach allows faster heat flux measurement than already accredited methods, improving the definition of the activity state of a volcano and allowing a better assessment of the related hazard and risk mitigation. The idea is to extrapolate the heat flux from the ground surface temperature which, in a purely conductive regime, is directly correlated with the shallow temperature gradient. We use thermal imaging cameras, at short distances (meters to hundreds of meters), to quickly obtain a map of areas with thermal anomalies and a measure of their temperature. Preliminary studies have been carried out throughout the whole of the La Solfatara crater in order to investigate a possible correlation between the surface temperature and the shallow thermal gradient. We used a FLIR SC640 thermal camera and K-type thermocouples to make the two measurements at the same time. Results suggest a good correlation between the shallow temperature gradient ΔTs and the background-corrected surface temperature Ts, and although the campaigns took place over a period of a few years, this correlation seems to be stable over time. This is an extremely motivating result for the further development of a measurement method based solely on the use of a short-range thermal imaging camera. Surveys with thermal cameras may be done manually, using a tripod to take thermal images of small contiguous areas and then joining
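
    In a purely conductive regime the heat flux follows Fourier's law, q = k·dT/dz, so a calibrated link between background-corrected surface temperature and the shallow gradient yields a flux estimate per camera pixel. A minimal sketch, with the slope, intercept and thermal conductivity as site-specific assumptions rather than values from the study:

        def shallow_gradient_from_surface_T(ts_C, ts_background_C, slope, intercept=0.0):
            """Empirical linear link between background-corrected surface temperature
            and the shallow thermal gradient (K/m); slope and intercept would come
            from thermocouple calibration campaigns and are assumptions here."""
            return slope * (ts_C - ts_background_C) + intercept

        def conductive_heat_flux(grad_K_per_m, k_W_mK=1.2):
            """Fourier's law in a purely conductive regime: q = k * dT/dz (W/m^2)."""
            return k_W_mK * grad_K_per_m

        grad = shallow_gradient_from_surface_T(ts_C=45.0, ts_background_C=20.0, slope=3.0)
        print(conductive_heat_flux(grad))  # W/m^2 for one camera pixel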

  10. Evaluation of methodological aspects of digestibility measurements in ponies fed different haylage to concentrate ratios

    NARCIS (Netherlands)

    Schaafstra, F J W C; van Doorn, D A; Schonewille, J T; van Riet, M M J; Hendriks, W H

    2017-01-01

    Methodological aspects of digestibility measurements were studied in four Welsh pony geldings consuming haylage-based diets with increasing proportions of a pelleted concentrate according to a 4×4 Latin square design experiment. Ponies were fed four experimental, iso-energetic (net energy (NE)

  11. Evaluation of methodological aspects of digestibility measurements in ponies fed different haylage to concentrate ratios

    NARCIS (Netherlands)

    Schaafstra, F.J.W.C.; Doorn, van D.A.; Schonewille, J.T.; Riet, van M.M.J.; Visser, P.; Blok, M.C.; Hendriks, W.H.

    2017-01-01

    Methodological aspects of digestibility measurements were studied in four Welsh pony geldings consuming haylage-based diets with increasing proportions of a pelleted concentrate according to a 4×4 Latin square design experiment. Ponies were fed four experimental, iso-energetic (net energy (NE)

  12. Measuring hand hygiene compliance rates in different special care settings: a comparative study of methodologies

    Directory of Open Access Journals (Sweden)

    Thyago Pereira Magnus

    2015-04-01

    Conclusions: Hand hygiene compliance was reasonably high in these units, as measured by direct observation. However, a lack of correlation with results obtained by other methodologies brings into question the validity of direct observation results, and suggests that periodic audits using other methods may be needed.

  13. Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?

    Science.gov (United States)

    Brondani, Mario; He, Sarah

    2013-01-01

    Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…

  14. The Self-Concept. Volume 1, A Review of Methodological Considerations and Measuring Instruments. Revised Edition.

    Science.gov (United States)

    Wylie, Ruth C.

    This volume of the revised edition describes and evaluates measurement methods, research designs, and procedures which have been or might appropriately be used in self-concept research. Working from the perspective that self-concept or phenomenal personality theories can be scientifically investigated, methodological flaws and questionable…

  15. Improved methodology to assess modification and completion of landfill gas management in the aftercare period

    International Nuclear Information System (INIS)

    Morris, Jeremy W.F.; Crest, Marion; Barlaz, Morton A.; Spokas, Kurt A.; Åkerman, Anna; Yuan, Lei

    2012-01-01

    Highlights: ► Performance-based evaluation of landfill gas control system. ► Analytical framework to evaluate transition from active to passive gas control. ► Focus on cover oxidation as an alternative means of passive gas control. ► Integrates research on long-term landfill behavior with practical guidance. - Abstract: Municipal solid waste landfills represent the dominant option for waste disposal in many parts of the world. While some countries have greatly reduced their reliance on landfills, there remain thousands of landfills that require aftercare. The development of cost-effective strategies for landfill aftercare is in society’s interest to protect human health and the environment and to prevent the emergence of landfills with exhausted aftercare funding. The Evaluation of Post-Closure Care (EPCC) methodology is a performance-based approach in which landfill performance is assessed in four modules including leachate, gas, groundwater, and final cover. In the methodology, the objective is to evaluate landfill performance to determine when aftercare monitoring and maintenance can be reduced or possibly eliminated. This study presents an improved gas module for the methodology. While the original version of the module focused narrowly on regulatory requirements for control of methane migration, the improved gas module also considers best available control technology for landfill gas in terms of greenhouse gas emissions, air quality, and emissions of odoriferous compounds. The improved module emphasizes the reduction or elimination of fugitive methane by considering the methane oxidation capacity of the cover system. The module also allows for the installation of biologically active covers or other features designed to enhance methane oxidation. A methane emissions model, CALMIM, was used to assist with an assessment of the methane oxidation capacity of landfill covers.

  16. Improved methodology to assess modification and completion of landfill gas management in the aftercare period

    Energy Technology Data Exchange (ETDEWEB)

    Morris, Jeremy W.F., E-mail: jmorris@geosyntec.com [Geosyntec Consultants, 10220 Old Columbia Road, Suite A, Columbia, MD 21046 (United States); Crest, Marion, E-mail: marion.crest@suez-env.com [Suez Environnement, 38 rue du President Wilson, 78230 Le Pecq (France); Barlaz, Morton A., E-mail: barlaz@ncsu.edu [Department of Civil, Construction, and Environmental Engineering, Campus Box 7908, North Carolina State University, Raleigh, NC 27695-7908 (United States); Spokas, Kurt A., E-mail: kurt.spokas@ars.usda.gov [United States Department of Agriculture - Agricultural Research Service, 1991 Upper Buford Circle, 439 Borlaug Hall, St. Paul, MN 55108 (United States); Akerman, Anna, E-mail: anna.akerman@sita.fr [SITA France, Tour CB 21, 16 Place de l' Iris, 92040 Paris La Defense Cedex (France); Yuan, Lei, E-mail: lyuan@geosyntec.com [Geosyntec Consultants, 10220 Old Columbia Road, Suite A, Columbia, MD 21046 (United States)

    2012-12-15

    Highlights: ► Performance-based evaluation of landfill gas control system. ► Analytical framework to evaluate transition from active to passive gas control. ► Focus on cover oxidation as an alternative means of passive gas control. ► Integrates research on long-term landfill behavior with practical guidance. - Abstract: Municipal solid waste landfills represent the dominant option for waste disposal in many parts of the world. While some countries have greatly reduced their reliance on landfills, there remain thousands of landfills that require aftercare. The development of cost-effective strategies for landfill aftercare is in society's interest to protect human health and the environment and to prevent the emergence of landfills with exhausted aftercare funding. The Evaluation of Post-Closure Care (EPCC) methodology is a performance-based approach in which landfill performance is assessed in four modules including leachate, gas, groundwater, and final cover. In the methodology, the objective is to evaluate landfill performance to determine when aftercare monitoring and maintenance can be reduced or possibly eliminated. This study presents an improved gas module for the methodology. While the original version of the module focused narrowly on regulatory requirements for control of methane migration, the improved gas module also considers best available control technology for landfill gas in terms of greenhouse gas emissions, air quality, and emissions of odoriferous compounds. The improved module emphasizes the reduction or elimination of fugitive methane by considering the methane oxidation capacity of the cover system. The module also allows for the installation of biologically active covers or other features designed to enhance methane oxidation. A methane emissions model, CALMIM, was used to assist with an assessment of the methane oxidation

  17. Improving respiration measurements with gas exchange analyzers.

    Science.gov (United States)

    Montero, R; Ribas-Carbó, M; Del Saz, N F; El Aou-Ouad, H; Berry, J A; Flexas, J; Bota, J

    2016-12-01

    Dark respiration measurements with open-flow gas exchange analyzers are often questioned for their low accuracy, as their low values often reach the precision limit of the instrument. Respiration was measured in five species: two hypostomatous (Vitis vinifera L. and Acanthus mollis) and three amphistomatous, one with a similar amount of stomata on both sides (Eucalyptus citriodora) and two with different stomatal densities (Brassica oleracea and Vicia faba). The CO2 differential (ΔCO2) increased two-fold, with no change in apparent Rd, when the two leaves with higher stomatal density faced outside. These results showed a clear effect of the position of the stomata on ΔCO2. Therefore, it can be concluded that leaf position is important for improving respiration measurements: it increases ΔCO2 without affecting the respiration results per leaf or mass unit. This method will help to increase the accuracy of leaf respiration measurements using gas exchange analyzers. Copyright © 2016 Elsevier GmbH. All rights reserved.
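
    In an open-flow system the apparent respiration rate follows from the molar air flow and the CO2 differential, Rd = F·ΔCO2/S. The sketch below uses this standard relation with hypothetical values to show why raising ΔCO2 eases detection without changing Rd; it is an illustration, not the authors' code.

        def dark_respiration(flow_umol_s, delta_co2_umol_mol, leaf_area_m2):
            """Apparent dark respiration (umol CO2 m^-2 s^-1) in an open-flow system:
            Rd = F * dCO2 / S, with F the molar air flow (umol air s^-1), dCO2 the
            sample-minus-reference differential (umol CO2 per mol air), S the leaf
            area (m^2)."""
            return flow_umol_s * delta_co2_umol_mol * 1e-6 / leaf_area_m2

        # Halving the flow doubles dCO2 at constant Rd, lifting the signal
        # further above the analyzer's precision limit (values hypothetical).
        print(dark_respiration(500, 1.0, 0.0006))  # ~0.83 umol m^-2 s^-1
        print(dark_respiration(250, 2.0, 0.0006))  # same Rd, larger dCO2 signal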

  18. How to Measure and Interpret Quality Improvement Data.

    Science.gov (United States)

    McQuillan, Rory Francis; Silver, Samuel Adam; Harel, Ziv; Weizman, Adam; Thomas, Alison; Bell, Chaim; Chertow, Glenn M; Chan, Christopher T; Nesrallah, Gihad

    2016-05-06

    This article will demonstrate how to conduct a quality improvement project using the change idea generated in "How To Use Quality Improvement Tools in Clinical Practice: How To Diagnose Solutions to a Quality of Care Problem" by Dr. Ziv Harel and colleagues in this Moving Points feature. This change idea involves the introduction of a nurse educator into a CKD clinic with a goal of increasing rates of patients performing dialysis independently at home (home hemodialysis or peritoneal dialysis). Using this example, we will illustrate a Plan-Do-Study-Act (PDSA) cycle in action and highlight the principles of rapid cycle change methodology. We will then discuss the selection of outcome, process, and balancing measures, and the practicalities of collecting these data in the clinic environment. We will also introduce the PDSA worksheet as a practical way to oversee the progress of a quality improvement project. Finally, we will demonstrate how run charts are used to visually illustrate improvement in real time, and how this information can be used to validate achievement, respond appropriately to challenges the project may encounter, and prove the significance of results. This article aims to provide readers with a clear and practical framework upon which to trial their own ideas for quality improvement in the clinical setting. Copyright © 2016 by the American Society of Nephrology.

  19. An integrated methodology for process improvement and delivery system visualization at a multidisciplinary cancer center.

    Science.gov (United States)

    Singprasong, Rachanee; Eldabi, Tillal

    2013-01-01

    Multidisciplinary cancer centers require an integrated, collaborative and streamlined workflow in order to provide a high quality of patient care. Due to the complex nature of cancer care and continuing changes to treatment techniques and technologies, it is a constant struggle for centers to obtain a systemic and holistic view of the treatment workflow for improving the delivery systems. Project management techniques, a responsibility matrix and a swim-lane activity diagram representing the sequence of activities can be combined for data collection, presentation and evaluation of patient care. This paper presents this integrated methodology, using multidisciplinary meetings and a 'walking the route' approach for data collection, an integrated responsibility matrix and a swim-lane activity diagram with activity times for data representation, and a 5-why and gap analysis approach for data analysis. This enables collection of the right level of detail in a shorter time frame by identifying process flaws and deficiencies, while being independent of the nature of the patient's disease or treatment techniques. A case study of a multidisciplinary regional cancer centre is used to illustrate the effectiveness of the proposed methodology and demonstrates that the methodology is simple to understand, allowing for minimal training of staff and rapid implementation. © 2011 National Association for Healthcare Quality.

  20. Covariance methodology applied to uncertainties in I-126 disintegration rate measurements

    International Nuclear Information System (INIS)

    Fonseca, K.A.; Koskinas, M.F.; Dias, M.S.

    1996-01-01

    The covariance methodology applied to uncertainties in 126I disintegration rate measurements is described. Two different coincidence systems were used due to the complex decay scheme of this radionuclide. The parameters involved in the determination of the disintegration rate in each experimental system present correlated components. In this case, the conventional statistical methods for determining the uncertainties (law of propagation) yield incorrect values for the final uncertainty. Therefore, use of the covariance matrix methodology is necessary. The data from both systems were combined taking into account all possible correlations between the partial uncertainties. (orig.)
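
    The point of the covariance methodology is that the propagated variance is u² = J Σ Jᵀ, where Σ carries the off-diagonal (correlated) terms that the naive uncorrelated propagation drops. A small numerical sketch with hypothetical sensitivities and covariances, not the paper's actual data:

        import numpy as np

        def propagated_variance(jacobian, covariance):
            """General law of propagation with correlations: u^2 = J Sigma J^T.
            Zeroing the off-diagonal terms of Sigma reproduces the incorrect
            'uncorrelated' result the abstract warns about."""
            J = np.atleast_2d(jacobian)
            return (J @ covariance @ J.T).item()

        # Hypothetical sensitivities and covariances (percent^2 units)
        J = [1.0, -2.0, -2.0]                  # partial derivatives (relative terms)
        Sigma = np.array([[0.04, 0.00, 0.00],
                          [0.00, 0.09, 0.05],  # off-diagonal: efficiency covariance
                          [0.00, 0.05, 0.09]])
        print(propagated_variance(J, Sigma))                    # correlated: 1.16
        print(propagated_variance(J, np.diag(np.diag(Sigma))))  # naive: 0.76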

  1. Methodology of ionizing radiation measurement, from x-ray equipment, for radiation protection

    International Nuclear Information System (INIS)

    Caballero, Katia C.S.; Borges, Jose C.

    1996-01-01

    Most X-ray beams used for diagnostics have short exposure times (milliseconds); the exceptions are those used in fluoroscopy. The measuring instruments (area monitors with ionization chambers or Geiger tubes) used in hospitals and clinics generally have characteristic response times that are not suited to the duration of such X-ray beams. Our objective was to analyse commercially available instruments and to prepare a measuring methodology for direct and secondary beams, in order to evaluate protection barriers for beams used in diagnostic radiology installations. (author)

  2. Measure to succeed: How to improve employee participation in continuous improvement

    Energy Technology Data Exchange (ETDEWEB)

    Jurburg, M.; Viles, E.; Tanco, M.; Mateo, R.; Lleó, A.

    2016-07-01

    Purpose: Achieving employee participation in continuous improvement (CI) systems is considered one of the success factors for the sustainability of those systems. Yet it is also very difficult to obtain, because of the interaction of many critical factors that affect employee participation. Therefore, finding ways of measuring all these critical factors can help practitioners manage the employee participation process accordingly. Design/methodology/approach: Based upon the existing literature, this paper presents a 4-phase (9-step) diagnostic tool to measure the main determinants associated with the implementation of CI systems affecting employee participation in improvement activities. Findings: The tool showed its usefulness in detecting the main weaknesses and improvement opportunities for improving employee participation in CI through its application in two different cases. Practical implications: This diagnostic tool could be particularly interesting for companies adopting CI and other excellence frameworks, which usually include a pillar related to people development inside the organization but do not include tools to diagnose the state of this pillar. Originality/value: This diagnostic tool presents a user's-perspective approach, ensuring that the weaknesses and improvement opportunities detected during the diagnosis come directly from the users of the CI system, which in this case are the employees themselves. Given that the final objective is to identify the reasons and problems hindering employee participation, adopting this user's-perspective approach seems more relevant than adopting other, more traditional approaches based on gathering information from the CI system itself or from the CI managers.

  3. Measure to succeed: How to improve employee participation in continuous improvement

    International Nuclear Information System (INIS)

    Jurburg, M.; Viles, E.; Tanco, M.; Mateo, R.; Lleó, A.

    2016-01-01

    Purpose: Achieving employee participation in continuous improvement (CI) systems is considered one of the success factors for the sustainability of those systems. Yet it is also very difficult to obtain, because of the interaction of many critical factors that affect employee participation. Therefore, finding ways of measuring all these critical factors can help practitioners manage the employee participation process accordingly. Design/methodology/approach: Based upon the existing literature, this paper presents a 4-phase (9-step) diagnostic tool to measure the main determinants associated with the implementation of CI systems affecting employee participation in improvement activities. Findings: The tool showed its usefulness in detecting the main weaknesses and improvement opportunities for improving employee participation in CI through its application in two different cases. Practical implications: This diagnostic tool could be particularly interesting for companies adopting CI and other excellence frameworks, which usually include a pillar related to people development inside the organization but do not include tools to diagnose the state of this pillar. Originality/value: This diagnostic tool presents a user's-perspective approach, ensuring that the weaknesses and improvement opportunities detected during the diagnosis come directly from the users of the CI system, which in this case are the employees themselves. Given that the final objective is to identify the reasons and problems hindering employee participation, adopting this user's-perspective approach seems more relevant than adopting other, more traditional approaches based on gathering information from the CI system itself or from the CI managers.

  4. Improving operating room efficiency in academic children's hospital using Lean Six Sigma methodology.

    Science.gov (United States)

    Tagge, Edward P; Thirumoorthi, Arul S; Lenart, John; Garberoglio, Carlos; Mitchell, Kenneth W

    2017-06-01

    Lean Six Sigma (LSS) is a process improvement methodology that utilizes a collaborative team effort to improve performance by systematically identifying root causes of problems. Our objective was to determine whether application of LSS could improve efficiency when applied simultaneously to all services of an academic children's hospital. In our tertiary academic medical center, a multidisciplinary committee was formed, and the entire perioperative process was mapped using fishbone diagrams, Pareto analysis and other process improvement tools. Results for Children's Hospital scheduled main operating room (OR) cases were analyzed, where the surgical attending followed themselves. Six hundred twelve cases were included in the seven Children's Hospital ORs over a 6-month period. Turnover Time (the interval between patient OR departure and arrival of the subsequent patient) decreased from a median of 41 min in the baseline period to 32 min in the intervention period (p<0.0001). Turnaround Time (the interval between surgical dressing application and the subsequent surgical incision) decreased from a median of 81.5 min in the baseline period to 71 min in the intervention period (p<0.0001). These results demonstrate that a coordinated multidisciplinary process improvement redesign can significantly improve efficiency in an academic children's hospital without preselecting specific services, removing surgical residents, or incorporating new personnel or technology. Prospective comparative study, Level II. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. The Influence of Measurement Methodology on the Accuracy of Electrical Waveform Distortion Analysis

    Science.gov (United States)

    Bartman, Jacek; Kwiatkowski, Bogdan

    2018-04-01

    The present paper reviews the documents that specify measurement methods for voltage waveform distortion. It also presents the measurement stages for waveform components that are uncommon in the classic fundamentals of electrotechnics and signal theory, including the process of creating groups and subgroups of harmonics and interharmonics. Moreover, the paper discusses selected distortion factors of periodic waveforms and presents analyses comparing the values of these distortion indices. The measurements were carried out in cycle-per-cycle mode and the measurement methodology used complies with the IEC 61000-4-7 standard. The studies showed significant discrepancies between the values of the analyzed parameters.
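
    The grouping step mentioned above combines 5 Hz spectral lines into harmonic groups and subgroups. The sketch below follows the IEC 61000-4-7 definitions for a 50 Hz system with a 10-cycle (200 ms) window; the input spectrum is assumed to be an RMS-scaled DFT supplied by the caller, and the demo values are invented.

        import numpy as np

        def harmonic_groups(spectrum_rms, f_res=5.0, f1=50.0, n_max=40):
            """Harmonic groups and subgroups in the spirit of IEC 61000-4-7,
            assuming 5 Hz spectral lines; spectrum_rms[k] is the RMS value
            of the line at frequency k * f_res."""
            lines_per_h = int(round(f1 / f_res))  # 10 lines between harmonics
            groups, subgroups = {}, {}
            for n in range(1, n_max + 1):
                c = n * lines_per_h               # index of the harmonic line
                # group: half of the two edge lines plus all lines in between
                g2 = (0.5 * spectrum_rms[c - 5] ** 2 + 0.5 * spectrum_rms[c + 5] ** 2
                      + np.sum(spectrum_rms[c - 4:c + 5] ** 2))
                # subgroup: the harmonic line and its two immediate neighbours
                sg2 = np.sum(spectrum_rms[c - 1:c + 2] ** 2)
                groups[n], subgroups[n] = np.sqrt(g2), np.sqrt(sg2)
            return groups, subgroups

        # Demo: fundamental at 50 Hz plus a pure 5th harmonic at 250 Hz
        spec = np.zeros(512); spec[10] = 230.0; spec[50] = 11.5
        g, sg = harmonic_groups(spec, n_max=10)
        print(g[5], sg[5])  # both 11.5 here, since no sidebands are present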

  6. A methodology for supporting decisions on the establishment of protective measures after severe nuclear accidents

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Kollas, J.G.

    1994-06-01

    The objective of this report is to demonstrate the use of a methodology supporting decisions on protective measures following severe nuclear accidents. A multicriteria decision analysis approach is recommended, in which value tradeoffs are postponed until the very last stage of the decision process. Use is made of efficient frontiers to exclude all technically inferior solutions and present the decision maker with all nondominated solutions. A choice among these solutions implies a value trade-off among the multiple criteria. An interactive computer package has been developed with which the decision maker can choose a point on the efficient frontier in the consequence space and immediately see the alternative in the decision space resulting in the chosen consequences. The methodology is demonstrated through an application to the choice among possible protective measures in contaminated areas of the former USSR after the Chernobyl accident. Two distinct cases are considered: first, a decision is to be made only on the basis of the level of soil contamination with Cs-137 and the total cost of the chosen protective policy; next, the decision is based on the geographic dimension of the contamination and the total cost. Three alternative countermeasure actions are considered for population segments living on soil contaminated at a certain level or in a specific geographic region: (a) relocation of the population; (b) improvement of the living conditions; and (c) no countermeasures at all. This is the final deliverable of the CEC-CIS Joint Study Project 2, Task 5: Decision-Aiding-System for Establishing Intervention Levels, performed under Contracts COSU-CT91-0007 and COSU-CT92-0021 with the Commission of the European Communities through CEPN
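
    The exclusion of technically inferior solutions is a standard nondominated filter: an alternative is dropped if some other alternative is at least as good on every criterion and strictly better on at least one. A minimal sketch with hypothetical (cost, consequence) pairs loosely echoing the countermeasure options above:

        def efficient_frontier(alternatives):
            """Keep only nondominated alternatives (all criteria minimized).
            a is dominated by b if b is no worse on all criteria and differs."""
            frontier = []
            for name_a, crit_a in alternatives:
                dominated = any(
                    all(cb <= ca for cb, ca in zip(crit_b, crit_a)) and crit_b != crit_a
                    for _, crit_b in alternatives
                )
                if not dominated:
                    frontier.append((name_a, crit_a))
            return frontier

        # Hypothetical (total cost, residual consequence) pairs
        options = [("relocate", (9.0, 1.0)), ("improve living conditions", (4.0, 3.0)),
                   ("no countermeasures", (0.5, 8.0)), ("mixed policy", (6.0, 3.5))]
        print(efficient_frontier(options))  # "mixed policy" is dominated and drops out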

  7. [Improvement in the efficiency of a rehabilitation service using Lean Healthcare methodology].

    Science.gov (United States)

    Pineda Dávila, S; Tinoco González, J

    2015-01-01

    The aim of this study was to evaluate the reduction in costs and the increase in time devoted to the patient achieved by applying Lean Healthcare methodology. A multidisciplinary team was formed and, through a diagnostic process, identified three potential areas for improvement, covering the storage and standardization of materials and professional tasks in the therapeutic areas, addressed by implementing three Lean tools: kanban, 5S and 2P. Stored material costs decreased by 43%, the cost of consumables per patient treated decreased by 19%, and the time dedicated to patient treatment increased by 7%. The processes were standardized and "muda" (waste) was eliminated, thus reducing costs and increasing the value delivered to the patient. All this demonstrates that it is possible to apply tools of industrial origin to the health sector with the aim of improving the quality of care and achieving maximum efficiency. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.

  8. A call to improve sampling methodology and reporting in young novice driver research.

    Science.gov (United States)

    Scott-Parker, B; Senserrick, T

    2017-02-01

    Young drivers continue to be over-represented in road crash fatalities despite a multitude of research, communication and intervention efforts. Evidence-based improvement depends to a great extent upon research methodology quality and its reporting, with known limitations in the peer-review process. The aim of the current research was to review the scope of research methodologies applied in 'young driver' and 'teen driver' research and their reporting in four peer-reviewed journals in the field between January 2006 and December 2013. In total, 806 articles were identified and assessed. Reporting omissions included participant gender (11% of papers), response rates (49%), retention rates (39%) and information regarding incentives (44%). Greater breadth and specific improvements in study designs and reporting are thereby identified as a means to further advance the field. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  9. New evaluation methodology of regenerative braking contribution to energy efficiency improvement of electric vehicles

    International Nuclear Information System (INIS)

    Qiu, Chengqun; Wang, Guolin

    2016-01-01

    Highlights: • Two different contribution ratio evaluation parameters based on the deceleration braking process are proposed. • Methodologies for calculating the contribution made by regenerative braking to improving vehicle energy efficiency are proposed. • Road test results imply that the proposed parameters are effective. - Abstract: Comprehensive research is conducted on the design and control of a regenerative braking system for electric vehicles. The mechanisms and evaluation methods of the contribution made by regenerative braking to improving electric vehicle energy efficiency are discussed and analyzed through the energy flow. Methodologies for calculating the contribution made by regenerative braking are proposed. Additionally, a new regenerative braking control strategy called the "serial 2 control strategy" is introduced. Moreover, two control strategies called the "parallel control strategy" and the "serial 1 control strategy" are proposed as comparative control strategies. Furthermore, two different contribution ratio evaluation parameters based on the deceleration braking process are proposed. Finally, road tests are carried out with the three different control strategies under the standard Chinese typical city regenerative driving cycle. The serial 2 control strategy offers considerably higher regeneration efficiency than the parallel and serial 1 strategies.
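
    Contribution ratios of this kind can be illustrated with simple energy-flow bookkeeping over one driving cycle; the two measures below (regenerated share of traction energy, and the implied range extension) are plausible stand-ins for illustration, not the paper's exact parameter definitions.

        def regen_contribution(e_battery_out_kwh, e_regen_kwh):
            """Two illustrative contribution measures for one driving cycle:
            (1) share of traction energy supplied by recovered braking energy,
            (2) range-extension ratio relative to running without regeneration,
            assuming net consumption = battery output minus regenerated energy."""
            share = e_regen_kwh / e_battery_out_kwh
            range_gain = e_battery_out_kwh / (e_battery_out_kwh - e_regen_kwh) - 1.0
            return share, range_gain

        share, gain = regen_contribution(e_battery_out_kwh=12.0, e_regen_kwh=2.4)
        print(f"regen share {share:.0%}, range extension {gain:.0%}")  # 20%, 25%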

  10. Methodology to measure strains at high temperatures using electrical strain gages with free filaments

    International Nuclear Information System (INIS)

    Atanazio Filho, Nelson N.; Gomes, Paulo T. Vida; Scaldaferri, Denis H.B.; Silva, Luiz L. da; Rabello, Emerson G.; Mansur, Tanius R.

    2013-01-01

    An experimental methodology for measuring strains at high temperatures is shown in this work. For the measurements, electrical strain gages with free filaments were attached to a stainless steel 304 beam with specific cements. The beam has a triangular shape and a constant thickness, so the strain is the same along its length. Unless the beam surface is carefully prepared, the strain gage attachment is not effective. The results shown are for temperatures ranging from 20 deg C to 300 deg C, but the experimental methodology could be used to measure strains at temperatures up to 900 deg C. Analytical calculations based on solid mechanics were used to verify the strain gage electrical installation and the measured strains. First, beam deformations were plotted as a function of temperature. After that, beam deformations with different weights were plotted as a function of temperature. The results shown allow concluding that the experimental methodology is reliable for measuring strains at temperatures up to 300 deg C. (author)
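
    For reference, strain follows from the bridge output by the usual quarter-bridge relation ε = 4·Vout/(Vexc·GF); the sketch below assumes this standard wiring and a nominal gauge factor, neither of which is specified in the record.

        def strain_from_bridge(v_out_V, v_exc_V, gauge_factor=2.0):
            """Strain from a quarter Wheatstone bridge: eps = 4*Vout/(Vexc*GF).
            Valid for small resistance changes; GF and wiring are assumptions."""
            return 4.0 * v_out_V / (v_exc_V * gauge_factor)

        # 1 mV output on a 5 V bridge with GF = 2 -> 400 microstrain
        print(strain_from_bridge(1e-3, 5.0) * 1e6)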

  11. High-frequency measurements of aeolian saltation flux: Field-based methodology and applications

    Science.gov (United States)

    Martin, Raleigh L.; Kok, Jasper F.; Hugenholtz, Chris H.; Barchyn, Thomas E.; Chamecki, Marcelo; Ellis, Jean T.

    2018-02-01

    Aeolian transport of sand and dust is driven by turbulent winds that fluctuate over a broad range of temporal and spatial scales. However, commonly used aeolian transport models do not explicitly account for such fluctuations, likely contributing to substantial discrepancies between models and measurements. Underlying this problem is the absence of accurate sand flux measurements at the short time scales at which wind speed fluctuates. Here, we draw on extensive field measurements of aeolian saltation to develop a methodology for generating high-frequency (up to 25 Hz) time series of total (vertically-integrated) saltation flux, namely by calibrating high-frequency (HF) particle counts to low-frequency (LF) flux measurements. The methodology follows four steps: (1) fit exponential curves to vertical profiles of saltation flux from LF saltation traps, (2) determine empirical calibration factors through comparison of LF exponential fits to HF number counts over concurrent time intervals, (3) apply these calibration factors to subsamples of the saltation count time series to obtain HF height-specific saltation fluxes, and (4) aggregate the calibrated HF height-specific saltation fluxes into estimates of total saltation fluxes. When coupled to high-frequency measurements of wind velocity, this methodology offers new opportunities for understanding how aeolian saltation dynamics respond to variability in driving winds over time scales from tens of milliseconds to days.
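
    The four calibration steps can be condensed into a short numerical sketch, assuming SciPy for the exponential profile fit and purely hypothetical trap heights, fluxes and count rates rather than the field data.

        import numpy as np
        from scipy.optimize import curve_fit

        def exp_profile(z, q0, zq):
            """Step 1: exponential flux profile q(z) = q0 * exp(-z / zq)."""
            return q0 * np.exp(-z / zq)

        def calibrate_counts(trap_heights_m, trap_fluxes, counter_height_m,
                             counts_hz_mean):
            """Steps 1-2: fit the LF trap profile, then derive a calibration factor
            converting HF counts (Hz) to height-specific flux at the counter."""
            (q0, zq), _ = curve_fit(exp_profile, trap_heights_m, trap_fluxes,
                                    p0=(max(trap_fluxes), 0.1))
            flux_at_counter = exp_profile(counter_height_m, q0, zq)
            return flux_at_counter / counts_hz_mean, zq

        def total_flux_series(counts_hz, cal_factor, zq, counter_height_m):
            """Steps 3-4: calibrated height-specific flux, integrated over height
            assuming the same exponential shape (total = q(z) * e^{z/zq} * zq)."""
            q_z = counts_hz * cal_factor
            return q_z * np.exp(counter_height_m / zq) * zq

        # Hypothetical 25 Hz counter at 5 cm, calibrated against four LF traps
        cal, zq = calibrate_counts([0.05, 0.15, 0.25, 0.35],
                                   [8.0, 3.2, 1.3, 0.5], 0.05, counts_hz_mean=40.0)
        print(total_flux_series(np.array([10.0, 55.0, 120.0]), cal, zq, 0.05))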

  12. Improved b lifetime measurement from MAC

    International Nuclear Information System (INIS)

    Ford, W.T.

    1984-03-01

    Two recent publications, from the MAC and Mark II collaborations, have reported the somewhat surprising result that the lifetime of particles made up of b quarks is in the 1 to 2 picosecond range, somewhat longer than the lifetimes of charm particles. Although the charm decays are favored transitions while those of b particles depend upon off-diagonal elements of the weak flavor mixing matrix, the smallness of the b decay rates in the face of the large available phase space indicates that the off-diagonal elements are indeed very small. The possibility of a complete determination of the mixing matrix was brought significantly nearer by the availability of the lifetime information; what is needed now is to reduce the uncertainty of the measurements, which was about 33% for both experiments. We describe here an extension of the b lifetime study with the MAC detector, incorporating some new data and improvements in the analysis. 12 references

  13. Measures to improve nuclear power project management

    International Nuclear Information System (INIS)

    Ma Xinchao

    2012-01-01

    Focusing on the correct application of the ability-level principle in setting up the organizational structure, an effective management system has been established and 8 practical management regimes have been developed. Personnel training and management shall be carried out well and enhanced. Experience feedback in construction management shall be carried out well for all systems. Exchange of construction and management techniques shall be actively carried out. All staff shall participate in safety management. A KPI system is adopted for assessing stakeholders' project management methods, and the PDCA cycle is adopted for continued improvement. Management level upgrading measures are proposed to ensure the smooth construction of nuclear power projects. Setting forth and popularizing this management theory can provide a reference for, and promote the smooth progress of, various nuclear power projects. (author)

  14. The Relationship between Quality Measurement and Efficiency Improvement in Health Care Systems

    OpenAIRE

    Gilbert Roland; Dr. Jane Marry Gill

    2017-01-01

    Quality measurement in health care organisations is most often considered in terms of measures for cost saving and error reduction in clinical procedures. The concept of quality measurement in health care organisations is the analysis of effectiveness and accuracy in procedures for patients' diagnosis and treatment. This study aimed to find the relationship between quality measurement and efficiency improvements in the healthcare sector of Mauritius. This was executed by using mixed methodological ...

  15. Reducing DNACPR complaints to zero: designing and implementing a treatment escalation plan using quality improvement methodology.

    Science.gov (United States)

    Shermon, Elizabeth; Munglani, Laura; Oram, Sarah; William, Linda; Abel, Julian

    2017-01-01

    Do Not Attempt Resuscitation (DNAR) decisions have traditionally formed the basis of ceiling-of-care discussions. However, poor quality discussions can lead to high patient and relative dissatisfaction, generating hospital complaints. Treatment escalation plans (TEPs) aim to highlight the wider remit of treatment options, with a focus on effective communication. We aimed to improve TEP discussions and documentation at Weston General Hospital by introducing a standardised form, with the goal of reducing resuscitation-related complaints through improved communication and documentation. Qualitative and quantitative data were collected over 2 years and used to develop plan-do-study-act (PDSA) cycles using quality improvement methodology. The main barriers to improvement included time constraints and clinicians' resistance. Analysis of patient liaison services data showed a progressive reduction in complaints regarding resuscitation, with no complaints having been received during the final six months of the project. Through use of a standardised form including treatment prompts, the quality of discussions and plans improved. Qualitative feedback demonstrated increased patient and relative satisfaction. In addition, junior doctors report that the plans are helpful when making out-of-hours decisions. Development of a user-friendly form to document patient-guided TEPs helped junior doctors to lead advanced care planning discussions. The use of PDSA cycles demonstrated improvement in the quality of forms, which in turn improved communication, documentation and satisfaction. Future developments could include the involvement of specialist teams to ensure TEP forms remain relevant to all clinical areas. In addition, with widespread use of TEP forms, the traditional tick-box DNAR could be replaced to focus on patient-led care planning.

  16. Improving Efficiency Using Time-Driven Activity-Based Costing Methodology.

    Science.gov (United States)

    Tibor, Laura C; Schultz, Stacy R; Menaker, Ronald; Weber, Bradley D; Ness, Jay; Smith, Paula; Young, Phillip M

    2017-03-01

    The aim of this study was to increase efficiency in MR enterography using a time-driven activity-based costing methodology. In February 2015, a multidisciplinary team was formed to identify the personnel, equipment, space, and supply costs of providing outpatient MR enterography. The team mapped the current state, completed observations, performed timings, and calculated costs associated with each element of the process. The team used Pareto charts to understand the highest cost and most time-consuming activities, brainstormed opportunities, and assessed impact. Plan-do-study-act cycles were developed to test the changes, and run charts were used to monitor progress. The process changes consisted of revising the workflow associated with the preparation and administration of glucagon, with completed implementation in November 2015. The time-driven activity-based costing methodology allowed the radiology department to develop a process to more accurately identify the costs of providing MR enterography. The primary process modification was reassigning responsibility for the administration of glucagon from nurses to technologists. After implementation, the improvements demonstrated success by reducing non-value-added steps and cost by 13%, staff time by 16%, and patient process time by 17%. The saved process time was used to augment existing examination time slots to more accurately accommodate the entire enterographic examination. Anecdotal comments were captured to validate improved staff satisfaction within the multidisciplinary team. This process provided a successful outcome to address daily workflow frustrations that could not previously be improved. A multidisciplinary team was necessary to achieve success, in addition to the use of a structured problem-solving approach. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
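
    At its core, time-driven activity-based costing prices each process step as (minutes used) × (capacity cost rate of the resource performing the step). A toy sketch of a before/after workflow comparison, with entirely hypothetical rates and timings rather than the study's measured ones:

        def tdabc_cost(steps, capacity_cost_rate_per_min):
            """Time-driven activity-based costing: episode cost is the sum over
            process steps of minutes used times the resource's capacity cost rate."""
            return sum(minutes * capacity_cost_rate_per_min[resource]
                       for resource, minutes in steps)

        rates = {"technologist": 1.10, "nurse": 1.45, "radiologist": 6.80}  # $/min
        before = [("nurse", 25), ("technologist", 45), ("radiologist", 15)]
        after = [("technologist", 55), ("radiologist", 15)]  # task reassigned
        print(tdabc_cost(before, rates), tdabc_cost(after, rates))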

  17. Utilizing Lean Six Sigma Methodology to Improve the Authored Works Command Approval Process at Naval Medical Center San Diego.

    Science.gov (United States)

    Valdez, Michelle M; Liwanag, Maureen; Mount, Charles; Rodriguez, Rechell; Avalos-Reyes, Elisea; Smith, Andrew; Collette, David; Starsiak, Michael; Green, Richard

    2018-03-14

    Inefficiencies in the command approval process for publications and/or presentations negatively impact DoD Graduate Medical Education (GME) residency programs' ability to meet ACGME scholarly activity requirements. A preliminary review of the authored works approval process at Naval Medical Center San Diego (NMCSD) disclosed significant inefficiency, variation in process, and a low level of customer satisfaction. In order to facilitate and encourage scholarly activity at NMCSD, and meet ACGME requirements, the Executive Steering Council (ESC) chartered an interprofessional team to lead a Lean Six Sigma (LSS) Rapid Improvement Event (RIE) project. Two major outcome metrics were identified: (1) the number of authored works submissions containing all required signatures and (2) customer satisfaction with the authored works process. Primary metric baseline data were gathered utilizing a Clinical Investigations database tracking publications and presentations. Secondary metric baseline data were collected via a customer satisfaction survey to GME faculty and residents. The project team analyzed pre-survey data and utilized LSS tools and methodology including a "gemba" (environment) walk, cause and effect diagram, critical to quality tree, voice of the customer, "muda" (waste) chart, and a pre- and post-event value stream map. The team selected an electronic submission system as the intervention most likely to positively impact the RIE project outcome measures. The number of authored works compliant with all required signatures improved from 52% to 100%. Customer satisfaction rated as "completely or mostly satisfied" improved from 24% to 97%. For both outcomes, signature compliance and customer satisfaction, statistical significance was achieved with a p methodology and tools to improve signature compliance and increase customer satisfaction with the authored works approval process, leading to 100% signature compliance, a comprehensive longitudinal repository of all

  18. Time-to-event methodology improved statistical evaluation in register-based health services research.

    Science.gov (United States)

    Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan

    2017-02-01

    Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussion on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors of the initiation of basal insulin-supported therapy in patients with type 2 diabetes initially prescribed glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses are accompanied by a trade-off between data availability, clinical plausibility, and statistical feasibility. Cox's proportional hazards model allows for the evaluation of outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.
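
    As a pointer to the recommended approach, the sketch below fits Cox's proportional hazards model to a toy register extract using the lifelines package; the variable names and values are invented for illustration, and refinements such as delayed entry (left truncation) or competing events call for richer model variants.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Toy register extract: time to initiation of basal insulin supported
        # therapy, reduced to simple right-censoring for illustration.
        df = pd.DataFrame({
            "years_observed": [0.8, 2.1, 1.4, 3.0, 0.5, 2.7, 1.9, 0.9],
            "initiated_basal": [1, 0, 1, 0, 1, 0, 1, 0],  # event indicator
            "baseline_hba1c": [8.9, 7.2, 9.4, 6.8, 10.1, 7.0, 9.0, 7.5],
            "age": [61, 55, 70, 48, 66, 59, 72, 50],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="years_observed", event_col="initiated_basal")
        cph.print_summary()  # outcome-specific hazard ratios for the covariates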

  19. Methodologies for measuring travelers' risk perception of infectious diseases: A systematic review.

    Science.gov (United States)

    Sridhar, Shruti; Régner, Isabelle; Brouqui, Philippe; Gautret, Philippe

    2016-01-01

    Numerous studies in the past have stressed the importance of travelers' psychology and perception in the implementation of preventive measures. The aim of this systematic review was to identify the methodologies used in studies reporting on travelers' risk perception of infectious diseases. A systematic search for relevant literature was conducted according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. There were 39 studies identified. In 35 of 39 studies, the methodology used was that of a knowledge, attitude and practice (KAP) survey based on questionnaires. One study used a combination of questionnaires and a visual psychometric measuring instrument called the 'pictorial representation of illness and self-measurement', or PRISM. One study used a self-representation model (SRM) method. Two studies measured psychosocial factors. Valuable information was obtained from KAP surveys showing an overall lack of knowledge among travelers about the most frequent travel-associated infections and associated preventive measures. This methodological approach, however, is mainly descriptive, addressing knowledge, attitudes, and practices separately and lacking an examination of the interrelationships between these three components. Another limitation of the KAP method is that it underestimates psychosocial variables that have proved influential in health related behaviors, including perceived benefits and costs of preventive measures, perceived social pressure, perceived personal control, unrealistic optimism and risk propensity. Future risk perception studies in travel medicine should consider psychosocial variables with inferential and multivariate statistical analyses. The use of implicit measurements of attitudes could also provide new insights in the field of travelers' risk perception of travel-associated infectious diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Methodology for the assessment of measuring uncertainties of articulated arm coordinate measuring machines

    International Nuclear Information System (INIS)

    Romdhani, Fekria; Hennebelle, François; Ge, Min; Juillion, Patrick; Fontaine, Jean François; Coquet, Richard

    2014-01-01

    Articulated Arm Coordinate Measuring Machines (AACMMs) have gradually evolved and are increasingly used in the mechanical industry. At present, measurement uncertainties relating to the use of these devices are not yet well quantified. The work carried out consists of determining the measurement uncertainties of a mechanical part measured by an AACMM. The studies aiming to develop a model of measurement uncertainty are based on the Monte Carlo method described in Supplement 1 of the Guide to the Expression of Uncertainty in Measurement [1], combined with identification and characterization of the main sources of uncertainty. A multi-level Monte Carlo approach has been developed which characterizes the possible evolution of the AACMM during the measurement and, at a second level, quantifies the uncertainty of the considered measurand. The first Monte Carlo level is the most complex and is thus divided into three sub-levels, namely characterization of the positioning error of a point, estimation of calibration errors and evaluation of fluctuations of the 'localization point'. The global method is presented and the results of the first sub-level are developed in particular. The main sources of uncertainty, including AACMM deformations, are exposed. (paper)
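
    As a schematic of the GUM Supplement 1 Monte Carlo principle applied here, the sketch below (Python) propagates assumed joint-angle noise through a toy two-segment arm model; the geometry and noise level are illustrative assumptions, not the authors' AACMM model.

    ```python
    # A minimal Monte Carlo uncertainty propagation sketch for a two-joint arm.
    import numpy as np

    rng = np.random.default_rng(0)
    L1, L2 = 400.0, 350.0          # segment lengths in mm (assumed)
    theta1, theta2 = 0.6, -0.4     # nominal joint angles in rad (assumed)
    sigma_theta = 1e-4             # assumed angular encoder noise, rad

    N = 100_000
    t1 = rng.normal(theta1, sigma_theta, N)   # perturb joint readings
    t2 = rng.normal(theta2, sigma_theta, N)

    # Forward kinematics of the probed point for each Monte Carlo trial.
    x = L1 * np.cos(t1) + L2 * np.cos(t1 + t2)
    y = L1 * np.sin(t1) + L2 * np.sin(t1 + t2)

    # Uncertainty of the measurand (here, the probed point position).
    print("u(x) =", x.std(ddof=1), "mm;  u(y) =", y.std(ddof=1), "mm")
    ```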

  1. Atmospheric aerosol in an urban area: Comparison of measurement instruments and methodologies and pulmonary deposition assessment

    International Nuclear Information System (INIS)

    Berico, M.; Luciani, A.; Formignani, M.

    1996-07-01

    In March 1995 a measurement campaign of atmospheric aerosol in the Bologna urban area (Italy) was carried out. A transportable laboratory, set up by the ENEA (Italian National Agency for New Technologies, Energy and the Environment) Environmental Department (Bologna), was utilized with instruments for the measurement of atmospheric aerosol and meteorological parameters. The aim of this campaign was twofold: to characterize aerosol in an urban area and to compare different measurement instruments and methodologies. Mass concentration measurements, evaluated over a 23-hour period with a total filter, a PM10 dichotomous sampler and a low pressure impactor (LPI Berner), provided information about total suspended particles, the respirable fraction and the granulometric parameters of the aerosol, respectively. Eight meteorological parameters, the number concentration of the submicron fraction of the aerosol and the mass concentration of the micron fraction were continually measured. In addition, during a daytime period, several number size distributions of atmospheric aerosol were estimated by means of a diffusion battery system. Results related to the different measurement methodologies and the granulometric characteristics of the aerosol are presented here. Pulmonary deposition of atmospheric aerosol is finally calculated, using the size distributions provided by the LPI Berner and the ICRP 66 human respiratory tract model.

  2. Software process improvement, quality assurance and measurement

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Balla, K.; Kontogiannis, K.; Zou, Y.; Di Penta, M.

    2006-01-01

    The aim of this workshop was to present and discuss emergent software quality improvement approaches, with an emphasis on practical applications. Different views on the improvement of software processes, software products, and their interrelations have been addressed during the workshop.

  3. Improving patient care in cardiac surgery using Toyota production system based methodology.

    Science.gov (United States)

    Culig, Michael H; Kunkle, Richard F; Frndak, Diane C; Grunden, Naida; Maher, Thomas D; Magovern, George J

    2011-02-01

    A new cardiac surgery program was developed in a community hospital setting using the operational excellence (OE) method, which is based on the principles of the Toyota production system. The initial results of the first 409 heart operations, performed over the 28 months between March 1, 2008, and June 30, 2010, are presented. Operational excellence methodology was taught to the cardiac surgery team. Coaching started 2 months before the opening of the program and continued for 24 months. Of the 409 cases presented, 253 were isolated coronary artery bypass graft operations. One operative death occurred. According to the database maintained by The Society of Thoracic Surgeons, the risk-adjusted operative mortality rate was 61% lower than the regional rate. Likewise, the risk-adjusted rate of major complications was 57% lower than The Society of Thoracic Surgeons regional rate. Daily problem-solving to determine root cause was attempted on 923 distinct perioperative problems by all team members. Using the cost of complications as described by Speir and coworkers, avoiding predicted complications resulted in a savings of at least $884,900 as compared with the regional average. By the systematic use of a real-time, highly formatted problem-solving methodology, processes of care improved daily. Using carefully disciplined teamwork, reliable implementation of evidence-based protocols was realized by empowering the front line to make improvements. Low rates of complications were observed, and a cost savings of $3,497 per case of isolated coronary artery bypass grafting was realized. Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  4. The improvement of methodological principles of enterprise competitiveness management under the crisis

    Directory of Open Access Journals (Sweden)

    Marina Dyadyuk

    2016-12-01

    Full Text Available The purpose of this research is to improve the methodological bases and to form practical tools for enterprise competitiveness management under crisis conditions. The specific features of the competitive environment of enterprises in Ukraine under the global and national crisis are researched in the article. From this it is concluded that, in order to obtain and maintain competitive advantages in the current period of global instability, any enterprise must have a greater degree of flexibility than in periods of stability or economic growth. Flexibility and adaptability of the economic system are the main prerequisites for obtaining and developing enterprise competitive advantages and a core component of competitiveness. We identified and characterized the methodological components of the adaptive management process on the basis of a systematic approach and taking into account the views of scientists. The obtained scientific results are the basis for a conceptual model of an integrated system of enterprise adaptive management in a dynamic and uncertain environment. We propose to implement this kind of management on three levels: strategic (preventive management), functional (tactical management) and operational (symptomatic management), on the basis of analyzing economically grounded views and existing positions. Together, this will ensure effective adaptation at the macroeconomic, meso and micro levels of management. The main purpose of the proposed integrated management system is to ensure the stability and integrity of enterprise activity under a variable and uncertain environment. The implementation of such a management system provides the enterprise with certain competitive advantages and will allow Ukrainian enterprises not only to maintain their competitive position under unfavorable external conditions, but also to maintain and improve their competitiveness.

  5. A RISK BASED METHODOLOGY TO ASSESS THE ENERGY EFFICIENCY IMPROVEMENTS IN TRADITIONALLY CONSTRUCTED BUILDINGS

    Directory of Open Access Journals (Sweden)

    D. Herrera

    2013-07-01

    Full Text Available In order to achieve the CO2 reduction targets set by the Scottish government, it will be necessary to improve the energy efficiency of existing buildings. Within the total Scottish building stock, historic and traditionally constructed buildings make up an important proportion, in the order of 19 % (Curtis, 2010), and represent cultural, emotional and identity values that should be protected. However, retrofit interventions can be complex operations because of the several aspects that are involved in the hygrothermal performance of traditional buildings. Moreover, all these factors interact with each other and therefore need to be analysed as a whole. Upgrading the envelope of traditional buildings may produce severe changes to moisture migration, leading to superficial or interstitial condensation and thus fabric decay and mould growth. Retrofit projects carried out in the past have failed because of misunderstanding, or the lack of expert prediction, of the potential consequences associated with altering the envelope. The evaluation of potential risks, prior to any alteration of a building's physics in order to improve its energy efficiency, is critical to avoid future damage to the wall's performance or to occupants' health and well-being. The aim of this PhD research project is to point out the most critical aspects related to the energy efficiency improvement of traditional buildings and to develop a risk-based methodology that helps owners and practitioners during the decision-making process.

  6. Optimization of a novel improver gel formulation for Barbari flat bread using response surface methodology.

    Science.gov (United States)

    Pourfarzad, Amir; Haddad Khodaparast, Mohammad Hossein; Karimi, Mehdi; Mortazavi, Seyed Ali

    2014-10-01

    Nowadays, the use of bread improvers has become an essential part of improving the production methods and quality of bakery products. In the present study, the Response Surface Methodology (RSM) was used to determine the optimum improver gel formulation which gave the best quality, shelf life, sensory and image properties for Barbari flat bread. Sodium stearoyl-2-lactylate (SSL), diacetyl tartaric acid esters of monoglyceride (DATEM) and propylene glycol (PG) were constituents of the gel and considered in this study. A second-order polynomial model was fitted to each response and the regression coefficients were determined using the least squares method. The optimum gel formulation was found to be 0.49 % of SSL, 0.36 % of DATEM and 0.5 % of PG when the desirability function method was applied. There was a good agreement between the experimental data and their predicted counterparts. Results showed that the RSM, image processing and texture analysis are useful tools to investigate, approximate and predict a large number of bread properties.
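
    As a rough illustration of the workflow this abstract describes, the sketch below (Python, with simulated data in place of the paper's measurements) fits a second-order polynomial by least squares and then optimizes the predicted response within the factor bounds; the data, bounds and single-response desirability are assumptions for illustration only.

    ```python
    # A minimal RSM sketch: quadratic fit plus constrained optimization.
    import numpy as np
    from scipy.optimize import minimize

    # Simulated runs over assumed factor ranges (SSL, DATEM, PG in %); the
    # synthetic optimum is placed near the paper's reported formulation.
    rng = np.random.default_rng(1)
    X = rng.uniform([0.1, 0.1, 0.1], [0.7, 0.6, 0.7], size=(20, 3))
    y = (8.0 - 4 * (X[:, 0] - 0.49) ** 2 - 3 * (X[:, 1] - 0.36) ** 2
             - 2 * (X[:, 2] - 0.50) ** 2 + rng.normal(0, 0.02, 20))

    def quad_terms(x):
        """Second-order polynomial terms: 1, linear, squares, interactions."""
        a, b, c = x
        return np.array([1, a, b, c, a*a, b*b, c*c, a*b, a*c, b*c])

    A = np.array([quad_terms(row) for row in X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # least squares fit

    def predicted(x):
        return quad_terms(x) @ coef

    # Single-response desirability: maximize the predicted score in bounds.
    res = minimize(lambda x: -predicted(x), x0=[0.4, 0.3, 0.4],
                   bounds=[(0.1, 0.7), (0.1, 0.6), (0.1, 0.7)])
    print("optimum (SSL, DATEM, PG):", res.x.round(2))
    ```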

  7. A novel methodology for interpreting air quality measurements from urban streets using CFD modelling

    Science.gov (United States)

    Solazzo, Efisio; Vardoulakis, Sotiris; Cai, Xiaoming

    2011-09-01

    In this study, a novel computational fluid dynamics (CFD) based methodology has been developed to interpret long-term averaged measurements of pollutant concentrations collected at roadside locations. The methodology is applied to the analysis of pollutant dispersion in Stratford Road (SR), a busy street canyon in Birmingham (UK), where a one-year sampling campaign was carried out between August 2005 and July 2006. Firstly, a number of dispersion scenarios are defined by combining sets of synoptic wind velocity and direction. Assuming neutral atmospheric stability, CFD simulations are conducted for all the scenarios, applying the standard k-ɛ turbulence model, with the aim of creating a database of normalised pollutant concentrations at specific locations within the street. Modelled concentrations for all wind scenarios were compared with hourly observed NOx data. In order to compare with long-term averaged measurements, a weighted average of the CFD-calculated concentration fields was derived, with the weighting coefficients being proportional to the frequency of each scenario observed during the examined period (either monthly or annually). In summary, the methodology consists of (i) identifying the main dispersion scenarios for the street based on wind speed and direction data, (ii) creating a database of CFD-calculated concentration fields for the identified dispersion scenarios, and (iii) combining the CFD results based on the frequency of occurrence of each dispersion scenario during the examined period. The methodology has been applied to calculate monthly and annually averaged benzene concentrations at several locations within the street canyon so that a direct comparison with observations could be made. The results of this study indicate that, within the simplifying assumption of non-buoyant flow, CFD modelling can aid understanding of long-term air quality measurements, and help assess the representativeness of monitoring locations for population exposure.
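
    As a point of reference for step (iii), the following sketch (Python; the numbers are invented, not the Stratford Road data) shows how scenario concentrations would be combined into a long-term average using scenario frequencies as weights.

    ```python
    # A minimal sketch of frequency-weighted averaging of CFD scenario fields.
    import numpy as np

    # Normalised concentrations at one receptor for eight wind-direction
    # scenarios (from the CFD database), and the observed frequency of each
    # scenario over the averaging period (e.g., one month).
    c_scenario = np.array([12.0, 8.5, 15.2, 6.1, 9.8, 11.3, 7.4, 13.6])
    frequency  = np.array([0.20, 0.05, 0.15, 0.10, 0.10, 0.18, 0.07, 0.15])

    assert np.isclose(frequency.sum(), 1.0)
    c_longterm = np.sum(frequency * c_scenario)   # frequency-weighted average
    print(f"long-term averaged concentration: {c_longterm:.2f}")
    ```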

  8. Clearance Prediction Methodology Needs Fundamental Improvement: Trends Common to Rat and Human Hepatocytes/Microsomes and Implications for Experimental Methodology.

    Science.gov (United States)

    Wood, F L; Houston, J B; Hallifax, D

    2017-11-01

    Although prediction of clearance using hepatocytes and liver microsomes has long played a decisive role in drug discovery, it is widely acknowledged that reliably accurate prediction is not yet achievable despite the predominance of hepatically cleared drugs. Physiologically mechanistic methodology tends to underpredict clearance by several fold, and empirical correction of this bias is confounded by imprecision across drugs. Understanding the causes of prediction uncertainty has been slow, possibly reflecting poor resolution of variables associated with donor source and experimental methods, particularly for the human situation. It has been reported that among published human hepatocyte predictions there was a tendency for underprediction to increase with increasing in vivo intrinsic clearance, suggesting an inherent limitation of this particular system. This implied an artifactual rate limitation in vitro, although preparative effects on cell stability and performance were not yet resolved from assay design limitations. Here, to resolve these issues further, we present an up-to-date and comprehensive examination of predictions from published rat as well as human studies (n = 128 and 101 for hepatocytes and n = 71 and 83 for microsomes, respectively) to assess system performance more independently. We report a clear trend of increasing underprediction with increasing in vivo intrinsic clearance, which is similar both between species and between in vitro systems. Hence, prior concerns arising specifically from human in vitro systems may be unfounded and the focus of investigation in the future should be to minimize the potential in vitro assay limitations common to whole cells and subcellular fractions. Copyright © 2017 by The American Society for Pharmacology and Experimental Therapeutics.

  9. Performance evaluation of CT measurements made on step gauges using statistical methodologies

    DEFF Research Database (Denmark)

    Angel, J.; De Chiffre, L.; Kruth, J.P.

    2015-01-01

    In this paper, a study is presented in which statistical methodologies were applied to evaluate the measurement of step gauges on an X-ray computed tomography (CT) system. In particular, the effects of step gauge material density and orientation were investigated. The step gauges consist of uni- and bidirectional lengths. By confirming the repeatability of measurements made on the test system, the number of required scans in the design of experiment (DOE) was reduced. The statistical model was checked using model adequacy principles; model adequacy checking is an important step in validating...

  10. A Methodology to Measure Synergy Among Energy-Efficiency Programs at the Program Participant Level

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.E.

    2003-11-14

    This paper presents a methodology designed to measure synergy among energy-efficiency programs at the program participant level (e.g., households, firms). Three different definitions of synergy are provided: strong, moderate, and weak. Data to measure synergy can be collected through simple survey questions. Straightforward mathematical techniques can be used to estimate the three types of synergy and explore relative synergistic impacts of different subsets of programs. Empirical research is needed to test the concepts and methods and to establish quantitative expectations about synergistic relationships among programs. The market for new energy-efficient motors is the context used to illustrate all the concepts and methods in this paper.

  11. The professional methodological teaching performance of the Physical Education teacher: a set of parameters for its measurement

    Directory of Open Access Journals (Sweden)

    Orlando Pedro Suárez Pérez

    2017-07-01

    Full Text Available This work arose from the need to address the difficulties found among the Physical Education teachers of the municipality of San Juan y Martínez during the teaching-learning process of basketball, difficulties which threaten the quality of classes, sports results and the preparation of students for life. The objective is to propose parameters that allow the professional methodological teaching performance of these teachers to be measured. The personalized approach of the research made it possible to diagnose the 26 teachers taken as a sample, identifying the traits that distinguish their efficiency and determining their potentialities and deficiencies. During the research process, theoretical, empirical and statistical methods were used, which made it possible to corroborate the real existence of the problem and to evaluate its impact, revealing a positive transformation in pedagogical practice. The results provide a concrete and viable answer for improving the evaluation of the teaching-methodological component of the Physical Education teacher, and constitute important guidance material for methodologists and managers concerned with cognitive, procedural and attitudinal performance, helping to build new knowledge from prior knowledge and to lead a formative process with a contemporary vision, offering methodological resources to control the quality of Physical Education lessons.

  12. Improvements for Optics Measurement and Corrections software

    CERN Document Server

    Bach, T

    2013-01-01

    This note presents the improvements made to the OMC software during a 14-month technical student internship at CERN. The goal of the work was to improve existing software in terms of maintainability, features and performance. Significant improvements in stability, speed and overall development process were reached. The main software, a Java GUI at the LHC CCC, ran for months without noteworthy problems. The overall running time of the software chain used for optics corrections was reduced from nearly half an hour to around two minutes. This was the result of analysing and improving several involved programs and algorithms.

  13. Application of Lean Healthcare methodology in a urology department of a tertiary hospital as a tool for improving efficiency.

    Science.gov (United States)

    Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D

    To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The implementation of the Lean Healthcare methodology in a urology department was conducted in 3 phases: 1) team training and improvement of feedback among the practitioners, 2) management by process and superspecialisation and 3) improvement of indicators (continuous improvement). The indicators were obtained from the Hospital's information systems. The main source of information was the Balanced Scorecard for health systems management (CUIDISS). The comparison with other regional and national urology departments was performed through the same platform with the help of the Hospital's records department (IASIST). A baseline was established with the indicators obtained in 2011 for the comparative analysis of the results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction and improved quality indicators, reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in 4 years. A value of 0.61 was reached with the efficiency indicator (risk-adjusted length of stay [RALOS] index), with a savings of 2869 stays compared with national benchmarking (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, but with progressive annual improvement. The Lean methodology can be effectively applied to a urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvements in all its indicators, as well as practitioner satisfaction. Team training, management by process, continuous improvement and delegation of responsibilities have been shown to be the fundamental pillars of this methodology. Copyright © 2017 AEU. Published by Elsevier España, S.L.U. All rights reserved.

  14. Methodology of heat transfer and flow resistance measurement for matrices of rotating regenerative heat exchangers

    Directory of Open Access Journals (Sweden)

    Butrymowicz Dariusz

    2016-09-01

    Full Text Available The theoretical basis for the indirect measurement of the mean heat transfer coefficient of a packed bed, based on the modified single blow technique, is presented and discussed in the paper. The methodology of this measurement approach, dedicated to the matrix of a rotating regenerative gas heater, is discussed in detail. The test stand, consisting of a dedicated experimental tunnel with auxiliary equipment and a measurement system, is presented. Selected experimental results are presented and discussed for selected types of matrices of regenerative air preheaters over a wide range of gas Reynolds numbers. The agreement between the theoretically predicted and measured temperature profiles was demonstrated. Exemplary dimensionless relationships between the Colburn heat transfer factor, the Darcy flow resistance factor and the Reynolds number are presented for the investigated matrices of the regenerative gas heater.
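
    For orientation, the sketch below (Python) evaluates the dimensionless groups named in the abstract for one hypothetical operating point; all property values and measured quantities are assumptions, and the standard definitions j = St·Pr^(2/3) and the Darcy pressure-drop form are used rather than the paper's specific correlations.

    ```python
    # A minimal sketch of the dimensionless groups correlated against Re.
    # Assumed gas properties and measured quantities for one operating point.
    rho, mu, cp, Pr = 1.2, 1.8e-5, 1005.0, 0.71   # air near ambient
    v, d_h, L = 3.0, 0.004, 0.2    # velocity (m/s), hydraulic diameter, depth (m)
    h = 85.0                       # mean heat transfer coefficient, W/(m^2 K)
    dp = 450.0                     # measured pressure drop, Pa

    Re = rho * v * d_h / mu                   # Reynolds number
    St = h / (rho * v * cp)                   # Stanton number
    j = St * Pr ** (2.0 / 3.0)                # Colburn heat transfer factor
    f = dp * d_h / (L * 0.5 * rho * v ** 2)   # Darcy flow resistance factor

    print(f"Re = {Re:.0f}, j = {j:.4f}, f = {f:.3f}")
    ```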

  15. Radioactivity measurement of the liquid effluents of two university hospitals: methodology, problems arising

    International Nuclear Information System (INIS)

    Basse-Cathalinat, B.; Barthe, N.; Chatti, K.; Ducassou, D.

    2005-01-01

    The authors present the methodology used to measure the radioactivity of the effluents at the output of two nuclear medicine departments located in two hospital complexes of the Bordeaux area. These measurements are intended to meet the requirements of circular DGS/DHOS no 2001/323 of the Ministry for Employment and Solidarity. The selected method is powerful since it is based on a set of very low background noise spectrometry systems. These measurement devices make it possible to take into account all the isotopes coming from a nuclear medicine department. The authors are aware that such measurements cannot be envisaged in all nuclear medicine departments. Other technical articles will specify simpler methods allowing a satisfactory management of radioactive wastes. (author)

  16. Measurements of integrated components' parameters versus irradiation doses: gamma radiation (60Co) dosimetry, methodology, tests

    International Nuclear Information System (INIS)

    Fuan, J.

    1991-01-01

    This paper describes the methodology used for the irradiation of integrated components and the measurement of their parameters, with quality assurance of the dosimetry: measurement of the integrated dose drawing on the competences of the Laboratoire Central des Industries Electriques (LCIE); measurement of the irradiation dose versus source/component distance using calibrated equipment; use of ALANINE dosimeters placed on the support of the irradiated components; assembly and polarization of the components during the irradiations; and selection of the irradiator. The parameters of the irradiated components were measured using the competences of the following organisations: GenRad (GR130 test equipment located at DEIN/SIR, CEN Saclay) and the Laboratoire Central des Industries Electriques (LCIE) (GR125 test equipment and its associated test programmes). [fr]

  17. Can formal collaborative methodologies improve quality in primary health care in New Zealand? Insights from the EQUIPPED Auckland Collaborative.

    Science.gov (United States)

    Palmer, Celia; Bycroft, Janine; Healey, Kate; Field, Adrian; Ghafel, Mazin

    2012-12-01

    Auckland District Health Board was one of four District Health Boards to trial the Breakthrough Series (BTS) methodology to improve the management of long-term conditions in New Zealand, with support from the Ministry of Health. To improve clinical outcomes, facilitate planned care and promote quality improvement within participating practices in Auckland. Implementation of the Collaborative followed the improvement model / Institute for Healthcare Improvement methodology. Three topic areas were selected: system redesign, cardio-vascular disease/diabetes, and self-management support. An expert advisory group and the Improvement Foundation Australia helped guide project development and implementation. Primary Health Organisation facilitators were trained in the methodology and 15 practice teams participated in the three learning workshops and action periods over 12 months. An independent evaluation study using both quantitative and qualitative methods was conducted. Improvements were recorded in cardiovascular disease risk assessment, practice-level systems of care, self-management systems and follow-up and coordination for patients. Qualitative research found improvements in coordination and teamwork, knowledge of practice populations and understanding of managing long-term conditions. The Collaborative process delivered some real improvements in the systems of care for people with long-term conditions and a change in culture among participating practices. The findings suggest that by strengthening facilitation processes, improving access to comprehensive population audit tools and lengthening the time frame, the process has the potential to make significant improvements in practice. Other organisations should consider this approach when investigating quality improvement programmes.

  18. A Methodology for Measuring Microplastic Transport in Large or Medium Rivers

    Directory of Open Access Journals (Sweden)

    Marcel Liedermann

    2018-04-01

    Full Text Available Plastic waste as a persistent contaminant of our environment is a matter of increasing concern due to the largely unknown long-term effects on biota. Although freshwater systems are known to be the transport paths of plastic debris to the ocean, most research has been focused on marine environments. In recent years, freshwater studies have advanced rapidly, but they rarely address the spatial distribution of plastic debris in the water column. A methodology for measuring microplastic transport at various depths that is applicable to medium and large rivers is needed. We present a new methodology offering the possibility of measuring microplastic transport at different depths of verticals that are distributed within a profile. The net-based device is robust and can be applied at high flow velocities and discharges. Nets with different mesh sizes (41 µm, 250 µm, and 500 µm) are exposed at three different depths of the water column. The methodology was tested in the Austrian Danube River, showing a high heterogeneity of microplastic concentrations within one cross section. Due to turbulent mixing, the different densities of the polymers, aggregation, and the growth of biofilms, plastic transport is not limited to the surface layer of a river and must be examined within the whole water column, as for suspended sediments. These results imply that multipoint measurements are required for obtaining the spatial distribution of plastic concentration and are therefore a prerequisite for calculating the passing transport. The analysis of filtration efficiency and side-by-side measurements with different mesh sizes showed that 500 µm nets led to optimal results.
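
    To make the last point concrete, the sketch below (Python, with invented numbers rather than the Danube data) integrates multipoint concentration and velocity measurements over a cross section to estimate the passing transport, in the same way suspended-sediment fluxes are commonly computed.

    ```python
    # A minimal sketch of a cross-section microplastic transport estimate.
    import numpy as np

    # Microplastic concentration (particles/m^3) and flow velocity (m/s) at
    # 3 depths x 4 verticals; each sampling point represents a sub-area (m^2).
    conc = np.array([[4.2, 3.1, 2.8, 3.9],
                     [2.0, 1.6, 1.4, 1.9],
                     [1.1, 0.9, 0.8, 1.0]])
    vel  = np.array([[1.2, 1.4, 1.4, 1.1],
                     [1.0, 1.2, 1.2, 0.9],
                     [0.6, 0.8, 0.8, 0.5]])
    area = np.full_like(conc, 25.0)   # sub-area assigned to each point

    transport = np.sum(conc * vel * area)   # particles/s through the section
    print(f"passing transport: {transport:.0f} particles/s")
    ```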

  19. Using lean methodology to improve efficiency of electronic order set maintenance in the hospital.

    Science.gov (United States)

    Idemoto, Lori; Williams, Barbara; Blackmore, Craig

    2016-01-01

    Order sets, a series of orders focused around a diagnosis, condition, or treatment, can reinforce best practice, help eliminate outdated practice, and provide clinical guidance. However, order sets require regular updates as evidence and care processes change. We undertook a quality improvement intervention applying lean methodology to create a systematic process for order set review and maintenance. Root cause analysis revealed challenges with unclear prioritization of requests, lack of coordination between teams, and lack of communication between producers and requestors of order sets. In March of 2014, we implemented a systematic, cyclical order set review process, with a set schedule, defined responsibilities for various stakeholders, formal meetings and communication between stakeholders, and transparency of the process. We first identified and deactivated 89 order sets which were infrequently used. Between March and August 2014, 142 order sets went through the new review process. Processing time for the build duration of order sets decreased from a mean of 79.6 to 43.2 days, a statistically significant reduction. Applying Lean production principles to the order set review process resulted in significant improvement in processing time and increased quality of orders. As use of order sets and other forms of clinical decision support increases, regular evidence and process updates become more critical.

  20. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej; Perešíni, Peter; Kostić, Dejan; Canini, Marco

    2018-01-01

    and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a

  1. THE ARTHRITIS AND MUSCULOSKELETAL QUALITY IMPROVEMENT PROGRAM (AMQUIP): A BREAKTHROUGH SERIES METHODOLOGY PROJECT

    Directory of Open Access Journals (Sweden)

    MASTURA I

    2008-01-01

    Full Text Available The Australian government funded the National Primary Care Collaborative (NPCC) program with $14.6 million over three years. One of the pilot projects was the Arthritis and Musculoskeletal Quality Improvement Program (AMQuIP). The study aims to optimize general practitioners' (GPs) management of patients with osteoarthritis (OA) of the hip and knee by identifying gaps between their current practice and best practice. The Breakthrough Series Collaborative methodology with several Plan-Do-Study-Act (PDSA) cycles was employed. Participants comprised 12 GPs/practices from two Victorian Divisions of General Practice (one rural, one metropolitan), with 10 patients per GP/practice. GPs/practices attended an orientation and three learning workshops and a videoconference. GPs/practices completed PDSA cycles between workshops and reported results at workshops. GPs/practices reported use of guidelines, change in patient management and change in practice management/systems. All recruited patients completed the SF-12v2 Health Survey and WOMAC OA Index Questionnaire twice. Follow-up activities, including focus groups and face-to-face interviews, were held six months after the final workshop. All GPs/practices used the guidelines/key messages, introduced "new" management strategies to patients, and made positive changes to their practice management/systems. Patients reported positive changes and outcomes. By using a structured methodology and evidence-based guidelines/key messages, GPs can introduce new patient management strategies, and by identifying gaps in practice management systems, positive changes can be achieved.

  2. Improvement in decay ratio calculation in LAPUR5 methodology for BWR instability

    International Nuclear Information System (INIS)

    Li Hsuannien; Yang Tzungshiue; Shih Chunkuan; Wang Jongrong; Lin Haotzu

    2009-01-01

    LAPUR5, based on a frequency domain approach, is a computer code that analyzes core stability and calculates decay ratios (DRs) of boiling water nuclear reactors. In the current methodology, one set of parameters (three friction multipliers and one density reactivity coefficient multiplier) is chosen for the LAPUR5 input files, LAPURX and LAPURW. The calculation stops and the DR for this particular set of parameters is obtained when the convergence criteria (pressure, mass flow rate) are first met. However, there are other sets of parameters which could also meet the same convergence criteria without being identified. In order to cover these ranges of parameters, we developed an improved procedure to calculate the DR in LAPUR5. First, we define the ranges and increments of the dominant input parameters in the input files for the DR loop search. After LAPUR5 program execution, we can obtain the DRs for every set of parameters which satisfies the convergence criteria in one single operation. The loop search procedure covers the steps of preparing the LAPURX and LAPURW input files. As a demonstration, we looked into the reload design of Kuosheng Unit 2 Cycle 22. We found that the global DR has a maximum at an exposure of 9070 MWd/t and the regional DR has a maximum at an exposure of 5770 MWd/t. It should be noted that the regional DR turns out to be larger than the global one for exposures less than 5770 MWd/t. Furthermore, we see that either the global or the regional DR obtained by the loop search method is greater than the corresponding value from our previous approach. It is concluded that the loop search method can reduce human error and save human labor as compared with the previous version of the LAPUR5 methodology. The maximum DR can now be effectively obtained for given plant operating conditions, and a more precise stability boundary, with less uncertainty, can be plotted on the plant power/flow map. (author)
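
    A minimal sketch of the loop-search idea follows (Python). The `run_lapur5` wrapper is a hypothetical stand-in for executing the code with a given parameter set, not a real LAPUR5 interface; the point is the sweep-collect-maximize structure.

    ```python
    # Sweep the four multipliers, keep converged sets, report the maximum DR.
    import itertools

    def run_lapur5(f1, f2, f3, rho_mult):
        """Hypothetical wrapper: returns (converged: bool, decay_ratio)."""
        # Placeholder model so the sketch runs end to end.
        converged = abs(f1 + f2 + f3 - 3.0) < 0.5
        dr = 0.4 + 0.1 * f1 + 0.05 * f2 - 0.02 * f3 + 0.2 * rho_mult
        return converged, dr

    friction = [0.8, 1.0, 1.2]   # assumed ranges/increments of the three
    density  = [0.9, 1.0, 1.1]   # friction multipliers and the density
                                 # reactivity coefficient multiplier
    results = []
    for f1, f2, f3 in itertools.product(friction, repeat=3):
        for rho in density:
            ok, dr = run_lapur5(f1, f2, f3, rho)
            if ok:               # keep only converged parameter sets
                results.append(((f1, f2, f3, rho), dr))

    params, max_dr = max(results, key=lambda r: r[1])
    print("maximum DR", round(max_dr, 3), "at", params)
    ```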

  3. Improving the Acquisition of Research Methodology and Self-Regulated Learning through a Blog Project

    Directory of Open Access Journals (Sweden)

    Heri Retnawati

    2017-06-01

    Full Text Available Abstract: This classroom action research seeks to improve self-regulated learning (SRL) and understanding of research methodology at the graduate school level. Nineteen graduate students were involved. Using project-based learning (PjBL), students were assigned to create online blogs as the main project. The blog was intended to represent their understanding of research methodology through written reviews of research articles and the submission of a research proposal. The classroom action research was based on the model of Kemmis & McTaggart and was conducted in two cycles. The data were analyzed using mixed methods, in which the main data were analyzed qualitatively and supplemented by quantitative analysis. The results of the study showed that after completing the course, students not only gained knowledge about research methods but were also able to write a research proposal. In addition, project-based learning allowed students to practice their communication skills while writing on their blogs and improved self-regulated learning. Keywords: Action research, project-based learning, blog, self-regulated learning

  4. Agile Methodologies and Software Process Improvement Maturity Models, Current State of Practice in Small and Medium Enterprises

    OpenAIRE

    Koutsoumpos, Vasileios; Marinelarena, Iker

    2013-01-01

    Abstract—Background: Software Process Improvement (SPI) maturity models have been developed to assist organizations to enhance software quality. Agile methodologies are used to ensure productivity and quality of a software product. Among others, they are applied in Small and Medium-sized Enterprises (SMEs). However, little is known about the combination of Agile methodologies and SPI maturity models regarding SMEs and the results that could emerge, as all the current SPI models are address...

  5. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej

    2018-02-15

    Software-Defined Networking (SDN) and OpenFlow are actively being standardized and deployed. These deployments rely on switches that come from various vendors and differ in terms of performance and available features. Understanding these differences and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a stream of rule updates, while both observing the control plane view as reported by the switch and probing the data plane state, determining switch characteristics by comparing these views. We measure, report and explain the performance characteristics of flow table updates in six hardware OpenFlow switches. Our results describing rule update rates can help SDN designers make their controllers efficient. Further, we also highlight differences between the OpenFlow specification and its implementations that, if ignored, pose a serious threat to network security and correctness.
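
    The sketch below illustrates the two-view timing idea in Python. `Switch` is a hypothetical test-harness stand-in, not a real OpenFlow library; the point is comparing the control-plane acknowledgement time against the moment the data plane actually forwards matching traffic.

    ```python
    # A minimal sketch of comparing control-plane and data-plane views.
    import time

    class Switch:
        """Hypothetical harness around one hardware switch under test."""
        def install_rule(self, rule):        # e.g., send flow_mod + barrier
            time.sleep(0.001)                # placeholder: control-plane ack
        def probe_data_plane(self, rule):    # send a probe matching the rule
            return True                      # placeholder: True when forwarded

    def measure_update(switch, rule, timeout=1.0):
        t0 = time.monotonic()
        switch.install_rule(rule)
        t_ctrl = time.monotonic() - t0       # control-plane view: ack latency
        while not switch.probe_data_plane(rule):
            if time.monotonic() - t0 > timeout:
                raise TimeoutError("rule never appeared in the data plane")
        t_data = time.monotonic() - t0       # data-plane view: rule in effect
        return t_ctrl, t_data

    t_ctrl, t_data = measure_update(Switch(), rule={"match": "10.0.0.1"})
    print(f"ack after {t_ctrl*1e3:.2f} ms, forwarding after {t_data*1e3:.2f} ms")
    ```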

  6. A general centroid determination methodology, with application to multilayer dielectric structures and thermally stimulated current measurements

    International Nuclear Information System (INIS)

    Miller, S.L.; Fleetwood, D.M.; McWhorter, P.J.; Reber, R.A. Jr.; Murray, J.R.

    1993-01-01

    A general methodology is developed to experimentally characterize the spatial distribution of occupied traps in dielectric films on a semiconductor. The effects of parasitics such as leakage, charge transport through more than one interface, and interface trap charge are quantitatively addressed. Charge transport with contributions from multiple charge species is rigorously treated. The methodology is independent of the charge transport mechanism(s), and is directly applicable to multilayer dielectric structures. The centroid capacitance, rather than the centroid itself, is introduced as the fundamental quantity that permits the generic analysis of multilayer structures. In particular, the form of many equations describing stacked dielectric structures becomes independent of the number of layers comprising the stack if they are expressed in terms of the centroid capacitance and/or the flatband voltage. The experimental methodology is illustrated with an application using thermally stimulated current (TSC) measurements. The centroid of changes (via thermal emission) in the amount of trapped charge was determined for two different samples of a triple-layer dielectric structure. A direct consequence of the TSC analyses is the rigorous proof that changes in interface trap charge can contribute, though typically not significantly, to thermally stimulated current.

  7. Tolerance limits and methodologies for IMRT measurement-based verification QA: Recommendations of AAPM Task Group No. 218.

    Science.gov (United States)

    Miften, Moyed; Olch, Arthur; Mihailidis, Dimitris; Moran, Jean; Pawlicki, Todd; Molineu, Andrea; Li, Harold; Wijesooriya, Krishni; Shi, Jie; Xia, Ping; Papanikolaou, Nikos; Low, Daniel A

    2018-04-01

    Patient-specific IMRT QA measurements are important components of processes designed to identify discrepancies between calculated and delivered radiation doses. Discrepancy tolerance limits are neither well defined nor consistently applied across centers. The AAPM TG-218 report provides a comprehensive review aimed at improving the understanding and consistency of these processes as well as recommendations for methodologies and tolerance limits in patient-specific IMRT QA. The performance of the dose difference/distance-to-agreement (DTA) and γ dose distribution comparison metrics is investigated. Measurement methods are reviewed, followed by a discussion of the pros and cons of each. Methodologies for absolute dose verification are discussed and new IMRT QA verification tools are presented. Literature on the expected or achievable agreement between measurements and calculations for different types of planning and delivery systems is reviewed and analyzed. Tests of vendor implementations of the γ verification algorithm employing benchmark cases are presented. Operational shortcomings that can reduce the γ tool accuracy and subsequent effectiveness for IMRT QA are described. Practical considerations including spatial resolution, normalization, dose threshold, and data interpretation are discussed. Published data on IMRT QA and the clinical experience of the group members are used to develop guidelines and recommendations on tolerance and action limits for IMRT QA. Steps to check failed IMRT QA plans are outlined. Recommendations on delivery methods, data interpretation, dose normalization, the use of γ analysis routines and choice of tolerance limits for IMRT QA are made with a focus on detecting differences between calculated and measured doses via the use of robust analysis methods and an in-depth understanding of IMRT verification metrics. The recommendations are intended to improve the IMRT QA process and establish consistent and comparable IMRT QA criteria across centers.
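
    As a toy illustration of the γ metric at the heart of these recommendations, the sketch below (Python) computes a global 1-D γ between a "measured" and a "calculated" profile with common 3%/3 mm criteria; clinical QA uses 2-D/3-D data and validated tools, and the profiles here are invented.

    ```python
    # A minimal 1-D global gamma comparison sketch.
    import numpy as np

    def gamma_1d(ref, ref_x, ev, ev_x, dd=0.03, dta=3.0):
        """Global gamma: dd = dose tolerance (fraction of max), dta in mm."""
        dmax = ref.max()
        gammas = []
        for r, rx in zip(ref, ref_x):
            # Gamma is the minimum combined dose/distance discrepancy over
            # all evaluated points.
            g2 = ((ev - r) / (dd * dmax)) ** 2 + ((ev_x - rx) / dta) ** 2
            gammas.append(np.sqrt(g2.min()))
        return np.array(gammas)

    x = np.linspace(-10, 10, 201)                     # positions in mm
    reference = np.exp(-x**2 / 30.0)                  # "measured" profile
    evaluated = np.exp(-(x - 0.5)**2 / 30.0) * 1.01   # "calculated", shifted

    g = gamma_1d(reference, x, evaluated, x)
    print(f"gamma pass rate (gamma <= 1): {100 * np.mean(g <= 1):.1f}%")
    ```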

  8. Environmental and climate security: improving scenario methodologies for science and risk assessment

    Science.gov (United States)

    Briggs, C. M.; Carlsen, H.

    2010-12-01

    Governments and popular discussions have increasingly referred to concepts of 'climate security', often with reference to IPCC data. Development of effective methodologies to translate complex scientific data into risk assessments has lagged, resulting in overly simplistic political assumptions of potential impacts. Climate security scenarios have been developed for use by security and military agencies, but effective engagement by scientific communities requires an improved framework. Effective use of data requires improvement both of climate projections and of the mapping of cascading impacts across interlinked, complex systems. In this research we propose a process for the systematic generation of subsets of scenarios (of arbitrary size) from a given set of variables with possible interlinkages. The variables could include climatic changes as well as other global changes of concern in a security context. In coping with possible challenges associated with the nexus of climate change and security - where deep structural uncertainty and possible irreversible changes are of primary interest - it is important to explore the outer limits of the relevant uncertainties. Therefore the proposed process includes a novel method that will help scenario developers generate scenario sets in which the scenarios are, in a quantifiable sense, maximally different and therefore best 'span' the whole set of scenarios. When downscaled to a regional level, this process can provide guidance on potentially significant and abrupt geophysical changes, where high uncertainty has often prevented communication of risks. Potential physical changes can then be used as starting points for mapping cascading effects across networks, including topological analysis to identify critically vulnerable nodes and fragile systems, the existence of positive or negative feedback loops, and possible intervention points. Advanced knowledge of both potential geo-physical shifts and related non
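
    One simple way to realize a "maximally different" scenario subset, shown purely as an illustration and not as the authors' algorithm, is a greedy farthest-point selection over the scenario variable space:

    ```python
    # A minimal greedy max-min diversity selection sketch.
    import numpy as np

    rng = np.random.default_rng(42)
    scenarios = rng.uniform(size=(50, 4))   # 50 candidates, 4 variables

    def select_diverse(points, k):
        chosen = [0]                        # start from any scenario
        for _ in range(k - 1):
            # Distance of every candidate to its nearest already-chosen point.
            d = np.min(np.linalg.norm(points[:, None] - points[chosen],
                                      axis=2), axis=1)
            chosen.append(int(np.argmax(d)))   # pick the farthest candidate
        return chosen

    print("indices of 5 maximally spread scenarios:",
          select_diverse(scenarios, 5))
    ```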

  9. Improvement in the incident reporting and investigation procedures using process excellence (DMAI2C) methodology

    International Nuclear Information System (INIS)

    Miles, Elizabeth N.

    2006-01-01

    In 1996, Health and Safety introduced an incident investigation process called Learning to Look© to Johnson and Johnson. This process provides a systematic way of analyzing work-related injuries and illness, uncovers root causes that lead to system defects, and points to viable solutions. The process involves three steps: investigation and reporting of the incident, determination of root cause, and development and implementation of a corrective action plan. The process requires the investigators to provide an initial communication for work-related serious injuries and illnesses as well as lost workday cases to Corporate Headquarters within 72 h of the incident, with a full investigative report to follow within 10 days. A full investigation requires a written report, a cause-result logic diagram (CRLD), a corrective action plan (CAP) and a report of incident costs (SafeCost), all due to be filed electronically. It is incumbent on the principal investigator and his or her investigative teams to assemble the various parts of the investigation and to follow up with the relevant parties to ensure corrective actions are implemented and a full report is submitted to Corporate executives. Initial review of the system revealed that the process was not working as designed. A number of reports were late, not signed by the business leaders, and in some instances, all causes were not identified. Process excellence was the approach used to study the issue. The team used Six Sigma DMAI2C methodologies to identify and implement system improvements. The project examined the breakdown of the critical aspects of the reporting and investigation process that lead to system errors. This report discusses the study findings, recommended improvements, and methods used to monitor the new improved process.

  10. A Measurement Approach for Process Improvement | Woherem ...

    African Journals Online (AJOL)

    engineering project. They do so through planned rearchitecting of their processes to engender higher magnitudes of improvements. However, in almost all the organisations undertaking process re-engineering, they do so blindly without any means ...

  11. Three-dimensional sensing methodology combining stereo vision and phase-measuring profilometry based on dynamic programming

    Science.gov (United States)

    Lee, Hyunki; Kim, Min Young; Moon, Jeon Il

    2017-12-01

    Phase measuring profilometry and moiré methodology have been widely applied to the three-dimensional shape measurement of target objects because of their high measuring speed and accuracy. However, these methods suffer from an inherent limitation known as the correspondence problem, or the 2π-ambiguity problem. Although a sensing method combining well-known stereo vision and the phase measuring profilometry (PMP) technique has been developed to overcome this problem, it still requires definite improvement in sensing speed and measurement accuracy. We propose a dynamic programming-based stereo PMP method to acquire more reliable depth information in a relatively short time period. The proposed method efficiently fuses information from two stereo sensors in terms of phase and intensity simultaneously, based on a newly defined cost function for dynamic programming. In addition, the important parameters are analyzed from the viewpoint of the 2π-ambiguity problem and measurement accuracy. To analyze the influence of important hardware and software parameters related to the measurement performance and to verify its efficiency, accuracy, and sensing speed, a series of experimental tests were performed with various objects and sensor configurations.
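
    The sketch below gives a rough flavor of such a fused cost in a scanline dynamic-programming match (Python); the weighting of phase against intensity, the occlusion penalty and the synthetic signals are all assumptions, not the paper's cost function.

    ```python
    # A minimal scanline DP stereo match with an intensity+phase cost.
    import numpy as np

    def dp_match(int_l, ph_l, int_r, ph_r, w_phase=2.0, occlusion=0.5):
        n, m = len(int_l), len(int_r)
        # Pairwise match cost fusing intensity and phase differences.
        cost = (np.abs(int_l[:, None] - int_r[None, :])
                + w_phase * np.abs(ph_l[:, None] - ph_r[None, :]))
        D = np.full((n + 1, m + 1), np.inf)
        D[0, :] = occlusion * np.arange(m + 1)   # skipping right pixels
        D[:, 0] = occlusion * np.arange(n + 1)   # skipping left pixels
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                D[i, j] = min(D[i-1, j-1] + cost[i-1, j-1],  # match
                              D[i-1, j] + occlusion,         # occlusion (R)
                              D[i, j-1] + occlusion)         # occlusion (L)
        return D[n, m]   # total cost; backtracking would yield disparities

    x = np.linspace(0, 1, 64)
    left_i, left_p = np.sin(6 * x), (10 * x) % 1.0
    right_i, right_p = np.sin(6 * (x - 0.02)), (10 * (x - 0.02)) % 1.0
    print("scanline matching cost:",
          round(dp_match(left_i, left_p, right_i, right_p), 3))
    ```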

  12. A new transmission methodology for quality assurance in radiotherapy based on radiochromic film measurements

    Science.gov (United States)

    do Amaral, Leonardo L.; Pavoni, Juliana F.; Sampaio, Francisco; Netto, Thomaz Ghilardi

    2015-01-01

    Despite individual quality assurance (QA) being recommended for complex techniques in radiotherapy (RT) treatment, the possibility of errors in dose delivery during therapeutic application has been verified. Therefore, it is fundamentally important to conduct in vivo QA during treatment. This work presents an in vivo transmission quality control methodology, using radiochromic film (RCF) coupled to the linear accelerator (linac) accessory holder. This QA methodology compares the dose distribution measured by the film in the linac accessory holder with the dose distribution expected by the treatment planning software. The calculated dose distribution is obtained in the coronal and central plane of a phantom with the same dimensions as the acrylic support used for positioning the film but at a source-to-detector distance (SDD) of 100 cm, as a result of transferring the IMRT plan in question with all the fields positioned with the gantry vertical, that is, perpendicular to the phantom. To validate this procedure, first of all a Monte Carlo simulation using the PENELOPE code was done to evaluate the differences between the dose distributions measured by the film at SDDs of 56.8 cm and 100 cm. After that, several simple dose distribution tests were evaluated using the proposed methodology, and finally a study using IMRT treatments was done. In the Monte Carlo simulation, the mean percentage of points approved in the gamma function comparing the dose distributions acquired at the two SDDs was 99.92% ± 0.14%. In the simple dose distribution tests, the mean percentage of points approved in the gamma function was 99.85% ± 0.26% and the mean percentage difference in the normalization point doses was −1.41%. The transmission methodology was approved in 24 of 25 IMRT test irradiations. Based on these results, it can be concluded that the proposed methodology using RCFs can be applied for in vivo QA in RT treatments. PACS number: 87.55.Qr, 87.55.km, 87.55.N-

  13. A methodology for performing virtual measurements in a nuclear reactor system

    International Nuclear Information System (INIS)

    Ikonomopoulos, A.; Uhrig, R.E.; Tsoukalas, L.H.

    1992-01-01

    A novel methodology is presented for monitoring nonphysically measurable variables in an experimental nuclear reactor. It is based on the employment of artificial neural networks to generate fuzzy values. Neural networks map spatiotemporal information (in the form of time series) to algebraically defined membership functions. The entire process can be thought of as a virtual measurement. Through such virtual measurements the values of nondirectly monitored parameters with operational significance, e.g., transient type, valve position, or performance, can be determined. Generating membership functions is a crucial step in the development and practical utilization of fuzzy reasoning, a computational approach that offers the advantage of describing the state of the system in a condensed, linguistic form, convenient for monitoring, diagnostics, and control algorithms.
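
    A minimal sketch of this mapping, assuming a toy network and invented linguistic states (the reactor application in the paper is far richer): a regressor maps a window of time-series samples to membership degrees.

    ```python
    # A minimal "virtual measurement" sketch: time series -> membership degrees.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(7)
    # Synthetic training set: 200 windows of 16 sensor samples each, labelled
    # with membership degrees in three linguistic states of a hypothetical
    # "transient type" variable (rows sum to 1 by construction here).
    X = rng.normal(size=(200, 16))
    slope = X[:, -1] - X[:, 0]                           # crude trend feature
    labels = np.stack([np.clip(-slope, 0, 1),            # "decreasing"
                       np.clip(1 - np.abs(slope), 0, 1), # "steady"
                       np.clip(slope, 0, 1)], axis=1)

    net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    net.fit(X, labels)

    window = rng.normal(size=(1, 16))
    mu = np.clip(net.predict(window)[0], 0, 1)           # membership degrees
    print(dict(zip(["decreasing", "steady", "increasing"], mu.round(2))))
    ```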

  14. Dielectric Barrier Discharge (DBD) Plasma Actuators Thrust-Measurement Methodology Incorporating New Anti-Thrust Hypothesis

    Science.gov (United States)

    Ashpis, David E.; Laun, Matthew C.

    2014-01-01

    We discuss thrust measurements of Dielectric Barrier Discharge (DBD) plasma actuator devices used for aerodynamic active flow control. After a review of our experience with conventional thrust measurement and significant non-repeatability of the results, we devised a suspended actuator test setup, and now present a methodology of thrust measurements with decreased uncertainty. The methodology consists of frequency scans at constant voltages. The procedure consists of increasing the frequency in a step-wise fashion from several Hz to the maximum frequency of several kHz, followed by a frequency decrease back down to the start frequency of several Hz. This sequence is performed first at the highest voltage of interest, then repeated at lower voltages. The data in the descending frequency direction are more consistent and were selected for reporting. Sample results show a strong dependence of thrust on humidity, which also affects the consistency and fluctuations of the measurements. We also observed negative values of thrust, or "anti-thrust", at low frequencies between 4 Hz and 64 Hz. The anti-thrust is proportional to the mean-squared voltage and is frequency independent. Departures from the parabolic anti-thrust curve are correlated with the appearance of visible plasma discharges. We propose the anti-thrust hypothesis. It states that the measured thrust is a sum of plasma thrust and anti-thrust, and assumes that the anti-thrust exists at all frequencies and voltages. The anti-thrust depends on actuator geometry and materials and on the test installation. It enables the separation of the plasma thrust from the measured total thrust. This approach enables more meaningful comparisons between actuators at different installations and laboratories. The dependence on test installation was validated by surrounding the actuator with a large diameter, grounded, metal sleeve.
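
    To make the hypothesis operational, the sketch below (Python, invented numbers) models anti-thrust as a frequency-independent term proportional to the mean-squared voltage, estimates its coefficient from low-frequency data where plasma thrust is negligible, and subtracts it to recover plasma thrust.

    ```python
    # A minimal sketch of separating plasma thrust from measured thrust.
    import numpy as np

    def anti_thrust(v_rms, k=-2.0e-3):
        """Assumed model: anti-thrust proportional to mean-squared voltage.
        k (mN/kV^2) is chosen to match the low-frequency plateau below."""
        return k * v_rms ** 2

    # Hypothetical measurements at fixed voltage, ascending frequency.
    v_rms = 10.0                                     # kV
    freq = np.array([8, 64, 500, 1000, 2000, 4000])  # Hz
    measured = np.array([-0.20, -0.19, 0.35, 0.90, 1.70, 3.10])  # mN

    plasma_thrust = measured - anti_thrust(v_rms)    # subtract anti-thrust
    for f, t in zip(freq, plasma_thrust):
        print(f"{f:5d} Hz: plasma thrust = {t:5.2f} mN")
    ```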

  15. Improving Measurement of Workplace Sexual Identity Management

    Science.gov (United States)

    Lance, Teresa S.; Anderson, Mary Z.; Croteau, James M.

    2010-01-01

    The purpose of this study was to advance measurement of sexual identity management for lesbian, gay, and bisexual workers. Psychometric properties of a revised version of the Workplace Sexual Identity Management Measure (WSIMM; Anderson, Croteau, Chung, & DiStefano, 2001) were examined on a sample of 64 predominantly White K-12 teachers.…

  16. The impact of multi-criteria performance measurement on business performance improvement

    OpenAIRE

    Kasie, Fentahun Moges; Belay, Alemu Moges

    2013-01-01

    Purpose: The purpose of this paper is to investigate the relationship between multi-criteria performance measurement (MCPM) practice and business performance improvement using the raw data collected from 33 selected manufacturing companies. In addition, it proposes a modified MCPM model as an effective approach to improve the business performance of manufacturing companies. Design/methodology/approach: Research paper. Primary and secondary data were collected using questionnaire survey, interview an...

  17. Methodological issues in systematic reviews of headache trials: adapting historical diagnostic classifications and outcome measures to present-day standards.

    Science.gov (United States)

    McCrory, Douglas C; Gray, Rebecca N; Tfelt-Hansen, Peer; Steiner, Timothy J; Taylor, Frederick R

    2005-05-01

    Recent efforts to make headache diagnostic classification and clinical trial methodology more consistent provide valuable advice to trialists generating new evidence on effectiveness of treatments for headache; however, interpreting older trials that do not conform to new standards remains problematic. Systematic reviewers seeking to utilize historical data can adapt currently recommended diagnostic classification and clinical trial methodological approaches to interpret all available data relative to current standards. In evaluating study populations, systematic reviewers can: (i) use available data to attempt to map study populations to diagnoses in the new International Classification of Headache Disorders; and (ii) stratify analyses based on the extent to which study populations are precisely specified. In evaluating outcome measures, systematic reviewers can: (i) summarize prevention studies using headache frequency, incorporating headache index in a stratified analysis if headache frequency is not available; (ii) summarize acute treatment studies using pain-free response as reported in directly measured headache improvement or headache severity outcomes; and (iii) avoid analysis of recurrence or relapse data not conforming to the sustained pain-free response definition.

  18. Comparing Classic and Interval Analytical Hierarchy Process Methodologies for Measuring Area-Level Deprivation to Analyze Health Inequalities.

    Science.gov (United States)

    Cabrera-Barona, Pablo; Ghorbanzadeh, Omid

    2018-01-16

    Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas.
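
    The classic-AHP weighting used by such indices reduces a reciprocal pairwise-comparison matrix to priority weights via its principal eigenvector (matrix below is hypothetical; the paper's interval-AHP variant is not reproduced):

        import numpy as np

        # Hypothetical 3-indicator comparison matrix on Saaty's 1-9 scale.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()                                  # normalized priority weights
        ci = (vals.real[k] - len(A)) / (len(A) - 1)   # consistency index
        print("weights:", w.round(3), " CI:", round(ci, 4))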

  19. Employing response surface methodology (RSM) to improve methane production from cotton stalk.

    Science.gov (United States)

    Zhang, Han; Khalid, Habiba; Li, Wanwu; He, Yanfeng; Liu, Guangqing; Chen, Chang

    2018-03-01

    China is the largest cotton producer, with cotton output accounting for 25% of the world's total cotton production. A large quantity of cotton stalk (CS) waste is generated, which is burned and causes environmental and ecological problems. This study investigated the anaerobic digestibility of CS, focusing on improving the methane yield by applying the central composite design of response surface methodology (RSM). The purpose of this study was to determine the best level of factors to optimize the desired output of methane production from CS. Thus, it was necessary to describe the relationship of many individual variables with one or more response values for the effective utilization of CS. The influences of the feed-to-inoculum (F/I) ratio and organic loading (OL) on methane production were investigated. Results showed that the experimental methane yield (EMY) and volatile solids (VS) removal were 70.22 mL/gVS and 14.33%, respectively, at an F/I ratio of 0.79 and an organic loading of 25.61 gVS/L. Characteristics of the final effluent showed that the anaerobic system was stable. This research lays a foundation for the future application of CS to alleviate waste pollution and improve energy output.
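
    The RSM workflow can be sketched as fitting a full quadratic in the two factors and solving for the stationary point (data below are invented placeholders, not the study's central-composite runs):

        import numpy as np

        x1 = np.array([0.4, 0.4, 1.2, 1.2, 0.8, 0.8, 0.23, 1.37, 0.8])   # F/I ratio
        x2 = np.array([15., 35., 15., 35., 25., 25., 25.0, 25.0, 11.0])  # OL, gVS/L
        y  = np.array([52., 60., 55., 58., 70., 69., 50.0, 54.0, 49.0])  # mL CH4/gVS

        # Full quadratic model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
        b = np.linalg.lstsq(X, y, rcond=None)[0]

        # Stationary point: set both partial derivatives to zero and solve.
        H = np.array([[2*b[4], b[3]], [b[3], 2*b[5]]])
        xs = np.linalg.solve(H, -b[1:3])
        print("stationary point (F/I, OL):", xs.round(2))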

  20. Transboundary Water: Improving Methodologies and Developing Integrated Tools to Support Water Security

    Science.gov (United States)

    Hakimdavar, Raha; Wood, Danielle; Eylander, John; Peters-Lidard, Christa; Smith, Jane; Doorn, Brad; Green, David; Hummel, Corey; Moore, Thomas C.

    2018-01-01

    River basins for which transboundary coordination and governance is a factor are of concern to US national security, yet there is often a lack of sufficient data-driven information available at the needed time horizons to inform transboundary water decision-making for the intelligence, defense, and foreign policy communities. To address this need, a two-day workshop entitled Transboundary Water: Improving Methodologies and Developing Integrated Tools to Support Global Water Security was held in August 2017 in Maryland. The committee that organized and convened the workshop (the Organizing Committee) included representatives from the National Aeronautics and Space Administration (NASA), the US Army Corps of Engineers Engineer Research and Development Center (ERDC), and the US Air Force. The primary goal of the workshop was to advance knowledge on the current US Government and partners' technical information needs and gaps to support national security interests in relation to transboundary water. The workshop also aimed to identify avenues for greater communication and collaboration among the scientific, intelligence, defense, and foreign policy communities. The discussion around transboundary water was considered in the context of the greater global water challenges facing US national security.

  1. Selecting Health Care Improvement Projects: A Methodology Integrating Cause-and-Effect Diagram and Analytical Hierarchy Process.

    Science.gov (United States)

    Testik, Özlem Müge; Shaygan, Amir; Dasdemir, Erdi; Soydan, Guray

    It is often vital to identify, prioritize, and select quality improvement projects in a hospital, yet a methodology that utilizes experts' opinions with different points of view is needed for better decision making. The proposed methodology utilizes the cause-and-effect diagram to identify improvement projects and construct a project hierarchy for a problem. The right improvement projects are then prioritized and selected using a weighting scheme of the analytical hierarchy process that aggregates experts' opinions. An approach for collecting data from experts and a graphical display for summarizing the obtained information are also provided. The methodology is implemented for improving a hospital appointment system. The 2 top-ranked major project categories for improvement were identified to be system- and accessibility-related causes (45%) and capacity-related causes (28%), respectively. For each major project category, subprojects were then ranked for selecting the improvement needs. The methodology is useful in cases where an aggregate decision based on experts' opinions is expected. Some suggestions for practical implementations are provided.
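
    One way to aggregate several experts' pairwise judgments before the AHP weighting is an element-wise geometric mean, which keeps the matrix reciprocal (matrices below are hypothetical, not the hospital study's data):

        import numpy as np

        expert_mats = np.array([
            [[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]],
            [[1, 2, 4], [1/2, 1, 3], [1/4, 1/3, 1]],
            [[1, 4, 6], [1/4, 1, 2], [1/6, 1/2, 1]],
        ])
        G = np.exp(np.log(expert_mats).mean(axis=0))   # element-wise geometric mean

        vals, vecs = np.linalg.eig(G)                  # principal eigenvector -> weights
        w = np.abs(vecs[:, np.argmax(vals.real)].real)
        w /= w.sum()
        print("consensus priority weights:", w.round(3))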

  2. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality assumption have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using an econometric analysis, the Granger causality test, on 45 data points. However, well-established causality models were found to be insufficiently supported, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for situations where historical data are insufficient. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts, using 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Evidence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality, and its robustness and validity as a platform for causality analysis. This study applied a theoretical service management model within the BSC domain to a practical situation using SD methodology, an area where very limited work has been done.
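
    A Granger-causality test of one BSC measure on another can be run with statsmodels (synthetic series below; the study's 45 data points are not available). Column order matters: the test asks whether the second column Granger-causes the first:

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(1)
        n = 200
        driver = rng.normal(size=n)            # e.g. employee satisfaction
        response = np.zeros(n)                 # e.g. customer satisfaction
        for t in range(1, n):
            response[t] = 0.6 * driver[t - 1] + 0.2 * response[t - 1] + rng.normal()

        data = np.column_stack([response, driver])
        grangercausalitytests(data, maxlag=2)  # prints F-tests for each lag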

  3. Limitations and improvements for harmonic generation measurements

    International Nuclear Information System (INIS)

    Best, Steven; Croxford, Anthony; Neild, Simon

    2014-01-01

    A typical acoustic harmonic generation measurement comes with certain limitations. Firstly, the plane-wave-based analysis used to extract the nonlinear parameter, β, ignores the effects of diffraction, attenuation and receiver averaging, which are common to most experiments and may therefore limit the accuracy of a measurement. Secondly, the method usually requires data obtained from a through-transmission type setup, which may not be practical in a field measurement scenario where access to the component is limited. Thirdly, the technique lacks a means of pinpointing areas of damage in a component, as the measured nonlinearity represents an average over the length of signal propagation. Here we describe a three-dimensional model of harmonic generation in a sound beam, which is intended to provide a more realistic representation of a typical experiment. The presence of a reflecting boundary is then incorporated into the model to assess the feasibility of performing single-sided measurements. Experimental validation is provided where possible. Finally, a focusing acoustic source is modelled to provide a theoretical indication of the advantages afforded when the nonlinearity is localized.
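
    Under the plane-wave assumption the record mentions, β follows from the fundamental and second-harmonic amplitudes as beta = 8*A2 / (k^2 * x * A1^2); a sketch with hypothetical values:

        import numpy as np

        f  = 5.0e6     # fundamental frequency, Hz
        c  = 6300.0    # longitudinal wave speed, m/s (aluminium-like)
        x  = 0.04      # propagation distance, m
        A1 = 2.0e-9    # fundamental displacement amplitude, m
        A2 = 2.0e-12   # second-harmonic displacement amplitude, m

        k = 2 * np.pi * f / c          # wavenumber, rad/m
        beta = 8 * A2 / (k**2 * x * A1**2)
        print(f"beta = {beta:.2f}")    # ~4 for these made-up amplitudes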

  4. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    International Nuclear Information System (INIS)

    2013-01-01

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results

  6. Improving the Quality of Experience Journals: Training Educational Psychology Students in Basic Qualitative Methodology

    Science.gov (United States)

    Reynolds-Keefer, Laura

    2010-01-01

    This study evaluates the impact of teaching basic qualitative methodology to preservice teachers enrolled in an educational psychology course on the quality of their observation journals. Preservice teachers enrolled in an educational psychology course requiring 45 hr of field experience were given qualitative methodological training as a part of the…

  7. Methodological challenges surrounding direct-to-consumer advertising research--the measurement conundrum.

    Science.gov (United States)

    Hansen, Richard A; Droege, Marcus

    2005-06-01

    Numerous studies have focused on the impact of direct-to-consumer (DTC) prescription drug advertising on consumer behavior and health outcomes. These studies have used various approaches to assess exposure to prescription drug advertising and to measure the subsequent effects of such advertisements. The objectives of this article are to (1) discuss measurement challenges involved in DTC advertising research, (2) summarize measurement approaches commonly identified in the literature, and (3) discuss contamination, time to action, and endogeneity as specific problems in measurement design and application. We conducted a review of the professional literature to identify illustrative approaches to advertising measurement. Specifically, our review of the literature focused on measurement of DTC advertising exposure and effect. We used the hierarchy-of-effects model to guide our discussion of processing and communication effects. Other effects were characterized as target audience action, sales, market share, and profit. Overall, existing studies have used a variety of approaches to measure advertising exposure and effect, yet the ability of measures to produce a valid and reliable understanding of the effects of DTC advertising can be improved. Our review provides a framework for conceptualizing DTC measurement, and can be used to identify gaps in the literature not sufficiently addressed by existing measures. Researchers should continue to explore correlations between exposure and effect of DTC advertising, but are obliged to improve and validate measurement in this area.

  8. Energy upgrading measures also improve indoor climate

    DEFF Research Database (Denmark)

    Foldbjerg, Peter; Knudsen, Henrik Nellemose

    2014-01-01

    A new survey shows that the economy is what motivates Danish owners of single-family houses the most to start energy upgrading, and that improved indoor climate is also an important factor. After the upgrading, homeowners experience both improved economy and indoor climate. In a strategy to increase the number of homeowners who venture into a major energy upgrading of their house, the demonstrated positive side effects, more than energy savings, should be included in the communication to motivate homeowners. The barriers should be reduced by “taking the homeowners by the hand” and helping them to choose relevant energy-saving solutions as well as clarifying the financial consequences and opportunities.

  9. Improving Outcome Measures Other Than Achievement

    Directory of Open Access Journals (Sweden)

    Kristin Anderson Moore

    2015-05-01

    Research indicates that educational, economic, and life success reflect children’s nonacademic as well as academic competencies. Therefore, longitudinal surveys that assess educational progress and success need to incorporate nonacademic measures to avoid omitted variable bias, inform development of new intervention strategies, and support mediating and moderating analyses. Based on a life course model and a whole child perspective, this article suggests constructs in the domains of child health, emotional/psychological development, educational achievement/attainment, social behavior, and social relationships. Four critical constructs are highlighted: self-regulation, agency/motivation, persistence/diligence, and executive functioning. Other constructs that are currently measured need to be retained, including social skills, positive relationships, activities, positive behaviors, academic self-efficacy, educational engagement, and internalizing/emotional well-being. Examples of measures that are substantively and psychometrically robust are provided.

  10. POSSIBILITY OF IMPROVING EXISTING STANDARDS AND METHODOLOGIES FOR AUDITING INFORMATION SYSTEMS TO PROVIDE E-GOVERNMENT SERVICES

    Directory of Open Access Journals (Sweden)

    Евгений Геннадьевич Панкратов

    2014-03-01

    This article analyzes the existing methods of auditing e-government systems and examines their shortcomings. Approaches to improve existing techniques and adapt them to the specific characteristics of e-government systems are suggested. The paper describes a methodology providing for integrated assessment of information systems. This methodology uses systems maturity models and can be used in the construction of e-government rankings, as well as in the audit of their implementation process. The maturity models are based on the COBIT and COSO methodologies and on the models of e-government developed by the relevant committee of the UN. The methodology was tested during the audit of information systems involved in the payment of temporary disability benefits. The audit was carried out during analysis of the outcome of a pilot project for the abolition of the principle of crediting payments for disability benefits. DOI: http://dx.doi.org/10.12731/2218-7405-2014-2-5

  11. An ultrasonic methodology for muscle cross section measurement in support of space flight

    Science.gov (United States)

    Hatfield, Thomas R.; Klaus, David M.; Simske, Steven J.

    2004-09-01

    The number one priority for any manned space mission is the health and safety of its crew. The study of the short and long term physiological effects on humans is paramount to ensuring crew health and mission success. One of the challenges associated with studying the physiological effects of space flight on humans, such as loss of bone and muscle mass, has been that of readily attaining the data needed to characterize the changes. The small sampling size of astronauts, together with the fact that most physiological data collection tends to be rather tedious, continues to hinder elucidation of the underlying mechanisms responsible for the observed changes that occur in space. Better characterization of the muscle loss experienced by astronauts requires that new technologies be implemented. To this end, we have begun to validate a 360° ultrasonic scanning methodology for muscle measurements and have performed empirical sampling of a limb surrogate for comparison. Ultrasonic wave propagation was simulated using 144 stations of rotated arm and calf MRI images. These simulations were intended to provide a preliminary check of the scanning methodology and data analysis before its implementation with hardware. Pulse-echo waveforms were processed for each rotation station to characterize fat, muscle, bone, and limb boundary interfaces. The percentage error between MRI reference values and calculated muscle areas, as determined from reflection points for calf and arm cross sections, was -2.179% and +2.129%, respectively. These successful simulations suggest that ultrasound pulse scanning can be used to effectively determine limb cross-sectional areas. Cross-sectional images of a limb surrogate were then used to simulate signal measurements at several rotation angles, with ultrasonic pulse-echo sampling performed experimentally at the same stations on the actual limb surrogate to corroborate the results. The objective of the surrogate sampling was to compare the signal
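
    With one boundary radius recovered per rotation station, the cross-sectional area follows from a polar-polygon sum, A ≈ ½ Σ r_i² Δθ. A sketch with 144 stations and synthetic radii (not the study's data):

        import numpy as np

        n = 144
        theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
        r_cm = 4.0 + 0.3 * np.cos(2 * theta)   # stand-in boundary radii, cm

        dtheta = 2 * np.pi / n
        area = 0.5 * np.sum(r_cm**2 * dtheta)  # polar approximation of the area
        print(f"cross-sectional area ~= {area:.2f} cm^2")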

  12. Improved cosine similarity measures of simplified neutrosophic sets for medical diagnoses

    OpenAIRE

    Jun Ye

    2014-01-01

    In pattern recognition and medical diagnosis, similarity measure is an important mathematical tool. To overcome some disadvantages of existing cosine similarity measures of simplified neutrosophic sets (SNSs) in vector space, this paper proposed improved cosine similarity measures of SNSs based on the cosine function, including single valued neutrosophic cosine similarity measures and interval neutrosophic cosine similarity measures. Then, weighted cosine similarity measures of SNSs were introduced...
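
    A sketch of a cosine-type similarity between single-valued neutrosophic sets in the spirit of the paper (the exact improved measures should be checked against the original; data are hypothetical):

        import numpy as np

        def cosine_similarity_svns(A, B):
            """A, B: arrays of (T, I, F) triplets in [0, 1], one row per element."""
            A, B = np.asarray(A, float), np.asarray(B, float)
            d = np.abs(A - B)                  # |dT|, |dI|, |dF| per element
            # Sum of component differences mapped into a cosine in [cos(pi/2), 1].
            return float(np.mean(np.cos(np.pi * d.sum(axis=1) / 6.0)))

        patient   = [(0.8, 0.2, 0.1), (0.6, 0.3, 0.3)]
        diagnosis = [(0.7, 0.2, 0.2), (0.5, 0.4, 0.3)]
        print(f"similarity = {cosine_similarity_svns(patient, diagnosis):.4f}")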

  13. Improved Thomas formula for radon measurement

    International Nuclear Information System (INIS)

    Ji Changsong

    1991-06-01

    The FT 648 type portable absolute radon meter has been developed, and the design principle of this instrument is introduced. An absolute radon meter differs from a relative radon meter: by using the structure parameters, operating parameters and readout of the instrument, the radon content of the measured gas is obtained directly, without prior calibration. Normally, calibration is done with a standard radioactive gaseous source whose radon concentration is known. The systematic error is removed by adding the filter efficiency Σ, the α self-absorption correction β, the energy spectrum correction S, the geometric factor Ω of the probe and the gravity-dropping correction factor G to the Thomas formula for two-filter radon measurement. The atmospheric radon content in the Beijing area, given as hourly averages, was measured with the FT 648 type absolute radon meter. The measurement ran continuously for several days and nights, and a saddle-shaped radon content-time curve was observed. The daily average radon content was 8.5 Bq·m^-3
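
    The record does not reproduce the Thomas formula itself; the sketch below only illustrates how the five listed factors would enter as multiplicative efficiency corrections (all values invented):

        # Uncorrected concentration from the two-filter Thomas formula (made up).
        c_uncorrected = 12.0   # Bq/m^3

        sigma = 0.99   # filter efficiency
        beta  = 0.90   # alpha self-absorption correction
        s     = 0.95   # energy-spectrum correction
        omega = 0.40   # geometric factor of the probe
        g     = 0.97   # gravity-dropping correction factor

        c_radon = c_uncorrected / (sigma * beta * s * omega * g)
        print(f"corrected radon concentration ~= {c_radon:.1f} Bq/m^3")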

  14. Tools for Measuring and Improving Performance.

    Science.gov (United States)

    Jurow, Susan

    1993-01-01

    Explains the need for meaningful performance measures in libraries and the Total Quality Management (TQM) approach to data collection. Five tools representing different stages of a TQM inquiry are covered (i.e., the Shewhart Cycle, flowcharts, cause-and-effect diagrams, Pareto charts, and control charts), and benchmarking is addressed. (Contains…

  15. A Methodology to Integrate Magnetic Resonance and Acoustic Measurements for Reservoir Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Parra, Jorge O.; Hackert, Chris L.; Collier, Hughbert A.; Bennett, Michael

    2002-01-29

    The objective of this project was to develop an advanced imaging method, including pore scale imaging, to integrate NMR techniques and acoustic measurements to improve predictability of the pay zone in hydrocarbon reservoirs. This is accomplished by extracting the fluid property parameters using NMR laboratory measurements and the elastic parameters of the rock matrix from acoustic measurements to create poroelastic models of different parts of the reservoir. Laboratory measurement techniques and core imaging are being linked with a balanced petrographical analysis of the core and theoretical model.

  16. Ultrasonic particle image velocimetry for improved flow gradient imaging: algorithms, methodology and validation

    International Nuclear Information System (INIS)

    Niu Lili; Qian Ming; Yu Wentao; Jin Qiaofeng; Ling Tao; Zheng Hairong; Wan Kun; Gao Shen

    2010-01-01

    This paper presents a new algorithm for ultrasonic particle image velocimetry (Echo PIV) that improves flow velocity measurement accuracy and efficiency in regions with high velocity gradients. The conventional Echo PIV algorithm has been modified by incorporating a multiple-iteration algorithm, a sub-pixel method, a filter and interpolation method, and a spurious vector elimination algorithm. The new algorithm's performance is assessed by analyzing simulated images with known displacements, and ultrasonic B-mode images of in vitro laminar pipe flow, rotational flow and in vivo rat carotid arterial flow. Results for the simulated images show that the new algorithm produces much smaller bias from the known displacements. For laminar flow, the new algorithm results in a 1.1% deviation from the analytically derived value, versus 8.8% for the conventional algorithm. The vector quality evaluation for the rotational flow imaging shows that the new algorithm produces better velocity vectors. For in vivo rat carotid arterial flow imaging, the results from the new algorithm deviate on average by 6.6% from the Doppler-measured peak velocities, compared to 15% for the conventional algorithm. The new Echo PIV algorithm is able to effectively improve measurement accuracy when imaging flow fields with high velocity gradients.
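
    Sub-pixel refinement of a correlation peak is commonly done with a three-point Gaussian fit; a generic 1-D version (our sketch, not necessarily the paper's exact variant):

        import numpy as np

        def gaussian_subpixel_peak(corr):
            """Peak index of a positive 1-D correlation curve, sub-pixel accurate."""
            i = int(np.argmax(corr))
            if i == 0 or i == len(corr) - 1:
                return float(i)               # peak on the border: no refinement
            lm, l0, lp = np.log(corr[i - 1]), np.log(corr[i]), np.log(corr[i + 1])
            return i + (lm - lp) / (2 * lm - 4 * l0 + 2 * lp)

        corr = np.exp(-0.5 * ((np.arange(32) - 13.3) / 2.0) ** 2)  # synthetic peak
        print(f"estimated peak location: {gaussian_subpixel_peak(corr):.3f}")  # ~13.3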

  17. Improvement of an enterprise's marketing performance measurement system

    Directory of Open Access Journals (Sweden)

    Stanković Ljiljana

    2013-01-01

    Business conditions in which modern enterprises operate are increasingly complex. The complexity of the business environment is caused by the activities of external and internal factors, which imposes the need for a shift in management focus. One of the key shifts relates to the need to adapt and develop new systems for evaluating business performance. Evaluating the contribution of marketing to business performance is very important, but it is also a complex task. Marketing theory and practice indicate the need to develop adequate standards and systems for evaluating the efficiency of marketing decisions. A better understanding of the marketing standards and methods that managers use is an important factor affecting the efficiency of strategic decision making. The paper presents the results of research into the way managers perceive and apply marketing performance measures. The data received through the field research sample enabled consideration of managers' attitudes toward practical ways of implementing marketing performance measurement, and identification of the measures that managers report using most in business practice.

  18. Prediction of work metabolism from heart rate measurements in forest work: some practical methodological issues.

    Science.gov (United States)

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Auger, Isabelle; Leone, Mario

    2015-01-01

    Individual heart rate (HR) to workload relationships were determined using 93 submaximal step-tests administered to 26 healthy participants attending physical activities in a university training centre (laboratory study) and 41 experienced forest workers (field study). Predicted maximum aerobic capacity (MAC) was compared to measured MAC from a maximal treadmill test (laboratory study) to test the effect of two age-predicted maximum HR equations (220-age and 207-0.7 × age) and two clothing insulation levels (0.4 and 0.91 clo) during the step-test. Work metabolism (WM) estimated from forest work HR was compared against concurrent work V̇O2 measurements while taking into account the HR thermal component. Results show that MAC and WM can be accurately predicted from work HR measurements and the simple regression models developed in this study (1% group mean prediction bias and up to 25% expected prediction bias for a single individual). Neither clothing insulation nor the choice of age-predicted maximum HR equation had an impact on predicted MAC. Practitioner summary: This study sheds light on four practical methodological issues faced by practitioners regarding the use of HR methodology to assess WM in actual work environments. More specifically, it examines the effect of wearing work clothes and the use of two different maximum HR prediction equations on the ability of a submaximal step-test to assess MAC, as well as the accuracy of using an individual's step-test HR-to-workload relationship to predict WM from HR data collected during actual work in the presence of thermal stress.
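
    The individual calibration the study relies on can be sketched as a linear fit of submaximal V̇O2 on HR, read off at an age-predicted maximum HR (toy numbers below):

        import numpy as np

        hr  = np.array([95., 110., 125., 140.])   # step-test heart rates, bpm
        vo2 = np.array([14., 19., 24., 29.])      # measured VO2, mL/kg/min
        slope, intercept = np.polyfit(hr, vo2, 1)

        age = 40
        for label, hr_max in [("220-age", 220 - age), ("207-0.7*age", 207 - 0.7 * age)]:
            mac = slope * hr_max + intercept      # extrapolate to maximum HR
            print(f"predicted MAC with {label}: {mac:.1f} mL/kg/min")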

  19. An Improved Methodology to Overcome Key Issues in Human Fecal Metagenomic DNA Extraction

    Directory of Open Access Journals (Sweden)

    Jitendra Kumar

    2016-12-01

    Microbes are ubiquitously distributed in nature, and recent culture-independent studies have highlighted the significance of gut microbiota in human health and disease. Fecal DNA is the primary source for the majority of human gut microbiome studies. However, further improvement is needed to obtain fecal metagenomic DNA with sufficient amount and good quality but low host genomic DNA contamination. In the current study, we demonstrate a quick, robust, unbiased, and cost-effective method for the isolation of high molecular weight (>23 kb) metagenomic DNA (260/280 ratio >1.8) with a good yield (55.8 ± 3.8 ng/mg of feces). We also confirm that there is very low human genomic DNA contamination (eubacterial:human genomic DNA marker genes = 227.9:1) in the human feces. The newly developed method performs robustly for fresh as well as stored fecal samples, as demonstrated by 16S rRNA gene sequencing using 454 FLX+. Moreover, 16S rRNA gene analysis indicated that, compared to the other DNA extraction methods tested, the fecal metagenomic DNA isolated with the current methodology retains species richness and does not show microbial diversity biases, which is further confirmed by qPCR with a known quantity of spike-in genomes. Overall, our data highlight a protocol that balances quality, amount, user-friendliness, and cost effectiveness, suitable for culture-independent analysis of the human gut microbiome, and that provides a robust solution to key issues associated with fecal metagenomic DNA isolation in human gut microbiome studies.

  20. Methodology for sample preparation and size measurement of commercial ZnO nanoparticles

    Directory of Open Access Journals (Sweden)

    Pei-Jia Lu

    2018-04-01

    This study discusses strategies for sample preparation to acquire images of sufficient quality for size characterization by scanning electron microscope (SEM), using two commercial ZnO nanoparticles of different surface properties as a demonstration. The central idea is that micrometer-sized aggregates of ZnO in powdered form first need to be broken down to nanosized particles through an appropriate process to generate a nanoparticle dispersion before being deposited on a flat surface for SEM observation. Analytical tools such as contact angle, dynamic light scattering and zeta potential have been utilized to optimize the procedure for sample preparation and to check the quality of the results. Meanwhile, measurements of zeta potential values on flat surfaces also provide critical information and save considerable time and effort in the selection of a suitable substrate for particles of different properties to be attracted and kept on the surface without further aggregation. This simple, low-cost methodology can be generally applied to size characterization of commercial ZnO nanoparticles with limited information from vendors. Keywords: Zinc oxide, Nanoparticles, Methodology

  1. Measuring domestic water use: a systematic review of methodologies that measure unmetered water use in low-income settings.

    Science.gov (United States)

    Tamason, Charlotte C; Bessias, Sophia; Villada, Adriana; Tulsiani, Suhella M; Ensink, Jeroen H J; Gurley, Emily S; Mackie Jensen, Peter Kjaer

    2016-11-01

    To present a systematic review of methods for measuring domestic water use in settings where water meters cannot be used. We systematically searched EMBASE, PubMed, Water Intelligence Online, Water Engineering and Development Center, IEEExplore, Scielo, and Science Direct databases for articles that reported methodologies for measuring water use at the household level where water metering infrastructure was absent or incomplete. A narrative review explored similarities and differences between the included studies and provides recommendations for future research in water use. A total of 21 studies were included in the review. Methods ranged from single-day to 14-consecutive-day visits, and water use recall ranged from 12 h to 7 days. Data were collected using questionnaires, observations or both. Many studies only collected information on water that was carried into the household, and some failed to mention whether water was used outside the home. Water use in the selected studies was found to range from 2 to 113 l per capita per day. No standardised methods for measuring unmetered water use were found, which brings into question the validity and comparability of studies that have measured unmetered water use. In future studies, it will be essential to define all components that make up water use and determine how they will be measured. A pre-study involving observations and direct measurements during water collection periods (these will have to be determined through questioning) should be used to determine optimal methods for obtaining water use information in a survey. Day-to-day and seasonal variation should be included. A study that investigates water use recall is warranted to further develop standardised methods to measure water use; in the meantime, water use recall should be limited to 24 h or less. © 2016 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  2. A methodological frame for assessing benzene induced leukemia risk mitigation due to policy measures

    International Nuclear Information System (INIS)

    Karakitsios, Spyros P.; Sarigiannis, Dimosthenis A.; Gotti, Alberto; Kassomenos, Pavlos A.; Pilidis, Georgios A.

    2013-01-01

    The study relies on the development of a methodology for assessing the determinants that comprise the overall leukemia risk due to benzene exposure and how these are affected by outdoor and indoor air quality regulation. An integrated modeling environment was constructed, comprising traffic emissions, dispersion models, human exposure models and a coupled internal dose/biology-based dose–response risk assessment model, in order to assess the benzene-imposed leukemia risk, as well as the impact of traffic fleet renewal and smoking bans on these levels. Regarding traffic fleet renewal, several “what if” scenarios were tested. The detailed full-chain methodology was applied in a South-Eastern European urban setting in Greece, and a limited version of the methodology in Helsinki. The non-smoking population runs an average risk equal to 4.1 × 10^-5, compared to 23.4 × 10^-5 for smokers. The estimated lifetime risk for the examined occupational groups was 10–20% higher than the one estimated for the general public. Active smoking constitutes a dominant parameter for benzene-attributable leukemia risk, much stronger than any related activity, occupational or not. From the assessment of mitigation policies it was found that the associated leukemia risk in the optimum traffic fleet scenario could be reduced by up to 85% for non-smokers and up to 8% for smokers. By contrast, smoking bans provided smaller gains (7% for non-smokers, 1% for smokers), while for Helsinki, smoking policies were found to be more efficient than traffic fleet renewal. The methodology proposed above provides a general framework for assessing aggregated exposure and the consequent leukemia risk from benzene (incorporating mechanistic data), capturing exposure and internal dosimetry dynamics and translating changes in exposure determinants into actual changes in population risk, providing a valuable tool for risk management evaluation and consequently for policy support.

  4. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    Science.gov (United States)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are
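
    A generic stand-in for such a proxy model (a scikit-learn MLP on random placeholder data; the paper's cascade feedforward networks and input sets are not reproduced):

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(7)
        X = rng.uniform(size=(500, 6))    # stand-ins for rock/fluid/design inputs
        y = 50 * X[:, 0] + 30 * X[:, 1] ** 2 + 5 * rng.normal(size=500)  # fake response

        proxy = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000,
                             random_state=0).fit(X[:400], y[:400])
        print("R^2 on held-out scenarios:", round(proxy.score(X[400:], y[400:]), 3))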

  5. Successful Technology Commercialization – Yes or No? Improving the Odds. The Quick Look Methodology and Process

    OpenAIRE

    Pletcher, Gary; Zehner II, William Bradley

    2017-01-01

    This article explores the relationships that transform new scientific knowledge into new commercial products, services, and ventures to create wealth. The major technology and marketing commercialization dilemmas are defined and addressed. The Quicklook methodology and related processes for quickly assessing the commercial viability and potential of a scientific research project are explained. Using the Quicklook methodology and process early in the research and development process i...

  6. Reduction of Complications of Local Anaesthesia in Dental Healthcare Setups by Application of the Six Sigma Methodology: A Statistical Quality Improvement Technique.

    Science.gov (United States)

    Akifuddin, Syed; Khatoon, Farheen

    2015-12-01

    Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures, and to reduce the incidence of complications by introducing the Six Sigma methodology. The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma was applied to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures, using failure mode and effect analysis. Pareto analysis was used to identify the most recurrent complications. A paired z-sample test using Minitab statistical inference and Fisher's exact test were used to statistically analyse the obtained data. The Six Sigma improvement methodology in healthcare tends to deliver consistently better results to patients as well as hospitals, and results in better patient compliance as well as satisfaction.
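
    The Analyse step's Pareto analysis can be sketched as a cumulative ranking that flags the "vital few" causes (complication counts below are invented, not the study's data):

        complications = {"syncope": 42, "hematoma": 25, "trismus": 11,
                         "paresthesia": 7, "needle breakage": 2}

        total = sum(complications.values())
        cum = 0.0
        for name, n in sorted(complications.items(), key=lambda kv: -kv[1]):
            cum += 100.0 * n / total
            flag = "<-- vital few" if cum <= 80 else ""
            print(f"{name:16s} {n:3d}  cumulative {cum:5.1f}% {flag}")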

  7. Double Chooz Improved Multi-Detector Measurements

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The Double Chooz experiment (DC) is a reactor neutrino oscillation experiment running at the Chooz nuclear power plant (two reactors) in France. In 2011, DC first reported an indication of non-zero θ13 with the far detector (FD) located at the maximum of the oscillation effect (i.e. disappearance), thus challenging the CHOOZ non-observation limit. A robust observation of θ13 followed in 2012 from the Daya Bay experiment with multiple detector configurations. Since 2015, DC has run in a multi-detector configuration, strongly reducing the impact of several otherwise dominant systematics. DC's unique, almost "iso-flux" site allows the near detector (ND) to serve as a direct, accurate non-oscillation reference for the FD. Our first multi-detector results at MORIOND-2016 showed an intriguing deviation of θ13 with respect to the world average. We will address this issue in this seminar. The combined "reactor-θ13" measurement is expected to ...

  8. [Reducing inequality by improving preventive measures].

    Science.gov (United States)

    Valsecchi, M

    2014-01-01

    The terms of the health-inequality issue in health services are defined and the consolidated scientific findings are recalled. Three priority areas of action, which Prevention Departments are encouraged to address through focused programs in order to reduce specific inequalities, are defined and described. The first area of action includes three types of vital interventions: vaccinations, the control of tuberculosis infection, and oncological screening, which have to be guaranteed to specific disadvantaged population groups such as Roma communities, immigrant women, prisoners and psychiatric patients. The second area of action covers focused urban planning aimed at improving social housing conditions (with a special focus on thermal insulation and minimum distances from streets with heavy traffic), increasing the urban green spaces enjoyed by the population, and countering housing degradation (with particular attention to carbon monoxide poisoning). The third area of action covers measures against cardiovascular diseases, the leading cause of death and of health inequalities in the working-class population. A coordinated intervention directly in the workplace, where a particularly high percentage of individuals exposed to specific risk factors is present, is proposed.

  9. A methodology for the measurement of secondary-home tourist flows at the municipal level

    Directory of Open Access Journals (Sweden)

    Andrea Guizzardi

    2007-10-01

    The present public statistical system does not provide information concerning second-home tourist flows at the sub-regional level. This lack limits local administrations' capability to take decisions about environmental, territorial and productive development, as well as regional governments' fair allocation of public financing. In this work, the information gap is addressed by proposing an indirect estimation methodology: municipal electric power consumption is proposed as an indicator of stays in secondary homes. The indicator is connected to tourism flows by considering both measurement errors and the factors modifying local power demand. The application to the Emilia-Romagna regional case allows verification of the results' coherence with official statistics, as well as assessment of municipalities' tourist vocation.
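
    A back-of-envelope version of the proposed indicator (all figures hypothetical): excess consumption over a non-tourist baseline month, divided by an assumed consumption per occupied-home night:

        monthly_kwh   = 1_450_000   # municipal residential consumption, August
        baseline_kwh  =   900_000   # same municipality, non-tourist month
        kwh_per_night =       7.5   # assumed use of one occupied second home per night

        nights = (monthly_kwh - baseline_kwh) / kwh_per_night
        print(f"estimated second-home tourist nights: {nights:,.0f}")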

  10. Providing hierarchical approach for measuring supply chain performance using AHP and DEMATEL methodologies

    Directory of Open Access Journals (Sweden)

    Ali Najmi

    2010-06-01

    Measuring the performance of a supply chain is normally a function of various parameters. Such a problem often involves a multiple criteria decision making (MCDM) problem in which different criteria need to be defined and calculated properly. During the past two decades, the analytical hierarchy process (AHP) and DEMATEL have been among the most popular MCDM approaches for prioritizing various attributes. This paper uses a new methodology combining AHP and DEMATEL to rank the various parameters affecting the performance of the supply chain. DEMATEL is used to understand the relationships between comparison metrics, and AHP is used for their integration to provide a value for the overall performance.
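
    The DEMATEL step can be sketched as follows: a direct-influence matrix D is normalized and expanded into the total-relation matrix T = N(I - N)^-1 (influence scores below are hypothetical):

        import numpy as np

        D = np.array([[0, 3, 2],        # pairwise direct influence, 0-4 scale
                      [1, 0, 3],
                      [2, 1, 0]], dtype=float)

        N = D / max(D.sum(axis=1).max(), D.sum(axis=0).max())   # normalize
        T = N @ np.linalg.inv(np.eye(3) - N)                    # total relation

        r, c = T.sum(axis=1), T.sum(axis=0)
        print("prominence (r+c):", (r + c).round(3))    # importance of each metric
        print("net relation (r-c):", (r - c).round(3))  # cause (+) or effect (-)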

  11. Health Data Entanglement and artificial intelligence-based analysis: a brand new methodology to improve the effectiveness of healthcare services.

    Science.gov (United States)

    Capone, A; Cicchetti, A; Mennini, F S; Marcellusi, A; Baio, G; Favato, G

    2016-01-01

    Healthcare expenses will be the most relevant policy issue for most governments in the EU and the USA. This expenditure can be associated with two major categories of drivers: demographic and economic. The factors driving healthcare expenditure have rarely been recognised, measured and understood. Improving health data generation and analysis is mandatory, and in order to tackle healthcare spending growth it may be useful to design and implement an effective, advanced system to generate and analyse these data. A methodological approach relying on Health Data Entanglement (HDE) can be a suitable option. By definition, in the HDE a large number of data sets from several sources are functionally interconnected and computed through learning machines that generate patterns of highly probable future health conditions of a population. The entanglement concept is borrowed from quantum physics and means that multiple particles (information) are linked together in such a way that the measurement of one particle's quantum state (individual health conditions and related economic requirements) determines the possible quantum states of the other particles (population health forecasts used to predict their impact). The value created by the HDE is based on the combined evaluation of the clinical, economic and social effects generated by health interventions. To predict the future health conditions of a population, data analyses are performed using self-learning AI, in which sequential decisions are based on Bayesian algorithmic probabilities. HDE and AI-based analysis can be adopted to improve the effectiveness of the health governance system in ways that also lead to better quality of care.

  12. Improving Scientific Research Methodology in Undergraduate Medical Students: a case of team based training blended in a research project

    Institute of Scientific and Technical Information of China (English)

    W.Zhang; C.Cambier; Y.Zhang; J.M.Vandeweerd; P.Gustin

    2014-01-01

    An educational intervention targeting medical students and aiming to develop, over a short period of time, skills useful for writing a health science research protocol has been developed at the Shanghai Jiao Tong University School of Medicine. The methodology, blending the principles of PBL and TBL, is detailed, and key issues of the implementation are discussed. Twenty-one students were enrolled in a research master's degree and participated in three mandatory 180-minute sessions. Beyond the classical skills useful for solving a problem, this new intervention focused on the transformation of knowledge to create authentic content, which is a feature of project-based learning (PBL). The training process was designed according to the team-based learning (TBL) procedure, except that work sharing between groups and the pooling of each group's resources and outcomes allowed the construction of a single, original, class-wide research project in the field of respiratory pharmacology. The combination of both learning methods promoted the individual and group accountability necessary to improve self-learning and the quality of the final joint project. Peer review was an essential factor in creating the students' motivation and improving team discussion. The grades individually assigned by an external teacher for skills and for the quality of the project suggested that the key objectives of the intervention were reached. In conclusion, the educational intervention described in this paper appears to be an appropriate method for developing the specific skills necessary to write and discuss a research project within a research team. Further work is necessary to measure students' satisfaction and the improvement of their performance.

  13. Code coverage measurement methodology for MMI software of safety-class I and C system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Eun Hyung; Jung, Beom Young; Choi, Seok Joo [Suresofttech, Seoul (Korea, Republic of)

    2016-10-15

    The MMI (Man-Machine Interface) software of the safety instrumentation and control systems used in nuclear power plants carries out important functions, such as displaying safety-related information, transmitting commands to other systems, and changing setpoints. As the reliability of MMI software has been recognized to play an important role in the operation of nuclear power plants, regulatory standards have been strengthened accordingly. This strengthening has in turn affected how software testing is performed; current regulations require the measurement of code coverage against a legal standard. This paper identifies the problems of the conventional method used for measuring code coverage and presents a new coverage measurement method that solves them. We examined the problems, such as the limitations and low efficiency, of the existing test coverage measurement method for the MMI software used in nuclear power instrumentation and control systems, and propose a new test coverage measurement method as a solution. Applying the new Top-Down approach can mitigate the problems of the existing test coverage measurement methods and achieve the desired coverage objectives. Of course, it is still necessary to accumulate more cases, and the methodology should be systematized based on them. If efficiency and reliability are later ensured through application in many cases, the method may be used to ensure code coverage of software in the many areas where GUIs are utilized, not only in nuclear power instrumentation and control.
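
    For orientation only, statement coverage of a small function can be measured with the coverage.py API (a generic illustration, not the record's Top-Down methodology):

        import coverage

        def set_point_ok(value, low, high):
            if value < low or value > high:
                return False
            return True

        cov = coverage.Coverage()
        cov.start()
        set_point_ok(5, 0, 10)   # exercises only the in-range branch
        cov.stop()
        cov.save()
        cov.report()             # prints per-file statement coverage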

  14. Development and Attestation of Gamma-Ray Measurement Methodologies for use by Rostekhnadzor Inspectors in the Russian Federation

    International Nuclear Information System (INIS)

    Jeff Sanders

    2006-01-01

    Development and attestation of gamma-ray non-destructive assay measurement methodologies for use by inspectors of the Russian Federal Service for Environmental, Technological, and Nuclear Oversight (Rostekhnadzor, formerly Gosatomnadzor or GAN), as well as for use by Russian nuclear facilities, has been completed. Specifically, a methodology utilizing the gamma-ray multi-group analysis (MGA) method for determining plutonium isotopic composition has been developed, while existing methodologies for determining uranium enrichment and isotopic composition have been revised to make them more appropriate to the material types and conditions present in nuclear facilities in the Russian Federation. This paper discusses the development and revision of these methodologies, the metrological characteristics of the final methodologies, and the limitations and concerns specific to the utilization of these analysis methods in the Russian Federation.

  15. Exhaled nitric oxide measurements in the first 2 years of life: methodological issues, clinical and epidemiological applications

    Directory of Open Access Journals (Sweden)

    de Benedictis Fernando M

    2009-07-01

    Full Text Available Abstract Fractional exhaled nitric oxide (FeNO) is a useful tool to diagnose and monitor eosinophilic bronchial inflammation in asthmatic children and adults. In children younger than 2 years of age FeNO has been successfully measured both with the tidal breathing and with the single breath techniques. However, there are a number of methodological issues that need to be addressed in order to increase the reproducibility of the FeNO measurements within and between infants. Indeed, a standardized method to measure FeNO in the first 2 years of life would be extremely useful in order to meaningfully interpret FeNO values in this age group. Several factors related to the measurement conditions have been found to influence FeNO, such as expiratory flow, ambient NO and nasal contamination. Furthermore, the exposure to pre- and postnatal risk factors for respiratory morbidity has been shown to influence FeNO values. Therefore, these factors should always be assessed and their association with FeNO values in the specific study population should be evaluated and, eventually, controlled for. There is evidence consistently suggesting that FeNO is increased in infants with family history of atopy/atopic diseases and in infants with recurrent wheezing. These findings could support the hypothesis that eosinophilic bronchial inflammation is present at an early stage in those infants at increased risk of developing persistent respiratory symptoms and asthma. Furthermore, it has been shown that FeNO measurements could represent a useful tool to assess bronchial inflammation in other airways diseases, such as primary ciliary dyskinesia, bronchopulmonary dysplasia and cystic fibrosis. Further studies are needed in order to improve the reproducibility of the measurements, and large prospective studies are warranted in order to evaluate whether FeNO values measured in the first years of life can predict the future development of asthma or other respiratory diseases.

  16. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  17. The quality infrastructure measuring, analyzing, and improving library services

    CERN Document Server

    Murphy, Sarah Anne

    2013-01-01

    Summarizing specific tools for measuring service quality alongside tips for using these tools most effectively, this book helps libraries of all kinds take a programmatic approach to measuring, analyzing, and improving library services.

  18. Improved best estimate plus uncertainty methodology, including advanced validation concepts, to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, C.; Williams, B.; Hemez, F.; Atamturktur, S.H.; McClure, P.

    2011-01-01

    Research highlights: → The best estimate plus uncertainty methodology (BEPU) is one option in the licensing of nuclear reactors. → The challenges for extending the BEPU method for fuel qualification for an advanced reactor fuel are primarily driven by schedule, the need for data, and the sufficiency of the data. → In this paper we develop an extended BEPU methodology that can potentially be used to address these new challenges in the design and licensing of advanced nuclear reactors. → The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. → The methodology includes a formalism to quantify an adequate level of validation (predictive maturity) with respect to existing data, so that required new testing can be minimized, saving cost by demonstrating that further testing will not enhance the quality of the predictive tools. - Abstract: Many evolving nuclear energy technologies use advanced predictive multiscale, multiphysics modeling and simulation (M and S) capabilities to reduce the cost and schedule of design and licensing. Historically, the role of experiments has been that of a primary tool for the design and understanding of nuclear system behavior, while M and S played the subordinate role of supporting experiments. In the new era of multiscale, multiphysics computational-based technology development, this role has been reversed. The experiments will still be needed, but they will be performed at different scales to calibrate and validate the models leading to predictive simulations for design and licensing. Minimizing the required number of validation experiments produces cost and time savings. The use of multiscale, multiphysics models introduces challenges in validating these predictive tools - traditional methodologies will have to be modified to address these challenges. This paper gives the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of advanced nuclear reactors.

  19. THE UNCERTAINTIES OF ENVIRONMENT'S PARAMETERS MEASUREMENTS AS TOOLS OF THE MEASUREMENTS QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Miroslav Badida

    2008-06-01

    Full Text Available Identification of the uncertainties of noise measurements alongside the declared measured values is unconditionally necessary and required by legislation. The uncertainty of a measurement expresses all errors that accrue during the measuring. By indicating the uncertainty, the measurer documents that the objective value lies, with a certain probability, within the interval bounded by the measurement uncertainty. The paper deals with the methodology of uncertainty calculation for noise measurements in living and working environments, such as the metal processing and building materials industries.
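
    To make the calculation concrete, the sketch below derives a Type A standard uncertainty of the mean from repeated readings and expands it with a coverage factor; the readings and the choice k = 2 (about 95% coverage) are illustrative assumptions, and directly averaging dB values is itself a simplification.

        import math

        def expanded_uncertainty(readings, k=2.0):
            """Mean, Type A standard uncertainty u, and expanded uncertainty k*u."""
            n = len(readings)
            mean = sum(readings) / n
            s = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
            u = s / math.sqrt(n)  # standard uncertainty of the mean
            return mean, u, k * u

        # Five repeated A-weighted noise readings in dB (invented values)
        mean, u, U = expanded_uncertainty([62.1, 61.8, 62.4, 62.0, 61.9])
        print(f"{mean:.2f} dB +/- {U:.2f} dB (k=2)")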

  20. Comparison of fungal spores concentrations measured with wideband integrated bioaerosol sensor and Hirst methodology

    Science.gov (United States)

    Fernández-Rodríguez, S.; Tormo-Molina, R.; Lemonis, N.; Clot, B.; O'Connor, D. J.; Sodeau, John R.

    2018-02-01

    The aim of this work was to provide both a comparison of traditional and novel methodologies for airborne spore detection (i.e. the Hirst Burkard trap and WIBS-4) and the first quantitative study of airborne fungal concentrations in Payerne (Western Switzerland) as well as their relation to meteorological parameters. From the traditional method (Hirst trap and microscope analysis), sixty-three propagule types (spores, sporangia and hyphae) were identified and the average spore concentrations measured over the full period amounted to 4145 ± 263.0 spores/m3. Maximum values were reached on July 19th and on August 6th. Twenty-six spore types reached average levels above 10 spores/m3. Airborne fungal propagules in Payerne showed a clear seasonal pattern, increasing from low values in early spring to maxima in summer. Daily average concentrations above 5000 spores/m3 were almost constant in summer from mid-June onwards. Weather parameters played a relevant role in determining the observed spore concentrations. Coniferous forest, dominant in the surroundings, may be a relevant source of airborne fungal propagules, as their distribution and the predominant wind directions are consistent with this origin. The comparison between the two methodologies used in this campaign showed remarkably consistent patterns throughout the campaign. A correlation coefficient of 0.9 (CI 0.76-0.96) was seen between the two over the time period for daily resolutions (Hirst trap and WIBS-4). This apparent co-linearity was seen to fall away once increased resolution was employed. However, at higher resolutions, upon removal of Cladosporium species from the total fungal concentrations (Hirst trap), an increased correlation coefficient was again noted between the two instruments (R = 0.81 with confidence intervals of 0.74 and 0.86).
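
    For reference, the reported r = 0.9 (CI 0.76-0.96) is the kind of statistic the sketch below produces: a Pearson correlation between the two instruments' daily series with an approximate 95% confidence interval from the Fisher z-transform. The two series here are invented placeholders, not campaign data.

        import math

        def pearson_with_ci(x, y, z_crit=1.96):
            """Pearson r with an approximate 95% CI via the Fisher z-transform."""
            n = len(x)
            mx, my = sum(x) / n, sum(y) / n
            sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
            sxx = sum((a - mx) ** 2 for a in x)
            syy = sum((b - my) ** 2 for b in y)
            r = sxy / math.sqrt(sxx * syy)
            z, se = math.atanh(r), 1.0 / math.sqrt(n - 3)
            return r, (math.tanh(z - z_crit * se), math.tanh(z + z_crit * se))

        hirst = [3200, 4100, 5200, 4800, 3900, 6100]  # spores/m3, invented
        wibs = [3000, 4300, 5000, 5100, 3700, 5800]   # particles/m3, invented
        print(pearson_with_ci(hirst, wibs))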

  1. Site-conditions map for Portugal based on VS measurements: methodology and final model

    Science.gov (United States)

    Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, carlos

    2017-04-01

    In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (VS) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies involved in characterizing the Vs structure at the sites in the database include seismic refraction, multichannel analysis of surface waves and refraction microtremor. Invasive measurements were performed in selected locations in order to compare the Vs profiles obtained from both invasive and non-invasive techniques. In general there was good agreement in the subsurface structure of Vs30 obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50.000 and 1:500.000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is based on a three-step process that includes defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether or not the differences in the distributions of Vs30 are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling, and therefore a declustering algorithm was applied. The final model includes three geological units: 1) igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations; and 3) Holocene formations. The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units.
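
    For orientation, Vs30 is conventionally the time-averaged shear-wave velocity over the uppermost 30 m, Vs30 = 30 / Σ(h_i / v_i) over the layers down to 30 m depth. The sketch below applies this relation to a layered profile; the example layer values are invented, not taken from the Portuguese database.

        def vs30(thicknesses_m, velocities_ms):
            """Time-averaged shear-wave velocity over the top 30 m."""
            depth = travel_time = 0.0
            for h, v in zip(thicknesses_m, velocities_ms):
                use = min(h, 30.0 - depth)  # clip the last layer at 30 m
                travel_time += use / v
                depth += use
                if depth >= 30.0:
                    break
            if depth < 30.0:
                raise ValueError("profile shallower than 30 m")
            return 30.0 / travel_time

        # Hypothetical profile: 5 m at 180 m/s, 10 m at 300 m/s, 20 m at 600 m/s
        print(round(vs30([5, 10, 20], [180, 300, 600])))  # ~348 m/s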

  2. A novel methodology for online measurement of thoron using Lucas scintillation cell

    International Nuclear Information System (INIS)

    Eappen, K.P.; Sapra, B.K.; Mayya, Y.S.

    2007-01-01

    The use of the Lucas scintillation cell (LSC) technique for thoron estimation requires a modified methodology as opposed to radon estimation. While in the latter the α counting is performed after a delay period varying between a few hours and a few days, in the case of thoron estimation the α counting has to be carried out immediately after sampling, owing to the short half-life of thoron (55 s). This can be achieved best by having an on-line LSC sampling and counting system. However, because the half-life of the thoron decay product 212Pb is 10.6 h, background accumulates in the LSC during online measurements, and hence subsequent use of the LSC is erroneous until the normal background level is restored in the cell. This problem can be circumvented by correcting for the average background counts accumulated during the counting period, which may be theoretically estimated. In this study, a methodology has been developed to estimate the true counts due to thoron. A linear regression between the counts obtained experimentally and the fractional decay in regular intervals of time is used to obtain the actual thoron concentration. The novelty of this approach is that the background of the cell is automatically estimated as the intercept of the regression graph. The results obtained by this technique compare well with the two-filter method and with the thoron concentration produced from a standard thoron source. However, the LSC as such cannot be used for environmental samples, because the minimum detection level is comparable with the thoron concentrations prevailing in the normal atmosphere.
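
    The regression step described above can be sketched as an ordinary least-squares fit of the interval counts against the fractional thoron decay: the slope scales to the thoron concentration through the cell calibration factor, and the intercept gives the accumulated background. The data points below are invented for illustration.

        def fit_line(x, y):
            """Ordinary least squares; returns (slope, intercept) for y ~ a*x + b."""
            n = len(x)
            mx, my = sum(x) / n, sum(y) / n
            a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
                 / sum((xi - mx) ** 2 for xi in x))
            return a, my - a * mx

        # Fractional thoron decay per counting interval (55 s half-life) and
        # the gross alpha counts registered in the same intervals (invented).
        decay_fraction = [0.53, 0.25, 0.12, 0.06]
        gross_counts = [1180, 585, 310, 190]

        slope, background = fit_line(decay_fraction, gross_counts)
        # slope -> thoron counts (convert to Bq/m3 with a calibration factor);
        # background -> estimate of the accumulated 212Pb background.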

  3. A methodology for supporting decisions on the establishment of protective measures after severe nuclear accidents. Final report

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Kollas, J.G.

    1994-06-01

    Full text: The objective of this report is to demonstrate the use of a methodology supporting decisions on protective measures following severe nuclear accidents. A multicriteria decision analysis approach is recommended, where value tradeoffs are postponed until the very last stage of the decision process. Efficient frontiers are used to exclude all technically inferior solutions and present the decision maker with all non-dominated solutions. A choice among these solutions implies a value trade-off among the multiple criteria. An interactive computer package has been developed in which the decision maker can choose a point on the efficient frontier in the consequence space and immediately see the alternative in the decision space resulting in the chosen consequences. The methodology is demonstrated through an application on the choice among possible protective measures in contaminated areas of the former USSR after the Chernobyl accident. Two distinct cases are considered: first, a decision is to be made only on the basis of the level of soil contamination with Cs-137 and the total cost of the chosen protective policy; next, the decision is based on the geographic dimension of the contamination and the total cost. Three alternative countermeasure actions are considered for population segments living on soil contaminated at a certain level or in a specific geographic region: (a) relocation of the population; (b) improvement of the living conditions; and (c) no countermeasures at all. This is the final deliverable of the CEC-CIS Joint Study Project 2, Task 5: Decision-Aiding-System for Establishing Intervention Levels, performed under Contracts COSU-CT91-0007 and COSU-CT92-0021 with the Commission of European Communities through CEPN. (author)
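
    The exclusion of "technically inferior" alternatives can be illustrated with a small non-dominance filter over two minimization criteria (for example, residual collective dose and total cost); the option names echo the three countermeasure actions above, but all numbers are invented.

        def efficient_frontier(options):
            """Keep the non-dominated options when all criteria are minimized."""
            def dominates(a, b):
                return (all(x <= y for x, y in zip(a, b))
                        and any(x < y for x, y in zip(a, b)))
            return {name: c for name, c in options.items()
                    if not any(dominates(o, c) for o in options.values() if o != c)}

        # (residual dose in person-Sv, total cost in M$), purely hypothetical
        options = {"relocation": (10, 900), "improved_living": (40, 200),
                   "improved_variant": (45, 250), "no_countermeasures": (120, 0)}
        print(efficient_frontier(options))  # "improved_variant" is dominated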

  4. Does Methodological Guidance Produce Consistency? A Review of Methodological Consistency in Breast Cancer Utility Value Measurement in NICE Single Technology Appraisals.

    Science.gov (United States)

    Rose, Micah; Rice, Stephen; Craig, Dawn

    2017-07-05

    Since 2004, National Institute for Health and Care Excellence (NICE) methodological guidance for technology appraisals has emphasised a strong preference for using the validated EuroQol 5-Dimensions (EQ-5D) quality-of-life instrument, measuring patient health status from patients or carers, and using the general public's preference-based valuation of different health states when assessing health benefits in economic evaluations. The aim of this study was to review all NICE single technology appraisals (STAs) for breast cancer treatments to explore consistency in the use of utility scores in light of NICE methodological guidance. A review of all published breast cancer STAs was undertaken using all publicly available STA documents for each included assessment. Utility scores were assessed for consistency with NICE-preferred methods and original data sources. Furthermore, academic assessment group work undertaken during the STA process was examined to evaluate the emphasis of NICE-preferred quality-of-life measurement methods. Twelve breast cancer STAs were identified, and many STAs used evidence that did not follow NICE's preferred utility score measurement methods. Recent STA submissions show companies using EQ-5D and mapping. Academic assessment groups rarely emphasized NICE-preferred methods, and queries about preferred methods were rare. While there appears to be a trend in recent STA submissions towards following NICE methodological guidance, historically STA guidance in breast cancer has generally not used NICE's preferred methods. Future STAs in breast cancer and reviews of older guidance should ensure that utility measurement methods are consistent with the NICE reference case to help produce consistent, equitable decision making.

  5. Measuring Effectiveness in Digital Game-Based Learning: A Methodological Review.

    Directory of Open Access Journals (Sweden)

    Anissa All

    2014-06-01

    Full Text Available In recent years, a growing number of studies have been conducted into the effectiveness of digital game-based learning (DGBL). Despite this growing interest, there is a lack of sound empirical evidence on the effectiveness of DGBL due to different outcome measures for assessing effectiveness, varying methods of data collection and inconclusive or difficult to interpret results. This has resulted in a need for an overarching methodology for assessing the effectiveness of DGBL. The present study took a first step in this direction by mapping current methods used for assessing the effectiveness of DGBL. Results showed that, currently, comparison of results across studies, and thus looking at the effectiveness of DGBL on a more general level, is problematic due to diversity in and suboptimal study designs. Variety in study design relates to three issues, namely different activities implemented in the control groups, different measures for assessing the effectiveness of DGBL, and the use of different statistical techniques for analyzing learning outcomes. Suboptimal study designs are the result of variables confounding study results. Possible confounds brought forward in this review are elements that are added to the game as part of the educational intervention (e.g., required reading, debriefing session), instructor influences, and practice effects when using the same test pre- and post-intervention. Lastly, incomplete information on the study design impedes replication of studies and thus falsification of study results.

  6. Characterization of gloss properties of differently treated polymer coating surfaces by surface clarity measurement methodology.

    Science.gov (United States)

    Gruber, Dieter P; Buder-Stroisznigg, Michael; Wallner, Gernot; Strauß, Bernhard; Jandel, Lothar; Lang, Reinhold W

    2012-07-10

    With one measurement configuration, existing gloss measurement methodologies are generally restricted to specific gloss levels. A newly developed image-analytical gloss parameter called "clarity" provides the possibility to describe the perceptual result of a broad range of different gloss levels with one setup. In order to analyze and ultimately monitor the perceived gloss of products, a fast and flexible method that is also suitable for automated inspection is in high demand. The clarity parameter is very fast to calculate and therefore usable for fast in-line surface inspection. Coated metal specimens were deformed to varying degrees and polished afterwards in order to study the clarity parameter with regard to the quantification of varying surface gloss types and levels. In order to analyze the correlation with human gloss perception, a study was carried out in which experts were asked to assess gloss properties of a series of surface samples under standardized conditions. The study confirmed that clarity exhibits considerably better correlation with human perception than alternative gloss parameters.

  7. Enabling Mobile Communications for the Needy: Affordability Methodology, and Approaches to Requalify Universal Service Measures

    Directory of Open Access Journals (Sweden)

    Louis-Francois PAU

    2009-01-01

    Full Text Available This paper links communications and media usage to social and household economics boundaries. It highlights that in present-day society, communications and media are a necessity, but not always affordable, and that they furthermore open up addictive behaviors which raise additional financial and social risks. A simple and efficient methodology compatible with state-of-the-art social and communications business statistics is developed, which produces the residual communications and media affordability budget and ultimately the value-at-risk in terms of usage and tariffs. Sensitivity analysis provides precious information on bottom-up communications and media adoption on the basis of affordability. This approach differs from the regulated but often ineffective Universal service obligation, which instead of catering for individual needs mostly addresses macro-measures helping geographical access coverage (e.g., in rural areas). It is proposed to requalify the Universal service obligations on operators into concrete measures, allowing, with unchanged funding, the needy to adopt mobile services based on their affordability constraints by bridging the gap to a standard tariff. Case data are surveyed from various countries. ICT policy recommendations are made to support widespread and socially responsible communications access.

  8. Methodological aspects to be considered in evaluating the economics of service measures

    International Nuclear Information System (INIS)

    Bald, M.

    1987-01-01

    For the purposes of the report, service measures is used as a term denoting all those steps which exceed the framework of normal in-service maintenance and repair and serve to improve economics over the normal case. Positive impacts are to be achieved on such parameters as availability, efficiency, and service life. One of the aspects investigated is the effect, if any, of such measures on the residual service life of plants in operation for a long period of time already. Residual service life in this case means the remaining span of effective technical and economic operation which, in these model calculations, also includes part of the period of depreciation. (orig.) [de

  9. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, Cetin; Williams, Brian; McClure, Patrick; Nelson, Ralph A.

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments served as the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computational-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models, leading to predictive simulations. Cost-saving goals of programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale multi-physics models introduces complexities in the validation of predictive tools. Traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow for extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M and S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost and time savings.

  10. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory]; Williams, Brian [Los Alamos National Laboratory]; Mc Clure, Patrick [Los Alamos National Laboratory]; Nelson, Ralph A [IDAHO NATIONAL LAB]

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments served as the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computational-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models, leading to predictive simulations. Cost-saving goals of programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale multi-physics models introduces complexities in the validation of predictive tools. Traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow for extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost and time savings.

  11. Use of Balanced Scorecard Methodology for Performance Measurement of the Health Extension Program in Ethiopia.

    Science.gov (United States)

    Teklehaimanot, Hailay D; Teklehaimanot, Awash; Tedella, Aregawi A; Abdella, Mustofa

    2016-05-04

    In 2004, Ethiopia introduced a community-based Health Extension Program to deliver basic and essential health services. We developed a comprehensive performance scoring methodology to assess the performance of the program. A balanced scorecard with six domains and 32 indicators was developed. Data collected from 1,014 service providers, 433 health facilities, and 10,068 community members sampled from 298 villages were used to generate weighted national, regional, and agroecological zone scores for each indicator. The national median indicator scores ranged from 37% to 98% with poor performance in commodity availability, workforce motivation, referral linkage, infection prevention, and quality of care. Indicator scores showed significant difference by region (P < 0.001). Regional performance varied across indicators suggesting that each region had specific areas of strength and deficiency, with Tigray and the Southern Nations, Nationalities and Peoples Region being the best performers while the mainly pastoral regions of Gambela, Afar, and Benishangul-Gumuz were the worst. The findings of this study suggest the need for strategies aimed at improving specific elements of the program and its performance in specific regions to achieve quality and equitable health services. © The American Society of Tropical Medicine and Hygiene.

  12. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist

    NARCIS (Netherlands)

    Terwee, C.B.; Mokkink, L.B.; Knol, D.L.; Ostelo, R.W.J.G.; Bouter, L.M.; de Vet, H.C.W.

    2012-01-01

    Background: The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist.

  13. Current psychometric and methodological issues in the measurement of overgeneral autobiographical memory.

    Science.gov (United States)

    Griffith, James W; Sumner, Jennifer A; Raes, Filip; Barnhofer, Thorsten; Debeer, Elise; Hermans, Dirk

    2012-12-01

    Autobiographical memory is a multifaceted construct that is related to psychopathology and other difficulties in functioning. Across many studies, a variety of methods have been used to study autobiographical memory. The relationship between overgeneral autobiographical memory (OGM) and psychopathology has been of particular interest, and many studies of this cognitive phenomenon rely on the Autobiographical Memory Test (AMT) to assess it. In this paper, we examine several methodological approaches to studying autobiographical memory, and focus primarily on methodological and psychometric considerations in OGM research. We pay particular attention to what is known about the reliability, validity, and methodological variations of the AMT. The AMT has adequate psychometric properties, but there is great variability in methodology across studies that use it. Methodological recommendations and suggestions for future studies are presented. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Measuring resource inequalities. The concepts and methodology for an area-based Gini coefficient

    International Nuclear Information System (INIS)

    Druckman, A.; Jackson, T.

    2008-01-01

    Although inequalities in income and expenditure are relatively well researched, comparatively little attention has been paid, to date, to inequalities in resource use. This is clearly a shortcoming when it comes to developing informed policies for sustainable consumption and social justice. This paper describes an indicator of inequality in resource use called the AR-Gini. The AR-Gini is an area-based measure of resource inequality that estimates inequalities between neighbourhoods with regard to the consumption of specific consumer goods. It is also capable of estimating inequalities in the emissions resulting from resource use, such as carbon dioxide emissions from energy use, and solid waste arisings from material resource use. The indicator is designed to be used as a basis for broadening the discussion concerning 'food deserts' to inequalities in other types of resource use. By estimating the AR-Gini for a wide range of goods and services we aim to enhance our understanding of resource inequalities and their drivers, identify which resources have highest inequalities, and to explore trends in inequalities. The paper describes the concepts underlying the construction of the AR-Gini and its methodology. Its use is illustrated by pilot applications (specifically, men's and boys' clothing, carpets, refrigerators/freezers and clothes washer/driers). The results illustrate that different levels of inequality are associated with different commodities. The paper concludes with a brief discussion of some possible policy implications of the AR-Gini. (author)
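
    For orientation, a Gini-type coefficient over per-neighbourhood resource use can be computed from the mean-absolute-difference form, G = Σ_i Σ_j |x_i − x_j| / (2 n² x̄). The sketch below implements this textbook formula; the neighbourhood values are invented, and the AR-Gini itself involves further estimation steps described in the paper.

        def gini(values):
            """Gini coefficient via the mean absolute difference formulation:
            0 = perfect equality, 1 = maximal inequality (non-negative data)."""
            n = len(values)
            mean = sum(values) / n
            mad = sum(abs(a - b) for a in values for b in values) / (n * n)
            return mad / (2 * mean)

        # Hypothetical per-neighbourhood consumption of one good (kg/household)
        print(round(gini([12.0, 15.0, 9.0, 40.0, 11.0]), 3))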

  15. Thermal-Diffusivity Measurements of Mexican Citrus Essential Oils Using Photoacoustic Methodology in the Transmission Configuration

    Science.gov (United States)

    Muñoz, G. A. López; González, R. F. López; López, J. A. Balderas; Martínez-Pérez, L.

    2011-05-01

    Photoacoustic methodology in the transmission configuration (PMTC) was used to study the thermophysical properties of Mexican citrus essential oils and their relation to composition, demonstrating the viability of using photothermal techniques for quality control and for authentication of oils and detection of their adulteration. Linear relations for the amplitude (on a semi-log scale) and phase, as functions of the sample's thickness, were obtained through a theoretical model fit to the experimental data for thermal-diffusivity measurements in Mexican orange, pink grapefruit, mandarin, and lime type A centrifuged essential oils, and in Mexican distilled lime essential oil. Gas chromatography for distilled lime essential oil and centrifuged lime essential oil type A is reported to complement the study. Experimental results showed close thermal-diffusivity values among the Mexican citrus essential oils obtained by centrifugation, but a significant difference of this physical property between distilled lime oil and the corresponding value obtained by centrifugation, which is due to the different chemical compositions involved in the extraction processes.
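
    Assuming the usual thermally thick regime, the photoacoustic amplitude decays as exp(−L·√(πf/α)) with sample thickness L, so the slope of ln(amplitude) versus L yields the thermal diffusivity as α = πf / slope². The sketch below applies this relation; the data points and modulation frequency are invented, not the paper's measurements.

        import math

        def thermal_diffusivity(thicknesses_m, amplitudes, frequency_hz):
            """Fit ln(amplitude) vs thickness; slope = -sqrt(pi*f/alpha)."""
            x, y = thicknesses_m, [math.log(a) for a in amplitudes]
            n = len(x)
            mx, my = sum(x) / n, sum(y) / n
            slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
                     / sum((xi - mx) ** 2 for xi in x))
            return math.pi * frequency_hz / slope ** 2

        L = [100e-6, 200e-6, 300e-6, 400e-6]  # sample thicknesses, m (invented)
        A = [1.00, 0.55, 0.30, 0.17]          # lock-in amplitudes, a.u. (invented)
        print(f"alpha = {thermal_diffusivity(L, A, 1.0):.2e} m^2/s")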

  16. A new and improved methodology for qualitative and quantitative mineralogical analysis of Boom Clay

    International Nuclear Information System (INIS)

    Zeelmaekers, E.; Vandenberghe, N.; Honty, M.; De Craen, M.; Derkowski, A.; Van Geet, M.

    2010-01-01

    Document available in extended abstract form only. Good knowledge of the mineralogy of any host formation studied for the geological disposal of high-level radioactive waste is a prerequisite for understanding the geochemical environment that will determine the migration and retention behaviour of radionuclides. In this respect, the Boom Clay mineralogical composition has been extensively studied in recent decades as a reference host formation (e.g. the ARCHIMEDE-ARGILE project, the OECD-NEA clay catalogue report) with the aim of providing reliable data for a safety assessment. However, a comparison of the available literature data clearly showed a serious discrepancy among studies, not only in the quantitative but also in the qualitative mineralogical composition of the Boom Clay (SAFIR II). The reason for such a large disagreement could be related, among other factors, to the variable grain size distributions of the studied samples (sample heterogeneity) and to differences in the methodological approaches. In particular, the unambiguous characterisation of clay minerals and the quantification of mixed-layer phases have remained a long-standing problem. This study is aimed at achieving a consensus on the qualitative and quantitative mineralogical data of the Boom Clay using the most advanced techniques currently available in clay science. A new sampling campaign was performed in such a way that the samples (20 in total) are more or less regularly distributed over the Boom Clay Formation, ensuring that variations in the grain size distributions due to alternations of silty clay and clayey silt layers are accounted for. A novel concept based on analysis at two levels was applied: (1) bulk rock and (2) clay fraction analysis. (1) The bulk rock analysis consists of conventional XRD analysis with identification of the principal mineral phases. As a next step, the bulk rock was mixed with a ZnO internal standard and experimental diffraction patterns of randomly oriented powders were analyzed.

  17. Improved methodology for generation of axial flux shapes in digital core protection systems

    International Nuclear Information System (INIS)

    Lee, G.-C.; Baek, W.-P.; Chang, S.H.

    2002-01-01

    An improved method of axial flux shape (AFS) generation for digital core protection systems of pressurized water reactors is presented in this paper using an artificial neural network (ANN) technique - a feedforward network trained by backpropagation. It generates 20-node axial power shapes based on the information from three ex-core detectors. In developing the method, a total of 7173 axial flux shapes were generated from ROCS code simulations for training and testing of the ANN. The ANN, trained on 200 of these data, predicts the remaining data with an average root mean square error of about 3%. The developed method was also tested with real plant data measured during normal operation of Yonggwang Unit 4. The RMS errors, in the range of 0.9-2.1%, are about twice as accurate as the cubic spline approximation method currently used in the plant. The developed method would contribute to resolving the drawbacks of the current method, as it shows reasonable accuracy over a wide range of core conditions.
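
    The structure of such a network can be sketched with a generic multilayer perceptron mapping three ex-core detector signals to a 20-node shape; this is not the authors' trained model, and the hidden-layer size and random stand-in data are assumptions.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        # Stand-ins for 200 training pairs of (3 ex-core signals, 20-node shape);
        # in the paper these would come from ROCS code simulations.
        X = rng.random((200, 3))
        Y = rng.random((200, 20))

        # Feedforward network trained by backpropagation (hidden size assumed).
        net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
        net.fit(X, Y)

        shape = net.predict(rng.random((1, 3)))  # one 20-node axial flux shape
        print(shape.shape)  # (1, 20)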

  18. Improvement of methodological and data background for life cycle assessment of nano-metaloxides

    DEFF Research Database (Denmark)

    Miseljic, Mirko

    Based on a review of existing LCA studies of engineered nanomaterials (ENMs) and on novel industrial measurements from products, a central part of the improvement could be achieved by addressing the functional unit, the data inventory, and ENM freshwater ecotoxicity characterization factors (CFs). In order to derive freshwater (European continent) ecotoxicity CFs, at midpoint level, for metal(-oxide) ENMs, a fate and effect model was set up. The fate part was based on peri-kinetic aggregation (Brownian motion), ortho-kinetic aggregation (fluid motion), differential settling (sedimentation), resuspension and dissolution of ENMs. The effect part was based on three freshwater trophic levels (algae, daphnia and fish). The derived CFs range from … (α=0.01) to 3.20E-03 (801-1000 nm, α=1) PAF·m3·day/kg for Ag, TiO2, and ZnO ENMs, respectively. In terms of toxicity level the derived CFs show that Ag>ZnO>TiO2. The CFs can be applied, but should be considered interim. A LCA case study was performed on five ENM products.

  19. Numerical simulation and analysis of fuzzy PID and PSD control methodologies as dynamic energy efficiency measures

    International Nuclear Information System (INIS)

    Ardehali, M.M.; Saboori, M.; Teshnelab, M.

    2004-01-01

    Energy efficiency enhancement is achieved by utilizing control algorithms that reduce overshoots and undershoots, as well as unnecessary fluctuations, in the amount of energy input to energy-consuming systems during transient operation periods. It is hypothesized that the application of control methodologies with characteristics that change with time and according to the system dynamics, identified as dynamic energy efficiency measures (DEEM), achieves the desired enhancement. The objective of this study is to simulate and analyze the effects of fuzzy logic based tuning of proportional integral derivative (F-PID) and proportional sum derivative (F-PSD) controllers for a heating and cooling energy system while accounting for the dynamics of the major system components. The procedure to achieve the objective includes utilization of fuzzy logic rules to determine the PID and PSD controllers' gain coefficients, so that the control laws for regulating the heat exchanger's heating or cooling energy inputs are determined in each time step of the operation period. The performances of the F-PID and F-PSD controllers are measured by means of two cost functions that are based on quadratic forms of the energy input and the deviation from a set point temperature. It is found that application of the F-PID control algorithm, as a DEEM, results in lower costs for energy input and deviation from a set point temperature by 24% and 17% as compared to a PID and by 13% and 8% as compared to a PSD, respectively. It is also shown that the F-PSD performance is better than that of the F-PID controller.
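
    The general shape of such a scheme can be sketched as a discrete PID step whose gains are rescaled by simple rules on the error magnitude, a crude stand-in for a full fuzzy rule base; all gains, thresholds, and scaling factors below are invented.

        def fuzzy_scaled_pid_step(error, integral, prev_error, dt,
                                  kp=2.0, ki=0.5, kd=0.1):
            """One discrete PID step with rule-based gain adjustment."""
            e = abs(error)
            if e > 5.0:      # "large error": act aggressively, damp integral
                kp, ki = kp * 1.5, ki * 0.5
            elif e < 0.5:    # "small error": soften P, let I remove offset
                kp, ki = kp * 0.7, ki * 1.5
            integral += error * dt
            derivative = (error - prev_error) / dt
            u = kp * error + ki * integral + kd * derivative
            return u, integral

        # One step: setpoint 22 C, measurement 19 C, previous error 2.5 C, dt 1 s
        u, integral = fuzzy_scaled_pid_step(3.0, 0.0, 2.5, 1.0)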

  20. Improving timeliness and efficiency in the referral process for safety net providers: application of the Lean Six Sigma methodology.

    Science.gov (United States)

    Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A

    2010-01-01

    Designated primary care clinics largely serve low-income and uninsured patients who present a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly the access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.

  1. A methodology to determine the level of automation to improve the production process and reduce the ergonomics index

    Science.gov (United States)

    Chan-Amaya, Alejandro; Anaya-Pérez, María Elena; Benítez-Baltazar, Víctor Hugo

    2017-08-01

    Companies are constantly looking for improvements in productivity to increase their competitiveness, and the use of automation technologies is a tool that has been proven effective in achieving this. There are companies that are not familiar with the process of acquiring automation technologies; therefore, they abstain from investing and thereby miss the opportunity to take advantage of it. The present document proposes a methodology to determine the level of automation appropriate for the production process, so as to improve production while taking the ergonomics factor into consideration.

  2. A methodology for interpretation of overcoring stress measurements in anisotropic rock

    International Nuclear Information System (INIS)

    Hakala, M.; Sjoeberg, J.

    2006-11-01

    The in situ state of stress is an important parameter for the design of a repository for final disposal of spent nuclear fuel. This report presents work conducted to improve the quality of overcoring stress measurements, focused on the interpretation of overcoring rock stress measurements when accounting for possible anisotropic behavior of the rock. The work comprised: (i) development/upgrading of a computer code for calculating stresses from overcoring strains for anisotropic materials and for a general overcoring probe configuration (up to six strain rosettes with six gauges each), (ii) development of a computer code for determining elastic constants for transversely isotropic rocks from biaxial testing, and (iii) analysis of case studies of selected overcoring measurements in both isotropic and anisotropic rocks from the Posiva and SKB sites in Finland and Sweden, respectively. The work was principally limited to transversely isotropic materials, although the stress calculation code is applicable also to orthotropic materials. The developed computer codes have been geared to work primarily with the Borre and CSIRO HI three-dimensional overcoring measurement probes. Application of the codes to selected case studies, showed that the developed tools were practical and useful for interpreting overcoring stress measurements conducted in anisotropic rock. A quantitative assessment of the effects of anisotropy may thus be obtained, which provides increased reliability in the stress data. Potential gaps in existing data and/or understanding can also be identified. (orig.)

  3. On the Ground or in the Air? A Methodological Experiment on Crop Residue Cover Measurement in Ethiopia.

    Science.gov (United States)

    Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash

    2017-10-01

    Maintaining permanent coverage of the soil using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field level crop residue coverage for a given plot, each with its own implication on survey budget, implementation speed and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods are compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark include: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee with visual-aid without visiting the field; (iv) enumerator with visual-aid visiting the field; (v) field picture collected with a drone and analyzed with image-processing methods and (vi) satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement errors are total farm size, field size, distance, and slope. Results deliver a ranking of measurement options that can inform survey practitioners and researchers.
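
    For context, the line-transect benchmark is typically scored by stretching a knotted line across the field and recording at each knot whether residue lies beneath it; percent cover is then the fraction of hits. A minimal sketch with invented observations, including the >=30% classification used in the paper's categorical analysis:

        def line_transect_cover(hits):
            """Percent residue cover from per-knot observations (True = residue)."""
            return 100.0 * sum(hits) / len(hits)

        # 20 knot observations along one transect (hypothetical)
        obs = [True, False, True, True, False, False, True, False, True, True,
               False, True, False, False, True, True, False, True, False, True]
        cover = line_transect_cover(obs)
        print(f"{cover:.0f}% cover -> {'>=30%' if cover >= 30 else '<30%'} class")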

  5. Single Case Method in Psychology: How to Improve as a Possible Methodology in Quantitative Research.

    Science.gov (United States)

    Krause-Kjær, Elisa; Nedergaard, Jensine I

    2015-09-01

    Awareness of including the Single-Case Method (SCM) as a possible methodology in quantitative research in the field of psychology has been argued to be useful, e.g., by Hurtado-Parrado and López-López (IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). Their article introduces a historical and conceptual analysis of SCMs and proposes changing the often prevailing tendency of neglecting SCM as an alternative to Null Hypothesis Significance Testing (NHST). This article contributes by putting new light on SCM as an equally important methodology in psychology. The intention of the present article is to elaborate this point of view further by discussing one of the most fundamental requirements, as well as main characteristics, of SCM regarding temporality: namely, that "…performance is assessed continuously over time and under different conditions…" (Hurtado-Parrado and López-López, IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). When defining principles for particular units of analysis, both synchronic (spatial) and diachronic (temporal) elements should be incorporated. In this article, misunderstandings of the SCM are adduced, and temporality is described in order to propose how the SCM could gain wider usability in psychological research. It is further discussed how to implement SCM in psychological methodology. It is suggested that one solution might be to reconsider the notion of time in psychological research to cover more than a control variable, and in this respect also to include the notion of time as an irreversible unity within life.

  6. Integration of Value Stream Map and Healthcare Failure Mode and Effect Analysis into Six Sigma Methodology to Improve Process of Surgical Specimen Handling

    Directory of Open Access Journals (Sweden)

    Sheng-Hui Hung

    2015-01-01

    Full Text Available Specimen handling is a critical patient safety issue. Problematic handling process, such as misidentification (of patients, surgical site, and specimen counts), specimen loss, or improper specimen preparation can lead to serious patient harms and lawsuits. Value stream map (VSM) is a tool used to find out non-value-added works, enhance the quality, and reduce the cost of the studied process. On the other hand, healthcare failure mode and effect analysis (HFMEA) is now frequently employed to avoid possible medication errors in healthcare process. Both of them have a goal similar to Six Sigma methodology for process improvement. This study proposes a model that integrates VSM and HFMEA into the framework, which mainly consists of define, measure, analyze, improve, and control (DMAIC), of Six Sigma. A Six Sigma project for improving the process of surgical specimen handling in a hospital was conducted to demonstrate the effectiveness of the proposed model.

  7. Integration of Value Stream Map and Healthcare Failure Mode and Effect Analysis into Six Sigma Methodology to Improve Process of Surgical Specimen Handling.

    Science.gov (United States)

    Hung, Sheng-Hui; Wang, Pa-Chun; Lin, Hung-Chun; Chen, Hung-Ying; Su, Chao-Ton

    2015-01-01

    Specimen handling is a critical patient safety issue. Problematic handling process, such as misidentification (of patients, surgical site, and specimen counts), specimen loss, or improper specimen preparation can lead to serious patient harms and lawsuits. Value stream map (VSM) is a tool used to find out non-value-added works, enhance the quality, and reduce the cost of the studied process. On the other hand, healthcare failure mode and effect analysis (HFMEA) is now frequently employed to avoid possible medication errors in healthcare process. Both of them have a goal similar to Six Sigma methodology for process improvement. This study proposes a model that integrates VSM and HFMEA into the framework, which mainly consists of define, measure, analyze, improve, and control (DMAIC), of Six Sigma. A Six Sigma project for improving the process of surgical specimen handling in a hospital was conducted to demonstrate the effectiveness of the proposed model.

  8. Improvement in Product Development: Use of back-end data to support upstream efforts of Robust Design Methodology

    Directory of Open Access Journals (Sweden)

    Vanajah Siva

    2012-12-01

    Full Text Available In the area of Robust Design Methodology (RDM), less is done on how to use and work with data from the back-end of the product development process to support upstream improvement. The purpose of this paper is to suggest RDM practices for the use of customer claims data in early design phases as a basis for improvements. The back-end data, when systematically analyzed and fed back into the product development process, aids in closing the product development loop from claims to improvement in the design phase. This is proposed through a flow of claims data analysis tied to an existing tool, namely Failure Mode and Effects Analysis (FMEA). The systematic and integrated analysis of back-end data is suggested as an upstream effort of RDM to increase understanding of noise factors during product usage, based on the feedback of claims data to FMEA, and to address continuous improvement in product development.
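
    One way claims data can feed an FMEA, sketched under assumed rating scales: claim rates are mapped to occurrence ratings and combined into the risk priority number RPN = severity x occurrence x detection. The failure modes, thresholds, and ratings below are invented.

        def occurrence_rating(claims_per_1000):
            """Map a claim rate to a 1-10 FMEA occurrence rating (assumed scale)."""
            thresholds = [0.1, 0.5, 1, 2, 5, 10, 20, 50, 100]
            return 1 + sum(claims_per_1000 > t for t in thresholds)

        # failure mode: (severity 1-10, detection 1-10, claims per 1000 units)
        modes = {"seal leakage": (8, 4, 12.0), "switch failure": (5, 3, 0.8)}
        for name, (sev, det, rate) in modes.items():
            print(name, "RPN =", sev * occurrence_rating(rate) * det)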

  9. Developing the Service Template: From measurement to agendas for improvement

    OpenAIRE

    Williams, CS; Saunders, M

    2007-01-01

    Traditional survey-based measures of service quality are argued to be problematic when reflecting individual services and turning measurement into action. This paper reviews developments to an alternative measurement approach, the Service Template Process, and offers an extension to it. The extended process appears able to measure service users' and deliverers' perceptions of service quality independently. It also enables participants to jointly agree an agenda for quality improvement. The e...

  10. Measuring the impact of methodological research: a framework and methods to identify evidence of impact.

    Science.gov (United States)

    Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F

    2014-11-27

    Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally. We have used the information

  11. The Holistic Targeting (HOT) methodology as the means to improve Information Operations (IO) target development and prioritization

    OpenAIRE

    Ieva, Christopher S.

    2008-01-01

    Prioritization. In response to this challenge, this study proposes five recommendations to enhance IO integration into the Joint Targeting Cycle: the use of interim IO Joint Munitions Effectiveness Manual (JMEM) techniques to better forecast cognitive effects, the adoption of the Measure of Worth (MOW) model to assess IO effects, the HOT methodology to develop and prioritize IO targets, the use of compendium software to facilitate targeting problem understanding and the network analysis to...

  12. Methodologies for the measurement of bone density and their precision and accuracy

    International Nuclear Information System (INIS)

    Goodwin, P.N.

    1987-01-01

    Radiographic methods of determining bone density have been available for many years, but recently most of the efforts in this field have focused on the development of instruments which would accurately and automatically measure bone density by absorption, or by the use of x-ray computed tomography (CT). Single energy absorptiometers using I-125 have been available for some years, and have been used primarily for measurements on the radius, although recently equipment for measuring the os calcis has become available. Accuracy of single energy measurements is about 3% to 5%; precision, which has been poor because of the difficulty of exact repositioning, has recently been improved by automatic methods so that it now approaches 1% or better. Dual energy sources offer the advantages of greater accuracy and the ability to measure the spine and other large bones. A number of dual energy scanners are now on the market, mostly using gadolinium-153 as a source. Dual energy scanning is capable of an accuracy of a few percent, but the precision when scanning patients can vary widely, due to the difficulty of comparing exactly the same areas; 2 to 4% would appear to be typical. Quantitative computed tomography (QCT) can be used to directly measure the trabecular bone within the vertebral body. The accuracy of single-energy QCT is affected by the amount of marrow fat present, which can lead to underestimations of 10% or more. An increase in marrow fat would cause an apparent decrease in bone mineral. However, the precision can be quite good, 1% or 2% on phantoms, and nearly as good on patients when four vertebrae are averaged. Dual energy scanning can correct for the presence of fat, but is less precise, and not available on all CT units. 52 references
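
    As a worked illustration of the dual-energy principle mentioned above (our own numbers, not from the text): attenuation measured at two photon energies gives two linear equations that can be solved for bone-mineral and soft-tissue areal densities.

      # Illustrative dual-energy decomposition with invented coefficients.
      import numpy as np

      mu = np.array([[0.50, 0.20],   # mass attenuation at low energy: [bone, soft]
                     [0.30, 0.18]])  # mass attenuation at high energy
      measured = np.array([0.90, 0.60])           # ln(I0/I) at the two energies
      bone, soft = np.linalg.solve(mu, measured)
      print(f"bone = {bone:.2f} g/cm^2, soft tissue = {soft:.2f} g/cm^2")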

  13. A simple and efficient methodology to improve geometric accuracy in gamma knife radiation surgery: implementation in multiple brain metastases.

    Science.gov (United States)

    Karaiskos, Pantelis; Moutsatsos, Argyris; Pappas, Eleftherios; Georgiou, Evangelos; Roussakis, Arkadios; Torrens, Michael; Seimenis, Ioannis

    2014-12-01

    To propose, verify, and implement a simple and efficient methodology for the improvement of total geometric accuracy in multiple brain metastases gamma knife (GK) radiation surgery. The proposed methodology exploits the directional dependence of magnetic resonance imaging (MRI)-related spatial distortions stemming from background field inhomogeneities, also known as sequence-dependent distortions, with respect to the read-gradient polarity during MRI acquisition. First, an extra MRI pulse sequence is acquired with the same imaging parameters as those used for routine patient imaging, aside from a reversal in the read-gradient polarity. Then, "average" image data are compounded from data acquired from the 2 MRI sequences and are used for treatment planning purposes. The method was applied and verified in a polymer gel phantom irradiated with multiple shots in an extended region of the GK stereotactic space. Its clinical impact on dose delivery accuracy was assessed in 15 patients with a total of 96 relatively small (…) targets in the studied series. Due to these uncertainties, a considerable underdosage (5%-32% of the prescription dose) was found in 33% of the studied targets. The proposed methodology is simple and straightforward in its implementation. Regarding multiple brain metastases applications, the suggested approach may substantially improve total GK dose delivery accuracy in smaller, outlying targets. Copyright © 2014 Elsevier Inc. All rights reserved.
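
    The compounding step lends itself to a minimal sketch (our illustration, not the authors' code): sequence-dependent distortion displaces structures in opposite directions for opposite read-gradient polarities, so a voxel-wise average restores them to their true position to first order.

      # 1-D toy example: opposite shifts cancel in the voxel-wise average.
      import numpy as np

      true_profile = np.zeros(64)
      true_profile[30:34] = 1.0                # a small target at voxels 30-33

      img_forward = np.roll(true_profile, +2)  # shifted by the distortion
      img_reversed = np.roll(true_profile, -2) # shift reverses with polarity

      img_average = 0.5 * (img_forward + img_reversed)
      centroid = (img_average * np.arange(64)).sum() / img_average.sum()
      print(centroid)  # 31.5, matching the true target centre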

  14. Study to improve recriticality evaluation methodology after severe accident (Joint research)

    International Nuclear Information System (INIS)

    Kugo, Teruhiko; Ishikawa, Makoto; Nagaya, Yasunobu; Yokoyama, Kenji; Fukaya, Yuji; Maruyama, Hiromi; Kondo, Takao; Minato, Hirokazu; Tsuchiya, Akiyuki; Ishii, Yoshihiko; Fujimura, Koji

    2014-03-01

    The present report summarizes the results of a 2-year cooperative study between JAEA and Hitachi-GE Nuclear Energy intended to contribute to the settlement of the Fukushima-Daiichi nuclear power plants, which suffered a severe accident in March 2011. In the Fukushima-Daiichi plants, it is considered that the nuclear fuel in the core melted down owing to the loss of the ultimate heat sink following the station blackout after the 2011 off the Pacific coast of Tohoku Earthquake. The position and/or the mass of the melted fuel are not yet known. Therefore, the possibility of recriticality events and the severity of a postulated recriticality are uncertain. In the present study, the possible scenarios leading to recriticality events in Fukushima-Daiichi were investigated first. Then, an analytical methodology to evaluate time-dependent recriticality events was developed by modelling the reactivity insertion rate and the possible feedback according to the recriticality scenarios identified in the first step. The methodology developed here has been implemented as a transient simulation tool, PORCAS, which is operated on a multi-purpose platform for reactor analysis, MARBLE. Finally, the radiation exposure rates from postulated recriticality events in Fukushima-Daiichi were approximately evaluated to estimate the impact on the public environment. (author)
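
    The report's internals for PORCAS are not given; as a hedged illustration of the kind of model the abstract describes — a reactivity insertion rate combined with a power feedback — here is a one-group point-kinetics sketch with invented constants:

      # Toy point-kinetics transient: reactivity ramp plus negative feedback.
      beta, lam, Lam = 0.0065, 0.08, 1e-4  # delayed fraction, decay (1/s), gen. time (s)
      alpha = 1e-3                         # feedback coefficient (dk per unit power)
      ramp = 1e-4                          # reactivity insertion rate (dk/s)
      dt, t = 1e-4, 0.0
      n = 1.0                              # relative power
      c = beta / (lam * Lam) * n           # equilibrium precursor concentration
      while t < 5.0:
          rho = ramp * t - alpha * (n - 1.0)
          dn = ((rho - beta) / Lam * n + lam * c) * dt
          dc = (beta / Lam * n - lam * c) * dt
          n, c, t = n + dn, c + dc, t + dt
      print(f"relative power after 5 s: {n:.3f}")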

  15. Net ecosystem carbon dioxide exchange in tropical rainforests - sensitivity to environmental drivers and flux measurement methodology

    Science.gov (United States)

    Fu, Z.; Stoy, P. C.

    2017-12-01

    Tropical rainforests play a central role in the Earth system services of carbon metabolism, climate regulation, biodiversity maintenance, and more. They are under threat by direct anthropogenic effects including deforestation and indirect anthropogenic effects including climate change. A synthesis of the factors that determine the net ecosystem exchange of carbon dioxide (NEE) across multiple time scales in different tropical rainforests has not been undertaken to date. Here, we study NEE and its components, gross primary productivity (GPP) and ecosystem respiration (RE), across thirteen tropical rainforest research sites with 63 total site-years of eddy covariance data. Results reveal that the five ecosystems that have greater carbon uptakes (with the magnitude of GPP greater than 3000 g C m⁻² y⁻¹) sequester less carbon - or even lose it - on an annual basis at the ecosystem scale. This counterintuitive result is because high GPP is compensated by similar magnitudes of RE. Sites that provided subcanopy CO2 storage observations had higher average magnitudes of GPP and RE and consequently lower NEE, highlighting the importance of measurement methodology for understanding carbon dynamics in tropical rainforests. Vapor pressure deficit (VPD) constrained GPP at all sites, but to differing degrees. Many environmental variables are significantly related to NEE at time scales greater than one year, and NEE at a rainforest in Malaysia is significantly related to soil moisture variability at seasonal time scales. Climate projections from 13 general circulation models (CMIP5) under representative concentration pathway (RCP) 8.5 suggest that many current tropical rainforest sites on the cooler end of the current temperature range are likely to reach a climate space similar to present-day warmer sites by the year 2050, and warmer sites will reach a climate space not currently experienced. Results demonstrate the need to quantify if mature tropical trees acclimate to heat and...

  16. Improving Quality Of Spectrum Measurement By Event - Event Coincidence Technique

    International Nuclear Information System (INIS)

    Pham Dinh Khang; Doan Trong Thu; Nguyen Duc Hoa; Nguyen An Son; Nguyen Xuan Hai; Ho Huu Thang

    2011-01-01

    To improve the quality of measurement data for research on level density and the gamma strength function in the intermediate energy region below the neutron binding energy (Bn), a new method was developed at the Dalat Nuclear Research Institute. This method improves the peak-to-Compton-background count ratio several times over. The results are evaluated and compared with those of other methods. (author)

  17. Using Lean Six Sigma Methodology to Improve a Mass Immunizations Process at the United States Naval Academy.

    Science.gov (United States)

    Ha, Chrysanthy; McCoy, Donald A; Taylor, Christopher B; Kirk, Kayla D; Fry, Robert S; Modi, Jitendrakumar R

    2016-06-01

    Lean Six Sigma (LSS) is a process improvement methodology developed in the manufacturing industry to increase process efficiency while maintaining product quality. The efficacy of LSS application to the health care setting has not been adequately studied. This article presents a quality improvement project at the U.S. Naval Academy that uses LSS to improve the mass immunizations process for Midshipmen during in-processing. The process was standardized to give all vaccinations at one station instead of giving a different vaccination at each station. After project implementation, the average immunizations lead time decreased by 79% and staffing decreased by 10%. The process was shown to be in control with a capability index of 1.18 and performance index of 1.10, resulting in a defect rate of 0.04%. This project demonstrates that the LSS methodology can be applied successfully to the health care setting to make sustainable process improvements if used correctly and completely. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
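
    The reported capability figures can be sanity-checked under a normal-distribution assumption (our arithmetic, not the authors'): an index of about 1.1-1.2 puts the nearer specification limit 3.3-3.5 standard deviations from the mean, whose tail area is of the same order as the quoted 0.04% defect rate.

      # Tail fraction beyond the nearer specification limit, assuming normality.
      from statistics import NormalDist

      for index in (1.18, 1.10):
          tail = NormalDist().cdf(-3 * index)
          print(f"index {index}: {tail:.4%} beyond the nearer limit")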

  18. Clinical Performance Measures and Quality Improvement System Considerations for Dental Education.

    Science.gov (United States)

    Parkinson, Joseph W; Zeller, Gregory G

    2017-03-01

    Quality improvement and quality assurance programs are an integral part of providing excellence in health care delivery. The Dental Quality Alliance and the Commission on Dental Accreditation recognize this and have created standards and recommendations to advise health care providers and health care delivery systems, including dental schools, on measuring the quality of the care delivered to patients. Overall health care expenditures have increased, and the Affordable Care Act has made health care, including dentistry, available to more people in the United States. These increases in cost and in the number of patients accessing care contribute to a heightened interest in measurable quality improvement outcomes that reflect efficiency, effectiveness, and overall value. Practitioners and administrators, both in academia and in the "real world," need an understanding of various quality improvement methodologies available in order to select approaches that support effective monitoring of the quality of care delivered. This article compares and contrasts various quality improvement approaches, programs, and systems currently in use in order to assist dental providers and administrators in choosing quality improvement methodologies pertinent to their practice or institution.

  19. The impact of multi-criteria performance measurement on business performance improvement

    Directory of Open Access Journals (Sweden)

    Fentahun Moges Kasie

    2013-06-01

    Full Text Available Purpose: The purpose of this paper is to investigate the relationship between multi-criteria performance measurement (MCPM) practice and business performance improvement using raw data collected from 33 selected manufacturing companies. In addition, it proposes a modified MCPM model as an effective approach to improve the business performance of manufacturing companies. Design/methodology/approach: Research paper. Primary and secondary data were collected using a questionnaire survey, interviews and observation of records. The methodology is to evaluate the business performance of the sampled manufacturing companies and the extent of utilization of crucial financial (lagging) and non-financial (leading) performance measures. The positive correlation between financial business performance and the practice of MCPM is clearly shown using Pearson's correlation coefficient analysis. Findings: This research paper indicates that companies which measure their performance using important financial and non-financial measures achieve better business performance. Even though certain companies are currently using non-financial measures, the researchers have learned that these measures were not integrated with each other, with financial measures, or with strategic objectives. Research limitations/implications: The limitation of this paper is that the number of surveyed companies is too small to permit generalization, and they are located in a single country. Further research incorporating a large number of companies from various developing nations is suggested to minimize this limitation. Practical implications: The paper shows that multi-dimensional performance measures, with the inclusion of key leading indicators, are essential to predict the future environment, whereas cost-accounting-based financial measures are inadequate to do so. This is shown practically using Pearson's correlation coefficient analysis. Originality/value: The significance of multi...
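
    As a toy illustration of the Pearson analysis used above (hypothetical scores, not the survey data):

      # Pearson's r between an MCPM-practice score and a performance score.
      import numpy as np
      from scipy.stats import pearsonr

      mcpm_score = np.array([2.1, 3.4, 4.0, 2.8, 3.9, 4.5])   # extent of MCPM use
      performance = np.array([1.8, 3.1, 4.2, 2.5, 3.6, 4.8])  # business performance

      r, p = pearsonr(mcpm_score, performance)
      print(f"r = {r:.2f}, p = {p:.3f}")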

  20. Contribution for the improvement of pressurized thermal shock assessment methodologies in PWR pressure vessels

    International Nuclear Information System (INIS)

    Gomes, Paulo de Tarso Vida

    2005-01-01

    The structural integrity assessment of nuclear reactor pressure vessels with respect to Pressurized Thermal Shock (PTS) accidents became a necessity and has been investigated since the 1980s. The recognition of the importance of PTS assessment has led the international nuclear technology community to devote a considerable research effort to the complete integrity assessment process of Reactor Pressure Vessels (VPR). Researchers in Europe, Japan and the U.S.A. have concentrated efforts on VPR structural and fracture analysis, conducting experiments to better understand how specific factors act on the behavior of discontinuities under PTS loading conditions. The main goal of this work is to study the structural behavior of an 'in scale' PWR nuclear reactor pressure vessel model, containing actual discontinuities, under loading conditions generated by a pressurized thermal shock. To construct the pressure vessel model used in this research, the approach developed by Barroso (1995), based on likelihood studies related to thermal-hydraulic behavior during the PTS, was employed. To achieve the objective of this research, a new methodology to generate cracks with known geometry and localization in the vessel model wall was developed. Additionally, a hydraulic circuit, able to flood the vessel model, heated to 300 deg C, with 10 m³ of water at 8 deg C in 170 seconds, was built. Thermo-hydraulic calculations using the RELAP5/MOD 3.2.2γ computational code were done to estimate the temperature profiles during the cooling time. The resulting data supported the thermo-structural calculations that were accomplished using the ANSYS 7.01 computational code, for both 2D and 3D models. The stress profiles obtained from these calculations were then associated with fracture mechanics concepts to assess the crack growth behavior in the VPR model wall. After the PTS test, the VPR model was submitted to destructive and non-destructive inspections. The results...

  1. USING A NEW SUPPLY CHAIN PLANNING METHODOLOGY TO IMPROVE SUPPLY CHAIN EFFICIENCY

    Directory of Open Access Journals (Sweden)

    A.L.V. Raubenheimer

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Effective supply chain planning and management has emerged as one of the most challenging opportunities for companies in the global economy during the last decade or two. This article reviews the evolution of Supply Chain Management and the traditional Supply Chain Solutions. It then introduces a new Supply Chain Planning methodology in which simulation modelling plays an important value-adding role to help organisations understand the dynamics of their Supply Chains.

    AFRIKAANSE OPSOMMING (translated): Effective supply chain planning and management has developed over the last two decades into one of the most challenging opportunities for enterprises in the world economy. This article briefly reviews the development of supply chain management and the traditional solutions. A new supply chain planning methodology is then proposed and discussed, in which simulation modelling plays an important role in helping enterprises understand the dynamics of their supply chains.

  2. Improving life cycle assessment methodology for the application of decision support

    DEFF Research Database (Denmark)

    Herrmann, Ivan Tengbjerg

    for the application of decision support and evaluation of uncertainty in LCA. From a decision maker's (DM's) point of view there are at least three main "illness" factors influencing the quality of the information that the DM uses for making decisions. The factors are not independent of each other, but it seems … refrain from making a decision based on an LCA and thus support a decision on other parameters than the LCA environmental parameters. Conversely, it may in some decision support contexts be acceptable to base a decision on highly uncertain information. This all depends on the specific decision support … the different steps. A deterioration of the quality in each step is likely to accumulate through the statistical value chain in terms of increased uncertainty and bias. Ultimately this can make final decision support problematic. The "Law of large numbers" (LLN) is the methodological tool/probability theory …
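
    Since the excerpt invokes the Law of large numbers, a toy illustration (invented emission factors, not from the thesis) of why a mean over many uncertain LCA inputs stabilizes as the sample grows:

      # Sample means of a noisy emission estimate converge as n grows.
      import random

      random.seed(1)
      emissions = [random.gauss(100.0, 30.0) for _ in range(10_000)]  # kg CO2-eq
      for n in (10, 100, 1_000, 10_000):
          print(n, round(sum(emissions[:n]) / n, 2))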

  3. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist.

    Science.gov (United States)

    Terwee, Caroline B; Mokkink, Lidwine B; Knol, Dirk L; Ostelo, Raymond W J G; Bouter, Lex M; de Vet, Henrica C W

    2012-05-01

    The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist to calculate quality scores per measurement property when using the checklist in systematic reviews of measurement properties. The scoring system was developed based on discussions among experts and testing of the scoring system on 46 articles from a systematic review. Four response options were defined for each COSMIN item (excellent, good, fair, and poor). A quality score per measurement property is obtained by taking the lowest rating of any item in a box ("worst score counts"). Specific criteria for excellent, good, fair, and poor quality for each COSMIN item are described. In defining the criteria, the "worst score counts" algorithm was taken into consideration. This means that only fatal flaws were defined as poor quality. The scores of the 46 articles show how the scoring system can be used to provide an overview of the methodological quality of studies included in a systematic review of measurement properties. Based on experience in testing this scoring system on 46 articles, the COSMIN checklist with the proposed scoring system seems to be a useful tool for assessing the methodological quality of studies included in systematic reviews of measurement properties.
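
    A minimal sketch of the "worst score counts" rule described above (our own illustration):

      # The quality score for a COSMIN box is its lowest item rating.
      RANK = {"poor": 0, "fair": 1, "good": 2, "excellent": 3}

      def box_score(item_ratings):
          return min(item_ratings, key=RANK.__getitem__)

      print(box_score(["excellent", "good", "fair", "good"]))  # -> fair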

  4. Updating and improving methodology for prioritizing highway project locations on the strategic intermodal system (SIS).

    Science.gov (United States)

    2016-04-01

    The Florida Department of Transportation (FDOT) District One developed the Congestion Management Process : (CMP) system to prioritize low-cost, near-term highway improvements on the Strategic Intermodal System (SIS). : The existing CMP system is desi...

  5. Impact of lean six sigma process improvement methodology on cardiac catheterization laboratory efficiency.

    Science.gov (United States)

    Agarwal, Shikhar; Gallo, Justin J; Parashar, Akhil; Agarwal, Kanika K; Ellis, Stephen G; Khot, Umesh N; Spooner, Robin; Murat Tuzcu, Emin; Kapadia, Samir R

    2016-03-01

    Operational inefficiencies are ubiquitous in several healthcare processes. To improve the operational efficiency of our catheterization laboratory (Cath Lab), we implemented a lean six sigma process improvement initiative, starting in June 2010. We aimed to study the impact of lean six sigma implementation on improving the efficiency and the patient throughput in our Cath Lab. All elective and urgent cardiac catheterization procedures, including diagnostic coronary angiography, percutaneous coronary interventions, structural interventions and peripheral interventions, performed between June 2009 and December 2012 were included in the study. Performance metrics utilized for analysis included turn-time, physician downtime, on-time patient arrival, on-time physician arrival, on-time start and manual sheath-pulls inside the Cath Lab. After implementation of lean six sigma in the Cath Lab, we observed a significant improvement in turn-time, physician downtime, on-time patient arrival, on-time physician arrival, on-time start as well as sheath-pulls inside the Cath Lab. The percentage of cases with optimal turn-time increased from 43.6% in 2009 to 56.6% in 2012 (p-trend < 0.001), demonstrating the impact of the process improvement initiative, lean six sigma, on improving and sustaining efficiency of our Cath Lab operation. After the successful implementation of this continuous quality improvement initiative, there was a significant improvement in the selected performance metrics, namely turn-time, physician downtime, on-time patient arrival, on-time physician arrival, on-time start as well as sheath-pulls inside the Cath Lab. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. MODERN CONCEPTS OF THE SIX SIGMA METHODOLOGY FOR IMPROVING THE QUALITY

    Directory of Open Access Journals (Sweden)

    MARGARITA JANESKA

    2018-02-01

    Full Text Available Product quality is generally accepted as being crucial in today's industrial business. The traditional aspects of product quality are connected to product design (translating customer demands into attractive features and technical specifications) and to the design and specification of high performance production processes with low defect rates. Quality management is the general expression for all actions leading to quality. Quality management is focused on improving customer satisfaction through continuous improvement of processes, including the removal of uncertain activities, and continuous improvement of the quality of processes, products and services. Quality management includes four key processes: quality planning, quality assurance, quality control and quality costs. The main accent in this paper is on quality control and the application of one of the quality control tools in order to improve it. Six Sigma is different from other quality improvement concepts in that its framework is comprised of many principles, tools and techniques, which, together with experience, are all integrated and translated into best practices. Bearing in mind that the goal of every company is to work efficiently and effectively in the long run, this paper focuses on Six Sigma as a way to continuously improve quality. Namely, this paper emphasizes the key features of the quality of products/services, the need for the application of Six Sigma for quality assurance, and also a detailed list of tools and techniques that can be used during the implementation of Six Sigma.

  7. The Value of Improved Measurements in a Pig Slaughterhouse

    DEFF Research Database (Denmark)

    Kjærsgaard, Niels Christian

    The pig industry is an essential part of the Danish economy, with an export value in 2006 of more than DKK 28 billion [Danish Meat Association (2007)]. The competition is hard, and potential new competitors from low-cost countries can be expected to enter the traditional Danish export … markets. Therefore it is more important than ever to optimize all aspects of Danish pig production, slaughtering processes and delivery. This paper concerns the aspects of optimization at the slaughterhouses regarding estimation of the value of improved measurements. The slaughterhouse industry differs … investments are expected to improve the quality of the measurements further. This paper concerns the use of Operations Research to solve a practical problem, which is of major importance for the industry, namely to improve the estimation of the economic effects of improved measurements. The benefit …

  8. Methodological considerations for global analysis of cellular FLIM/FRET measurements

    Science.gov (United States)

    Adbul Rahim, Nur Aida; Pelet, Serge; Kamm, Roger D.; So, Peter T. C.

    2012-02-01

    Global algorithms can improve the analysis of fluorescence energy transfer (FRET) measurement based on fluorescence lifetime microscopy. However, global analysis of FRET data is also susceptible to experimental artifacts. This work examines several common artifacts and suggests remedial experimental protocols. Specifically, we examined the accuracy of different methods for instrument response extraction and propose an adaptive method based on the mean lifetime of fluorescent proteins. We further examined the effects of image segmentation and a priori constraints on the accuracy of lifetime extraction. Methods to test the applicability of global analysis on cellular data are proposed and demonstrated. The accuracy of global fitting degrades with lower photon count. By systematically tracking the effect of the minimum photon count on lifetime and FRET prefactors when carrying out global analysis, we demonstrate a correction procedure to recover the correct FRET parameters, allowing us to obtain protein interaction information even in dim cellular regions with photon counts as low as 100 per decay curve.
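
    To indicate what "global analysis" means operationally (a hedged sketch with synthetic, noiseless decays — not the authors' code): lifetimes are shared across pixels while the FRET prefactors vary per pixel, and all curves are fitted simultaneously.

      # Global fit: shared lifetimes, per-pixel prefactors, joint residuals.
      import numpy as np
      from scipy.optimize import least_squares

      t = np.linspace(0.1, 10, 200)          # ns
      tau_donor, tau_fret = 2.5, 0.8         # shared lifetimes (ns)
      fractions = [0.2, 0.6]                 # per-pixel FRET fractions
      decays = [f * np.exp(-t / tau_fret) + (1 - f) * np.exp(-t / tau_donor)
                for f in fractions]

      def residuals(p):
          t1, t2, f1, f2 = p
          model = [f * np.exp(-t / t2) + (1 - f) * np.exp(-t / t1)
                   for f in (f1, f2)]
          return np.concatenate([m - d for m, d in zip(model, decays)])

      fit = least_squares(residuals, x0=[2.0, 1.0, 0.5, 0.5], bounds=(0, np.inf))
      print(fit.x)  # recovers the shared lifetimes and both prefactors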

  9. Improved density measurement by FIR laser interferometer on EAST tokamak

    International Nuclear Information System (INIS)

    Shen, Jie; Jie, Yinxian; Liu, Haiqing; Wei, Xuechao; Wang, Zhengxing; Gao, Xiang

    2013-01-01

    Highlights: • In 2012, the water-cooling Mo wall was installed in EAST. • A Schottky barrier diode detector was designed and used on EAST for the first time. • The three-channel far-infrared laser interferometer can measure the electron density. • The improved measurement and latest experiment results are reported. • The signal obtained in this experimental campaign is much better than that obtained in 2010. -- Abstract: A three-channel far-infrared (FIR) hydrogen cyanide (HCN) laser interferometer has been in operation since 2010 to measure the line-averaged electron density on the experimental advanced superconducting tokamak (EAST). The HCN laser signal is improved by means of a new Schottky barrier diode (SBD) detector. The improved measurement and latest experiment results of the three-channel FIR laser interferometer on the EAST tokamak are reported.

  11. Mercury methylation and reduction potentials in marine water: An improved methodology using ¹⁹⁷Hg radiotracer

    Energy Technology Data Exchange (ETDEWEB)

    Koron, Neza [National Institute of Biology, Marine Biology Station, Fornace 41, 6330 Piran (Slovenia); Bratkic, Arne [Department of Environmental Sciences, 'Jozef Stefan' Institute, Jamova 39, 1000 Ljubljana (Slovenia); Ribeiro Guevara, Sergio, E-mail: ribeiro@cab.cnea.gov.ar [Laboratorio de Analisis por Activacion Neutronica, Centro Atomico Bariloche, Av. Bustillo km 9.5, 8400 Bariloche (Argentina); Vahcic, Mitja; Horvat, Milena [Department of Environmental Sciences, 'Jozef Stefan' Institute, Jamova 39, 1000 Ljubljana (Slovenia)

    2012-01-15

    A highly sensitive laboratory methodology for the simultaneous determination of methylation and reduction of spiked inorganic mercury (Hg²⁺) in marine water, labelled with a high specific activity radiotracer (¹⁹⁷Hg prepared from enriched ¹⁹⁶Hg stable isotope), was developed. A conventional extraction protocol for methylmercury (CH₃Hg⁺) was modified in order to significantly reduce the partitioning of interfering labelled Hg²⁺ into the final extract, thus allowing the detection of as little as 0.1% of the Hg²⁺ spike transformed to labelled CH₃Hg⁺. The efficiency of the modified CH₃Hg⁺ extraction procedure was assessed by radiolabelled CH₃Hg⁺ spikes corresponding to concentrations of methylmercury between 0.05 and 4 ng L⁻¹. The recoveries were 73.0 ± 6.0% and 77.5 ± 3.9% for marine and MilliQ water, respectively. The reduction potential was assessed by purging and trapping the radiolabelled elemental Hg in a permanganate solution. The method allows detection of the reduction of as little as 0.001% of the labelled Hg²⁺ spiked to natural waters. To our knowledge, the optimised methodology is among the most sensitive available to study Hg methylation and reduction potentials, allowing experiments to be done at spikes close to natural levels (1-10 ng L⁻¹). - Highlights: • Inorganic mercury methylation and reduction in marine water were studied. • High specific activity ¹⁹⁷Hg was used to label Hg²⁺ spikes at natural levels. • Methylmercury extraction had 73% efficiency for 0.05-4 ng L⁻¹ levels. • High sensitivity to assess methylation potentials, below 0.1% of the spike. • High sensitivity also for reduction potentials, as low as 0.001% of the spike.

  12. Validity and reliability of using photography for measuring knee range of motion: a methodological study

    Directory of Open Access Journals (Sweden)

    Adie Sam

    2011-04-01

    Full Text Available Abstract Background The clinimetric properties of knee goniometry are essential to appreciate in light of its extensive use in the orthopaedic and rehabilitative communities. Intra-observer reliability is thought to be satisfactory, but the validity and inter-rater reliability of knee goniometry often demonstrate unacceptable levels of variation. This study tests the validity and reliability of measuring knee range of motion using goniometry and photographic records. Methods Design: Methodology study assessing the validity and reliability of one method ('Marker Method'), which uses a skin marker over the greater trochanter, and another method ('Line of Femur Method'), which requires estimation of the line of femur. Setting: Radiology and orthopaedic departments of two teaching hospitals. Participants: 31 volunteers (13 arthritic and 18 healthy subjects). Knee range of motion was measured radiographically and photographically using a goniometer. Three assessors were assessed for reliability and validity. Main outcomes: Agreement between methods and within raters was assessed using concordance correlation coefficients (CCCs). Agreement between raters was assessed using intra-class correlation coefficients (ICCs). 95% limits of agreement for the mean difference for all paired comparisons were computed. Results Validity (referenced to radiographs): Each method for all 3 raters yielded very high CCCs for flexion (0.975 to 0.988), and moderate to substantial CCCs for extension angles (0.478 to 0.678). The mean differences and 95% limits of agreement were narrower for flexion than they were for extension. Intra-rater reliability: For flexion and extension, very high CCCs were attained for all 3 raters for both methods, with slightly greater CCCs seen for flexion (CCCs varied from 0.981 to 0.998). Inter-rater reliability: For both methods, very high ICCs (min to max: 0.891 to 0.995) were obtained for flexion and extension. Slightly higher coefficients were obtained...

  13. Evaluation of methodological aspects of digestibility measurements in ponies fed different haylage to concentrate ratios.

    Science.gov (United States)

    Schaafstra, F J W C; van Doorn, D A; Schonewille, J T; van Riet, M M J; Visser, P; Blok, M C; Hendriks, W H

    2017-11-01

    Methodological aspects of digestibility measurements were studied in four Welsh pony geldings consuming haylage-based diets with increasing proportions of a pelleted concentrate according to a 4×4 Latin square design experiment. Ponies were fed four experimental, iso-energetic (net energy (NE) basis) diets (i.e. 22 MJ NE/day) with increasing proportions of a pelleted concentrate (C) in relation to haylage (H). The absolute amounts of diet dry matter fed per day were 4.48 kg of H (100H), 3.36 and 0.73 kg of H and C (75H25C), 2.24 and 1.45 kg of H and C (50H50C) and 1.12 and 2.17 kg of H and C (25H75C). Diets were supplemented with minerals, vitamins and TiO2 (3.7 g Ti/day). Voluntary voided faeces were quantitatively collected daily during 10 consecutive days and analysed for moisture, ash, ADL, acid-insoluble ash (AIA) and Ti. A minimum faeces collection period of 6 consecutive days, along with a 14-day period to adapt the animals to the diets and become accustomed to the collection procedure, is recommended to obtain accurate estimations on dry matter digestibility and organic matter digestibility (OMD) in equids fed haylage-based diets supplemented with concentrate. In addition, the recovery of AIA, ADL and Ti was determined and evaluated. Mean faecal recovery over 10 consecutive days across diets for AIA, ADL and Ti was 124.9% (SEM 2.9), 108.7% (SEM 2.0) and 97.5% (SEM 0.9), respectively. Cumulative faecal recovery of AIA significantly differed between treatments, indicating that AIA is inadequate to estimate the OMD in equines. In addition, evaluation of the CV of mean cumulative faecal recoveries obtained by AIA, ADL and Ti showed greater variations in faecal excretion of AIA (9.1) and ADL (7.4) than Ti (3.7). The accuracy of prediction of OMD was higher with the use of Ti than ADL. The use of Ti is preferred as a marker in digestibility trials in equines fed haylage-based diets supplemented with increasing amounts of pelleted concentrate.
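
    The marker arithmetic behind these recoveries can be illustrated with a short sketch (hypothetical concentrations; only the 3.7 g Ti/day dose is taken from the study):

      # Faecal marker recovery and marker-based OM digestibility (OMD).
      marker_fed = 3.7   # g Ti dosed per day (study design)
      marker_out = 3.6   # g Ti recovered in faeces per day (invented)
      print(f"recovery = {100 * marker_out / marker_fed:.1f}%")

      # OMD from marker and OM concentrations in feed and faeces (per kg DM):
      ti_feed, ti_faeces = 0.8, 2.4       # g Ti per kg DM (invented)
      om_feed, om_faeces = 920.0, 850.0   # g OM per kg DM (invented)
      omd = 100 * (1 - (ti_feed / ti_faeces) * (om_faeces / om_feed))
      print(f"OMD = {omd:.1f}%")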

  14. Application of kaizen methodology to foster departmental engagement in quality improvement.

    Science.gov (United States)

    Knechtges, Paul; Decker, Michael Christopher

    2014-12-01

    The Toyota Production System, also known as Lean, is a structured approach to continuous quality improvement that has been developed over the past 50 years to transform the automotive manufacturing process. In recent years, these techniques have been successfully applied to quality and safety improvement in the medical field. One of these techniques is kaizen, which is the Japanese word for "good change." The central tenet of kaizen is the quick analysis of the small, manageable components of a problem and the rapid implementation of a solution with ongoing, real-time reassessment. Kaizen adds an additional "human element": all stakeholders, not just management, must be involved in such change. Because of the small size of the changes involved in a kaizen event and the inherent focus on human factors and change management, a kaizen event can serve as a good introduction to continuous quality improvement for a radiology department. Copyright © 2014. Published by Elsevier Inc.

  15. Probiotics production and alternative encapsulation methodologies to improve their viabilities under adverse environmental conditions.

    Science.gov (United States)

    Coghetto, Chaline Caren; Brinques, Graziela Brusch; Ayub, Marco Antônio Záchia

    2016-12-01

    Probiotic products are dietary supplements containing live microorganisms producing beneficial health effects on the host by improving intestinal balance and nutrient absorption. Among probiotic microorganisms, those classified as lactic acid bacteria are of major importance to the food and feed industries. Probiotic cells can be produced using alternative carbon and nitrogen sources, such as agroindustrial residues, at the same time contributing to reduce process costs. On the other hand, the survival of probiotic cells in formulated food products, as well as in the host gut, is an essential nutritional aspect concerning health benefits. Therefore, several cell microencapsulation techniques have been investigated as a way to improve cell viability and survival under adverse environmental conditions, such as the gastrointestinal milieu of hosts. In this review, different aspects of probiotic cells and technologies of their related products are discussed, including formulation of culture media, and aspects of cell microencapsulation techniques required to improve their survival in the host.

  16. An Improved Methodology for Multidimensional High-Throughput Preformulation Characterization of Protein Conformational Stability

    Science.gov (United States)

    Maddux, Nathaniel R.; Rosen, Ilan T.; Hu, Lei; Olsen, Christopher M.; Volkin, David B.; Middaugh, C. Russell

    2013-01-01

    The Empirical Phase Diagram (EPD) technique is a vector-based multidimensional analysis method for summarizing large data sets from a variety of biophysical techniques. It can be used to provide comprehensive preformulation characterization of a macromolecule’s higher-order structural integrity and conformational stability. In its most common mode, it represents a type of stimulus-response diagram using environmental variables such as temperature, pH, and ionic strength as the stimulus, with alterations in macromolecular structure being the response. Until now EPD analysis has not been available in a high throughput mode because of the large number of experimental techniques and environmental stressor/stabilizer variables typically employed. A new instrument has been developed that combines circular dichroism, UV-absorbance, fluorescence spectroscopy and light scattering in a single unit with a 6-position temperature controlled cuvette turret. Using this multifunctional instrument and a new software system we have generated EPDs for four model proteins. Results confirm the reproducibility of the apparent phase boundaries and protein behavior within the boundaries. This new approach permits two EPDs to be generated per day using only 0.5 mg of protein per EPD. Thus, the new methodology generates reproducible EPDs in high-throughput mode, and represents the next step in making such determinations more routine. PMID:22447621

  17. Application of Response Surface Methodology for the Technological Improvement of Solid Lipid Nanoparticles.

    Science.gov (United States)

    Dal Pizzol, Carine; O'Reilly, Andre; Winter, Evelyn; Sonaglio, Diva; de Campos, Angela Machado; Creczynski-Pasa, Tânia Beatriz

    2016-02-01

    Solid lipid nanoparticles (SLN) are colloidal particles consisting of a matrix composed of solid (at room and body temperatures) lipids dispersed in an aqueous emulsifier solution. During manufacture, their physicochemical properties may be affected by several formulation parameters, such as type and concentration of lipid, proportion of emulsifiers and amount of solvent. Thus, the aim of this work was to study the influence of these variables on the preparation of SLN. A D-optimal Response Surface Methodology design was used to establish a mathematical model for the optimization of SLN. A total of 30 SLN formulations were prepared using the ultrasound method, and then characterized on the basis of their physicochemical properties, including particle size, polydispersity index (PI) and zeta potential (ζ). Particle sizes ranged between 107 and 240 nm. All SLN formulations showed negative ζ and PI values below 0.28. Prediction of the optimal conditions was performed using the desirability function targeting the reduction of all responses. The optimized SLN formulation showed similar theoretical and experimental values, confirming the robustness and predictive ability of the mathematical model for SLN optimization.
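
    As a sketch of the desirability step (our own illustration; only the 107-240 nm size range and the 0.28 PI ceiling are taken from the abstract): each response is rescaled to [0, 1] with smaller values preferred, and the rescaled values are combined by a geometric mean.

      # Smaller-is-better desirability, combined across two responses.
      def desirability(y, y_min, y_max):
          return min(max((y_max - y) / (y_max - y_min), 0.0), 1.0)

      size_d = desirability(150, 107, 240)     # particle size, nm
      pi_d = desirability(0.15, 0.05, 0.28)    # polydispersity index
      overall = (size_d * pi_d) ** 0.5         # geometric mean
      print(f"overall desirability = {overall:.2f}")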

  18. Improvement of Folate Biosynthesis by Lactic Acid Bacteria Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Norfarina Muhamad Nor

    2010-01-01

    Full Text Available Lactic acid bacteria (Lactococcus lactis NZ9000, Lactococcus lactis MG1363, Lactobacillus plantarum I-UL4 and Lactobacillus johnsonii DSM 20553) have been screened for their ability to produce folate intracellularly and/or extracellularly. L. plantarum I-UL4 was shown to be a superior producer of folate compared to the other strains. Statistically based experimental designs were used to optimize the medium formulation for the growth of L. plantarum I-UL4 and folate biosynthesis. The optimal values of important factors were determined by response surface methodology (RSM). The effects of carbon sources, nitrogen sources and para-aminobenzoic acid (PABA) concentrations on folate biosynthesis were determined prior to the RSM study. The biosynthesis of folate by L. plantarum I-UL4 increased from 36.36 to 60.39 µg/L using the optimized medium formulation compared to the selective de Man, Rogosa and Sharpe (MRS) medium. Conditions for the optimal growth of L. plantarum I-UL4 and folate biosynthesis as suggested by RSM were as follows: lactose 20 g/L, meat extract 16.57 g/L and PABA 10 µM.

  19. An improved experimental methodology to evaluate the effectiveness of protective gloves against nanoparticles in suspension.

    Science.gov (United States)

    Vinches, Ludwig; Zemzem, Mohamed; Hallé, Stéphane; Peyro, Caroline; Wilkinson, Kevin J; Tufenkji, Nathalie

    2017-07-01

    Recent studies underline the potential health risks associated with the "nano" revolution, particularly for workers who handle engineered nanoparticles (ENPs), which can be found in the formulation of several commercial products. Although many Health & Safety agencies recommend the use of protective gloves against chemicals, few studies have investigated the effectiveness of these gloves against nanoparticle suspensions. Moreover, the data that are available are often contradictory. This study was designed to evaluate the effectiveness of protective gloves against nanoparticles in suspension. For this purpose, a new methodology was developed in order to take into account parameters encountered in the workplace, such as mechanical deformations (MD) that simulate hand flexion, and sweat. The effects of the precise experimental protocol on the concentrations of nanoparticles detected in the sampling suspension were assessed. Several samples of nitrile rubber gloves (73 µm thick), taken from different boxes, were brought into contact with gold nanoparticles (5 nm) in water. During their exposure to ENPs, the glove samples were submitted to systematic mechanical deformations and placed in contact with a physiological solution simulating human sweat. Under these conditions, results obtained by inductively coupled plasma mass spectrometry (ICPMS) showed that the 5 nm gold nanoparticles passed through the protective gloves. This result was obtained in spite of significant losses observed during the sampling phase, which will be important for future experiments evaluating the effectiveness of these materials.

  20. A novel methodology improves reservoir characterization models using geologic fuzzy variables

    Energy Technology Data Exchange (ETDEWEB)

    Soto B, Rodolfo [DIGITOIL, Maracaibo (Venezuela); Soto O, David A. [Texas A and M University, College Station, TX (United States)

    2004-07-01

    One of the research projects carried out in the Cusiana field to explain its rapid decline during recent years was aimed at obtaining better permeability models. The reservoir of this field has a complex layered system that is not easy to model using conventional methods. The new technique included the development of porosity and permeability maps from cored wells following the same trend as the sand depositions for each facies or layer, according to the sedimentary facies and depositional system models. Then, we used fuzzy logic to reproduce those maps in three dimensions as geologic fuzzy variables. After multivariate statistical and factor analyses, we found independence and a good correlation coefficient between the geologic fuzzy variables and core permeability and porosity. This means the geologic fuzzy variables could explain the fabric, the grain size and the pore geometry of the reservoir rock throughout the field. Finally, we developed a neural network permeability model using porosity, gamma ray and the geologic fuzzy variable as input variables. This model has a cross-correlation coefficient of 0.873 and an average absolute error of 33%, compared with the previous model's correlation coefficient of 0.511 and absolute error greater than 250%. We tested different methodologies, and this new one proved to be a dramatically more promising way to obtain better permeability models. The use of these models has had a high impact on the explanation of well performance and workovers, and on reservoir simulation models. (author)
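
    The notion of a "geologic fuzzy variable" can be illustrated with a triangular membership function (our own toy example, not the authors' implementation), which turns a crisp geologic attribute into a degree of membership usable as a model input:

      # Triangular fuzzy membership for a normalized geologic attribute.
      def triangular(x, a, b, c):
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      print(triangular(0.3, 0.0, 0.5, 1.0))  # membership 0.6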

  1. Measuring performance improvement: total organizational commitment or clinical specialization.

    Science.gov (United States)

    Caron, Aleece; Jones, Paul; Neuhauser, Duncan; Aron, David C

    2004-01-01

    Resources for hospitals are limited when they are faced with multiple publicly reported performance measures as tools to assess quality. The leadership in these organizations may choose to focus on one or two of these outcomes. An alternative approach is that the leadership may commit resources or create conditions that result in improved quality over a broad range of measures. We used aggregated data on mortality, length of stay, and obstetrical outcomes from Greater Cleveland Health Quality Choice data to test these theories. We used Pearson correlation analysis to determine whether outcomes were correlated with one another. We used repeated-measures ANOVA to determine if an association existed between outcome and time, and between outcome and hospital. All of the outcomes across all hospitals demonstrate a trend of overall improvement. Both the Pearson and ANOVA results support the hypothesis of the organization-wide approach to quality improvement. Hospitals that make improvements in one clinical area tend to make improvements in others. Hospitals that produce improvements in limited clinical or administrative areas may not have completely adopted CQI into their culture or may not yet have realized the benefits of their organizational commitments, but use some of the concepts to improve quality outcomes.

  2. An Improved Methodology to Overcome Key Issues in Human Fecal Metagenomic DNA Extraction

    DEFF Research Database (Denmark)

    Kumar, Jitendra; Kumar, Manoj; Gupta, Shashank

    2016-01-01

    Microbes are ubiquitously distributed in nature, and recent culture-independent studies have highlighted the significance of gut microbiota in human health and disease. Fecal DNA is the primary source for the majority of human gut microbiome studies. However, further improvement is needed to obta...

  3. Leveraging Competency Framework to Improve Teaching and Learning: A Methodological Approach

    Science.gov (United States)

    Shankararaman, Venky; Ducrot, Joelle

    2016-01-01

    A number of engineering education programs have defined learning outcomes and course-level competencies, and conducted assessments at the program level to determine areas for continuous improvement. However, many of these programs have not implemented a comprehensive competency framework to support the actual delivery and assessment of an…

  4. Towards a global CO2 calculation standard for supply chains: Suggestions for methodological improvements

    NARCIS (Netherlands)

    Davydenko, I.; Ehrler, V.; Ree, D. de; Lewis, A.; Tavasszy, L.

    2014-01-01

    Improving the efficiency and sustainability of supply chains is a shared aim of the transport industry, its customers, governments as well as industry organisations. To optimize supply chains and for the identification of best practice, standards for their analysis are needed in order to achieve

  5. Training and Action for Patient Safety: Embedding Interprofessional Education for Patient Safety within an Improvement Methodology

    Science.gov (United States)

    Slater, Beverley L.; Lawton, Rebecca; Armitage, Gerry; Bibby, John; Wright, John

    2012-01-01

    Introduction: Despite an explosion of interest in improving safety and reducing error in health care, one important aspect of patient safety that has received little attention is a systematic approach to education and training for the whole health care workforce. This article describes an evaluation of an innovative multiprofessional, team-based…

  6. Accounting and methodological aspects of capital expenditure for land improvement

    Directory of Open Access Journals (Sweden)

    J.P. Melnychuk

    2016-07-01

    Full Text Available The article highlights how capital costs for land improvement are reflected in accounting, and covers the main legislation governing this issue. It also addresses the key issues that ensure proper accounting for capital expenditures on farmland improvement. The study employed such general scientific methods as induction and deduction, the dialectic, historical and systematic methods, and specific methods of accounting. Due to the land reform, ownership of the land changed: lands which were owned by farms have been privatized and have received particular owners, and privatized lands now constitute a significant part of farmland. Land managers require quality accounting information about the composition and state of the land, and about the improvements made to it, in order to manage effectively. Numerous changes in legislation generate controversies of interpretation and, consequently, discrepancies in the cost accounting for capital land improvement, which will affect the amount of future net profit. The article reflects the economic substance of the process and describes in detail a method of accounting for capital expenditure on land improvement in accordance with applicable law.

  7. Control Charts in Healthcare Quality Improvement: A Systematic Review on Adherence to Methodological Criteria

    NARCIS (Netherlands)

    Koetsier, A.; van der Veer, S. N.; Jager, K. J.; Peek, N.; de Keizer, N. F.

    2012-01-01

    Objectives: Use of Shewhart control charts in quality improvement (QI) initiatives is increasing. These charts are typically used in one or more phases of the Plan Do Study Act (PDSA) cycle to monitor summaries of process and outcome data, abstracted from clinical information systems, over time. We

  8. Approaches for improving present laboratory and field methodology for evaluation efficacy of transgenic technologies

    Science.gov (United States)

    Assessing the efficacy of transgenic plants under new environmental and management regimes is of prime importance to the companies which produce new or improve existing transgenic products, to breeders who create different varieties stacked with Bt endotoxins, and to growers who use them for production...

  9. Software to improve spent fuel measurements using the FDET

    International Nuclear Information System (INIS)

    Staples, P.; Beddingfield, D.; Lestone, J.; Pelowitz, D.; Sprinkle, J.; Bytchkov, V.; Starovich, Z.; Harizanov, I.; Vellejo-Luna, J.; Lavender, C.

    2001-01-01

    Full text: Vast quantities of spent fuel are available for safeguards measurements, primarily in the Commonwealth of Independent States (CIS) of the former Soviet Union. This spent fuel, much of which consists of long-cooling-time material, is going to become less unique in the world safeguards arena as reprocessing projects or permanent repositories continue to be delayed or postponed. The long cooling time of many of the spent fuel assemblies in the CIS countries being prepared for intermediate-term storage raises the possibility of increased accuracy in spent fuel assays. An important point to consider for the future, which could advance safeguards measurements for re-verification and inspection, is to determine what safeguards requirements should be imposed upon this 'new' class of spent fuel. Improvements in measurement capability will obviously affect the safeguards requirements. What most significantly enables this progress in spent fuel measurements is the improvement in computer processing power and software enhancements leading to user-friendly Graphical User Interfaces (GUIs). The software used for these projects significantly reduces the IAEA inspector's time spent both learning and operating computer and data acquisition systems. At the same time, by standardizing the spent fuel measurements it is possible to increase the reproducibility and reliability of the measurement data. The inclusion of various analysis algorithms into the operating environment, which can be performed in real time upon the measurement data, can also lead to increases in safeguards reliability and improvements in the efficiency of plant operations. (author)

  10. Evaluating a questionnaire to measure improvement initiatives in Swedish healthcare

    Directory of Open Access Journals (Sweden)

    Andersson Ann-Christine

    2013-02-01

    Full Text Available Abstract Background Quality improvement initiatives have expanded recently within the healthcare sector. Studies have shown that less than 40% of these initiatives are successful, indicating the need for an instrument that can measure the progress and results of quality improvement initiatives and answer questions about how quality initiatives are conducted. The aim of the present study was to develop and test an instrument to measure improvement process and outcome in Swedish healthcare. Methods A questionnaire, founded on the Minnesota Innovation Survey (MIS), was developed in several steps. Items were merged and answer alternatives were revised. Employees participating in a county council improvement program received the web-based questionnaire. Data were analysed by descriptive statistics and correlation analysis. The questionnaire's psychometric properties were investigated and an exploratory factor analysis was conducted. Results The Swedish Improvement Measurement Questionnaire consists of 27 items. The Improvement Effectiveness Outcome dimension consists of three items and has a Cronbach's alpha coefficient of 0.67. The Internal Improvement Processes dimension consists of eight sub-dimensions with a total of 24 items. Cronbach's alpha coefficient for the complete dimension was 0.72. Three significant item correlations were found. A large involvement in the improvement initiative was shown and the majority of the respondents were satisfied with their work. Conclusions The psychometric property tests suggest initial support for the questionnaire to study and evaluate quality improvement initiatives in Swedish healthcare settings. The overall satisfaction with the quality improvement initiative correlates positively with the awareness of individual responsibilities.
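
    For reference, the reliability statistic reported above can be computed as follows (synthetic item responses; the 0.67 and 0.72 values are the study's, not this sketch's output):

      # Cronbach's alpha for a respondents-by-items matrix.
      import numpy as np

      def cronbach_alpha(items):
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      rng = np.random.default_rng(0)
      trait = rng.normal(size=(40, 1))                    # latent trait
      data = trait + rng.normal(scale=0.8, size=(40, 8))  # 8 correlated items
      print(f"alpha = {cronbach_alpha(data):.2f}")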

  11. Improved collaborative filtering recommendation algorithm of similarity measure

    Science.gov (United States)

    Zhang, Baofu; Yuan, Baoping

    2017-05-01

    The collaborative filtering recommendation algorithm is one of the most widely used algorithms in personalized recommender systems. The key is to find the nearest-neighbor set of the active user by using a similarity measure. However, traditional similarity measures mainly focus on the similarity over the items two users have rated in common, and ignore the relationship between those common items and all the items each user has rated. And because the rating matrix is very sparse, the traditional collaborative filtering recommendation algorithm is not highly efficient. In order to obtain better accuracy, and based on consideration of the common preference between users, the difference in rating scales, and the scores on common items, this paper presents an improved similarity measure method, and, based on this method, a collaborative filtering recommendation algorithm built on the improved similarity is proposed. Experimental results show that the algorithm can effectively improve the quality of recommendation, thus alleviating the impact of data sparseness.
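
    As a concrete illustration of the idea, the sketch below implements a hedged stand-in for the paper's measure: a Pearson term over co-rated items weighted by a Jaccard term that relates the common items to everything each user has rated. The paper's exact formula differs; this shows only the general pattern.

        import numpy as np

        def improved_similarity(ru, rv):
            """Jaccard-weighted Pearson similarity between two users' rating dicts.

            The Pearson term captures agreement on co-rated items; the Jaccard
            term penalises users whose co-rated items are a small fraction of
            everything they rated.
            """
            common = set(ru) & set(rv)
            if len(common) < 2:
                return 0.0
            a = np.array([ru[i] for i in common], dtype=float)
            b = np.array([rv[i] for i in common], dtype=float)
            a_c, b_c = a - a.mean(), b - b.mean()
            denom = np.linalg.norm(a_c) * np.linalg.norm(b_c)
            pearson = (a_c @ b_c) / denom if denom > 0 else 0.0
            jaccard = len(common) / len(set(ru) | set(rv))
            return pearson * jaccard

        # Hypothetical rating dictionaries for two users
        u = {"i1": 5, "i2": 3, "i3": 4}
        v = {"i1": 4, "i2": 2, "i4": 5}
        print(f"sim(u, v) = {improved_similarity(u, v):.3f}")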

  12. Integrated methodology for production related risk management of vehicle electronics (IMPROVE)

    OpenAIRE

    Geis, Stefan Rafael

    2006-01-01

    This scientific work is intended to provide an innovative and integrated conceptual approach to improving the assembly quality of automotive electronics. This is achieved by the reduction and elimination of production-related risks of automotive electronics and the implementation of a sustainable solution process. The focus is the development and implementation of an integrated technical risk management approach for automotive electronics throughout the vehicle life cycle and the vehicle pr...

  13. Improved Temperature Sounding and Quality Control Methodology Using AIRS/AMSU Data: The AIRS Science Team Version 5 Retrieval Algorithm

    Science.gov (United States)

    Susskind, Joel; Blaisdell, John M.; Iredell, Lena; Keita, Fricky

    2009-01-01

    This paper describes the AIRS Science Team Version 5 retrieval algorithm in terms of its three most significant improvements over the methodology used in the AIRS Science Team Version 4 retrieval algorithm. Improved physics in Version 5 allows for the use of AIRS clear column radiances in the entire 4.3 micron CO2 absorption band in the retrieval of temperature profiles T(p) during both day and night. Tropospheric sounding 15 micron CO2 observations are now used primarily in the generation of clear column radiances R_i for all channels. This new approach allows for the generation of more accurate values of R_i and T(p) under most cloud conditions. Secondly, Version 5 contains a new methodology to provide accurate case-by-case error estimates for retrieved geophysical parameters and for channel-by-channel clear column radiances. Thresholds of these error estimates are used in a new approach for Quality Control. Finally, Version 5 also contains for the first time an approach to provide AIRS soundings in partially cloudy conditions that does not require use of any microwave data. This new AIRS-only sounding methodology, referred to as AIRS Version 5 AO, was developed as a backup to AIRS Version 5 should the AMSU-A instrument fail. Results are shown comparing the relative performance of AIRS Version 4, Version 5, and Version 5 AO for a single day, January 25, 2003. The Goddard DISC is now generating and distributing products derived using the AIRS Science Team Version 5 retrieval algorithm. This paper also describes the Quality Control flags contained in the DISC AIRS/AMSU retrieval products and their intended use for scientific research purposes.

  14. A Simple and Efficient Methodology To Improve Geometric Accuracy in Gamma Knife Radiation Surgery: Implementation in Multiple Brain Metastases

    Energy Technology Data Exchange (ETDEWEB)

    Karaiskos, Pantelis, E-mail: pkaraisk@med.uoa.gr [Medical Physics Laboratory, Medical School, University of Athens (Greece); Gamma Knife Department, Hygeia Hospital, Athens (Greece); Moutsatsos, Argyris; Pappas, Eleftherios; Georgiou, Evangelos [Medical Physics Laboratory, Medical School, University of Athens (Greece); Roussakis, Arkadios [CT and MRI Department, Hygeia Hospital, Athens (Greece); Torrens, Michael [Gamma Knife Department, Hygeia Hospital, Athens (Greece); Seimenis, Ioannis [Medical Physics Laboratory, Medical School, Democritus University of Thrace, Alexandroupolis (Greece)

    2014-12-01

    Purpose: To propose, verify, and implement a simple and efficient methodology for the improvement of total geometric accuracy in multiple brain metastases gamma knife (GK) radiation surgery. Methods and Materials: The proposed methodology exploits the directional dependence of magnetic resonance imaging (MRI)-related spatial distortions stemming from background field inhomogeneities, also known as sequence-dependent distortions, with respect to the read-gradient polarity during MRI acquisition. First, an extra MRI pulse sequence is acquired with the same imaging parameters as those used for routine patient imaging, aside from a reversal in the read-gradient polarity. Then, “average” image data are compounded from data acquired from the 2 MRI sequences and are used for treatment planning purposes. The method was applied and verified in a polymer gel phantom irradiated with multiple shots in an extended region of the GK stereotactic space. Its clinical impact in dose delivery accuracy was assessed in 15 patients with a total of 96 relatively small (<2 cm) metastases treated with GK radiation surgery. Results: Phantom study results showed that use of average MR images eliminates the effect of sequence-dependent distortions, leading to a total spatial uncertainty of less than 0.3 mm, attributed mainly to gradient nonlinearities. In brain metastases patients, non-eliminated sequence-dependent distortions lead to target localization uncertainties of up to 1.3 mm (mean: 0.51 ± 0.37 mm) with respect to the corresponding target locations in the “average” MRI series. Due to these uncertainties, a considerable underdosage (5%-32% of the prescription dose) was found in 33% of the studied targets. Conclusions: The proposed methodology is simple and straightforward in its implementation. Regarding multiple brain metastases applications, the suggested approach may substantially improve total GK dose delivery accuracy in smaller, outlying targets.
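
    The compounding step itself is trivial once the two oppositely encoded acquisitions exist. The numpy sketch below mimics it in 1D with a synthetic Gaussian target: the sequence-dependent distortion displaces the target by +d in one acquisition and -d in the reversed-polarity acquisition, and the voxel-wise average restores the midpoint position. All numbers here are illustrative assumptions, not the paper's data.

        import numpy as np

        x = np.arange(100, dtype=float)
        true_target, shift = 50.0, 1.5          # shift in voxels along the read axis

        def profile(center):
            """Synthetic 1D target profile (Gaussian of width 3 voxels)."""
            return np.exp(-0.5 * ((x - center) / 3.0) ** 2)

        img_fwd = profile(true_target + shift)  # +d displacement (forward polarity)
        img_rev = profile(true_target - shift)  # -d displacement (reversed polarity)
        img_avg = 0.5 * (img_fwd + img_rev)     # "average" image used for planning

        centroid = (x * img_avg).sum() / img_avg.sum()
        print(f"recovered centroid = {centroid:.2f} (true position {true_target})")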

  15. Fundamental and methodological investigations for the improvement of elemental analysis by inductively coupled plasma mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, Christopher Hysjulien [Ames Lab., Ames, IA (United States)

    2012-01-01

    This dissertation describes a variety of studies meant to improve the analytical performance of inductively coupled plasma mass spectrometry (ICP-MS) and laser ablation (LA) ICP-MS. The emission behavior of individual droplets and LA generated particles in an ICP is studied using a high-speed, high frame rate digital camera. Phenomena are observed during the ablation of silicate glass that would cause elemental fractionation during analysis by ICP-MS. Preliminary work for ICP torch developments specifically tailored for the improvement of LA sample introduction are presented. An abnormal scarcity of metal-argon polyatomic ions (MAr{sup +}) is observed during ICP-MS analysis. Evidence shows that MAr{sup +} ions are dissociated by collisions with background gas in a shockwave near the tip of the skimmer cone. Method development towards the improvement of LA-ICP-MS for environmental monitoring is described. A method is developed to trap small particles in a collodion matrix and analyze each particle individually by LA-ICP-MS.

  16. A METHODOLOGY FOR IMPROVING PRODUCTIVITY OF THE EXISTING SHIPBUILDING PROCESS USING MODERN PRODUCTION CONCEPTS AND THE AHP METHOD

    Directory of Open Access Journals (Sweden)

    Venesa Stanić

    2017-01-01

    Full Text Available In recent years, shipyards have been facing difficulties in controlling operational costs. To maintain continual operation of all of its facilities, a shipyard must analyze ways of utilizing present production systems for assembling interim vessel products as well as other types of industrial constructions. In the past, new machines, software and organizational restructuring continuously improved shipbuilding processes, but management continued to search for a modern technological concept that would provide higher productivity, greater profit and an overall reduction in costs. In the article the authors suggest implementing Design for Production, Design for Maintainability and Group Technology principles, combined with the Analytical Hierarchy Process (AHP) multi-criteria decision-making method, as an efficient tool for maintaining international competitiveness in the modern shipbuilding industry. This novel methodology is implemented through four phases. In the first phase, the present situation of a real shipyard is analyzed by establishing the closest relations among production lines. The second phase presents an analysis of the constraints that must be evaluated when developing the design solution. The third phase involves generating a typical number of selected alternatives based on the Design for Production, Design for Maintainability and Group Technology principles. In the fourth phase, the optimal design solution is selected using the AHP method. The solution incorporating this modern methodology will improve productivity and profit and lead to decreasing operational costs.
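
    For readers unfamiliar with the AHP step in the fourth phase, the sketch below shows the standard principal-eigenvector computation of priority weights and the consistency ratio for a pairwise-comparison matrix. The matrix entries are hypothetical and the code is a generic illustration, not the paper's implementation.

        import numpy as np

        # Saaty's random-index values for the consistency ratio, n = 1..9
        RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
              6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

        def ahp_weights(A):
            """Priority vector and consistency ratio for a pairwise-comparison matrix."""
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)            # principal eigenvalue
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()                           # normalized priority weights
            n = A.shape[0]
            ci = (eigvals[k].real - n) / (n - 1)   # consistency index
            return w, ci / RI[n]                   # CR < 0.1 is conventionally acceptable

        # Hypothetical comparison of three design alternatives on one criterion
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3., 1.0, 2.0],
                      [1/5., 1/2., 1.0]])
        w, cr = ahp_weights(A)
        print("weights:", np.round(w, 3), " CR:", round(cr, 3))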

  17. Improving Robustness Assessment Methodologies for Structures Impacted by Missiles (IRIS-2012) - Final Report

    International Nuclear Information System (INIS)

    Orbovic, Neb; Blahoainu, Andrei; Sagals, Genadis; Tarallo, Francois; Rambach, Jean-Mathieu; Huerta, Alejandro; White, Andrew; Nevander, Olli; ); Riera, Jorge Daniel; Krauthammer, Ted; Krutzik, Norbert; Arros, Jorma; Rouqand, Alain; Stangenberg, Friedhelm; Schwer, Leonard E.

    2014-01-01

    This report documents the results and conclusions of the second phase of the Integrity and Ageing of Components and Structures Working Group (WGIAGE) activity 'Improving Robustness assessment of structures Impacted by missiles', called IRIS-2012. The objective of the activity was to conduct a post-test benchmark study to improve the models and evaluation techniques used in IRIS-2010. The benchmark was open to new participants, and some IRIS-2010 participants did not take part in IRIS-2012. For this reason the team numbers in the two benchmarks are different, and to make direct comparisons it is necessary to have both lists. For the IRIS-2010 benchmark a series of repeated tests was performed: two bending rupture tests and three punching rupture tests. For IRIS-2012, and based on recommendations from IRIS-2010, tri-axial tests and a Brazilian tensile test were additionally performed in order to calibrate the constitutive models. The benchmark was officially launched in February 2012 with the participation of twenty-six teams from twenty different institutions (Safety Authorities, TSOs, Utilities, Vendors, Research Institutes and Consulting Companies) from ten different countries in Europe, North America and Asia (plus 1 international organisation). A three-day workshop was convened in October 2012 in Ottawa, Ontario, Canada, where each participating team presented and discussed their results and performed simulations. Based on the IRIS-2010 results and recommendations, OECD/NEA members recognized that there was a need to continue the work on understanding and improving the simulation of structural impact. The goal of the new IRIS-2012 benchmark was to: 1) Update and improve existing FE models, for teams that participated in IRIS-2010, or create new models for new participants. In order to improve the FE models it was requested to: simulate a uni-axial unconfined concrete test and tri-axial concrete tests, using the results provided by IRSN, as well as the Brazilian test (concrete tensile

  18. TO METHODOLOGY FOR IMPROVEMENT OF ACTIVE VIBRO-PROTECTION WHILE USING FUNCTIONAL DIAGNOSTICS

    Directory of Open Access Journals (Sweden)

    T. N. Mikulik

    2014-01-01

    Full Text Available The paper investigates vibro-protection conditions for the “operator-chair” system of a transport facility (the “Belarus” tractor family). Experimental research on the system's vibro-loading has been carried out with due account of elastic shock-absorbing characteristics and operator comfort. The work has made it possible to determine a range of system vibration frequencies that is badly tolerated by the operator, because it lies in the zone of the natural vibration frequencies of human visceral organs. The influence of the operator's physiological factors – heartbeat rate, variational height, mode amplitude, stress index – has been investigated on the basis of a factorial experiment, and correlation dependences have been obtained. The developed methodology for investigating the algorithmic provision for better active vibro-protection of the “operator-chair” system presupposes the availability of a mathematical model used for the synthesis of control laws and the selection of algorithms for forming signals on the operator's physiological state. A structural algorithm scheme for vibro-protection of the “driver – seat – road” system has been drawn up. Harmonic sinusoidal and poly-harmonic disturbances from the side of the power unit, and discrete algorithms based on filtration of white noise with a linear filter and a prescribed correlation function, have been accepted as the mathematical model of external environment disturbances. In the case of harmonic excitation of the “operator – chair” system, the force transferred to the system by a shock absorber, as well as the shock absorber's efficiency in the form of a force transmission coefficient and a vibration insulation value, are estimated in decibels. A Fourier series describes the motion of the system in the case of vibration forces initiated by operation of the power unit. A piecewise-linear function describes the reaction to impact excitation of the system when there is a final change in speed and motion

  19. Improving methodology in open vessel digestion with a graphite heating block (T7)

    International Nuclear Information System (INIS)

    Kainrath, P.; Conrads, B.; Ross, A.

    2002-01-01

    Full text: Open block digestion systems have been very popular in environmental analysis over the past decades, but have consistently suffered from the major drawback of their sensitivity to corrosion and the subsequent risk of contamination. Block digestion systems have therefore not been considered state-of-the-art technology in trace and ultra-trace sample preparation. Graphite block digestion systems are well established in North America and have recently become more frequently considered in Europe. These systems overcome the deficiencies of the traditional systems made from stainless steel or aluminum, because the block is manufactured from graphite and typically coated with a fluoro-polymer to prevent metallic contamination from the surface of the system during the handling of the samples. Graphite block systems present an alternative to the current mainstream technology of open and closed vessel microwave-assisted digestion systems, as they allow large numbers of samples to be digested simultaneously, thus overcoming one of the major weaknesses of closed vessel systems. More recently a number of improvements in the technology have been developed for graphite block digestion systems, and studies have been performed to evaluate the effects of such improvements. The paper presented will deal with these technological improvements: monitoring and control of sample temperature vs. monitoring of block temperature, elimination of cross-contamination effects during open vessel block digestion, evaporation of samples for pre-concentration or multiple digestion steps, and addressing the needs of various labs and applications for block digesters. The effects of these developments will be discussed; application examples and finally an outlook into possible future trends for graphite block digestion systems will be given. (author)

  20. Projecting future expansion of invasive species: comparing and improving methodologies for species distribution modeling.

    Science.gov (United States)

    Mainali, Kumar P; Warren, Dan L; Dhileepan, Kunjithapatham; McConnachie, Andrew; Strathie, Lorraine; Hassan, Gul; Karki, Debendra; Shrestha, Bharat B; Parmesan, Camille

    2015-12-01

    Modeling the distributions of species, especially of invasive species in non-native ranges, involves multiple challenges. Here, we developed some novel approaches to species distribution modeling aimed at reducing the influences of such challenges and improving the realism of projections. We estimated species-environment relationships for Parthenium hysterophorus L. (Asteraceae) with four modeling methods run with multiple scenarios of (i) sources of occurrences and geographically isolated background ranges for absences, (ii) approaches to drawing background (absence) points, and (iii) alternate sets of predictor variables. We further tested various quantitative metrics of model evaluation against biological insight. Model projections were very sensitive to the choice of training dataset. Model accuracy was much improved using a global dataset for model training, rather than restricting data input to the species' native range. AUC score was a poor metric for model evaluation and, if used alone, was not a useful criterion for assessing model performance. Projections away from the sampled space (i.e., into areas of potential future invasion) were very different depending on the modeling methods used, raising questions about the reliability of ensemble projections. Generalized linear models gave very unrealistic projections far away from the training region. Models that efficiently fit the dominant pattern, but exclude highly local patterns in the dataset and capture interactions as they appear in data (e.g., boosted regression trees), improved generalization of the models. Biological knowledge of the species and its distribution was important in refining choices about the best set of projections. A post hoc test conducted on a new Parthenium dataset from Nepal validated excellent predictive performance of our 'best' model. We showed that vast stretches of currently uninvaded geographic areas on multiple continents harbor highly suitable habitats for parthenium
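
    A minimal sketch of the boosted-regression-tree workflow the authors favour is given below, using scikit-learn's GradientBoostingClassifier on synthetic presence/background data and reporting AUC (which, as the abstract stresses, should not be the sole evaluation criterion). All data and settings are illustrative assumptions, not the paper's configuration.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        # Hypothetical training table: rows are presence (1) and background (0)
        # points, columns are bioclimatic predictors.
        X = rng.normal(size=(1000, 5))
        y = (X[:, 0] + 0.5 * X[:, 1] ** 2
             + rng.normal(scale=0.5, size=1000) > 0.5).astype(int)

        # Boosted regression trees fit dominant patterns and interactions without
        # extrapolating as wildly as a GLM far from the training region.
        brt = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05,
                                         max_depth=3)
        brt.fit(X, y)
        suitability = brt.predict_proba(X)[:, 1]
        print("training AUC:", round(roc_auc_score(y, suitability), 3))  # use with care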

  1. Gust factor based on research aircraft measurements: A new methodology applied to the Arctic marine boundary layer

    DEFF Research Database (Denmark)

    Suomi, Irene; Lüpkes, Christof; Hartmann, Jörg

    2016-01-01

    There is as yet no standard methodology for measuring wind gusts from a moving platform. To address this, we have developed a method to derive gusts from research aircraft data. First we evaluated four different approaches, including Taylor's hypothesis of frozen turbulence, to derive the gust...... in unstable conditions (R2=0.52). The mean errors for all methods were low, from -0.02 to 0.05, indicating that wind gust factors can indeed be measured from research aircraft. Moreover, we showed that aircraft can provide gust measurements within the whole boundary layer, if horizontal legs are flown...

  2. State of the art on nuclear heating measurement methods and expected improvements in zero power research reactors

    International Nuclear Information System (INIS)

    Le Guillou, M.; Gruel, A.; Destouches, C.; Blaise, P.

    2017-01-01

    The paper focuses on the recent methodological advances suitable for nuclear heating measurements in zero power research reactors. This bibliographical work is part of an experimental approach currently in progress at CEA Cadarache, aiming at optimizing photon heating measurements in low-power research reactors. It provides an overview of the application fields of the most widely used detectors, namely thermoluminescent dosimeters (TLDs) and optically stimulated luminescent dosimeters. Starting from the methodology currently implemented at CEA, the expected improvements relate to the experimental determination of the neutron component, which is a key point conditioning the accuracy of photon heating measurements in a mixed n-γ field. A recently developed methodology based on the use of 7Li- and 6Li-enriched TLDs, pre-calibrated both in photon and neutron fields, is a promising approach to de-convolute the two components of nuclear heating. We also investigate the different methods of optical fiber dosimetry, with a view to assessing the feasibility of online photon heating measurements, whose primary benefit is to overcome constraints related to the withdrawal of dosimeters from the reactor immediately after irradiation. Moreover, a fiber-based setup could allow measuring the instantaneous dose rate during irradiation, as well as the delayed photon dose after reactor shutdown. Some insights from potential further developments are given. Obviously, any improvement of the technique has to lead to a measurement uncertainty at least equal to that of the currently used methodology (∼5% at 1 σ). (authors)
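
    Once the photon and neutron sensitivities of the 7Li- and 6Li-enriched dosimeters are known from the pre-calibrations, the two-TLD deconvolution reduces to a small linear system. The sketch below illustrates only that algebra; all sensitivity coefficients and readings are invented for illustration, and real values would come from the calibration fields.

        import numpy as np

        # Each TLD reading is modelled as a linear mix of the photon dose Dg and
        # the neutron dose Dn (hypothetical coefficients):
        #   R_7Li = s7_g * Dg + s7_n * Dn   (7Li: weak neutron response)
        #   R_6Li = s6_g * Dg + s6_n * Dn   (6Li: strong neutron response)
        S = np.array([[1.00, 0.05],    # s7_g, s7_n
                      [1.00, 8.00]])   # s6_g, s6_n
        readings = np.array([10.4, 42.0])   # hypothetical TLD readings

        dose_gamma, dose_neutron = np.linalg.solve(S, readings)
        print(f"Dg = {dose_gamma:.2f}, Dn = {dose_neutron:.2f}")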

  3. State of the art on nuclear heating measurement methods and expected improvements in zero power research reactors

    Directory of Open Access Journals (Sweden)

    Le Guillou Mael

    2017-01-01

    Full Text Available The paper focuses on the recent methodological advances suitable for nuclear heating measurements in zero power research reactors. This bibliographical work is part of an experimental approach currently in progress at CEA Cadarache, aiming at optimizing photon heating measurements in low-power research reactors. It provides an overview of the application fields of the most widely used detectors, namely thermoluminescent dosimeters (TLDs and optically stimulated luminescent dosimeters. Starting from the methodology currently implemented at CEA, the expected improvements relate to the experimental determination of the neutron component, which is a key point conditioning the accuracy of photon heating measurements in mixed n–γ field. A recently developed methodology based on the use of 7Li and 6Li-enriched TLDs, precalibrated both in photon and neutron fields, is a promising approach to deconvolute the two components of nuclear heating. We also investigate the different methods of optical fiber dosimetry, with a view to assess the feasibility of online photon heating measurements, whose primary benefit is to overcome constraints related to the withdrawal of dosimeters from the reactor immediately after irradiation. Moreover, a fibered setup could allow measuring the instantaneous dose rate during irradiation, as well as the delayed photon dose after reactor shutdown. Some insights from potential further developments are given. Obviously, any improvement of the technique has to lead to a measurement uncertainty at least equal to that of the currently used methodology (∼5% at 1σ.

  4. Measuring sporadic gastrointestinal illness associated with drinking water - an overview of methodologies.

    Science.gov (United States)

    Bylund, John; Toljander, Jonas; Lysén, Maria; Rasti, Niloofar; Engqvist, Jannes; Simonsson, Magnus

    2017-06-01

    There is an increasing awareness that drinking water contributes to sporadic gastrointestinal illness (GI) in high-income countries of the northern hemisphere. A literature search was conducted in order to review: (1) methods used for investigating the effects of public drinking water on GI; (2) evidence of a possible dose-response relationship between sporadic GI and drinking water consumption; and (3) associations between sporadic GI and factors affecting drinking water quality. Seventy-four articles were selected, and key findings and information gaps were identified. In-home intervention studies have only been conducted in areas using surface water sources, and intervention studies in communities supplied by ground water are therefore needed. Community-wide intervention studies may constitute a cost-effective alternative to in-home intervention studies. Proxy data that correlate with GI in the community can be used for detecting changes in the incidence of GI. Proxy data cannot, however, be used for measuring the prevalence of illness. Local conditions affecting water safety may vary greatly, making direct comparisons between studies difficult unless sufficient knowledge about these conditions is acquired. Drinking water in high-income countries contributes to endemic levels of GI, and there are public health benefits to further improvements of drinking water safety.

  5. Reliability improvements on Thales RM2 rotary Stirling coolers: analysis and methodology

    Science.gov (United States)

    Cauquil, J. M.; Seguineau, C.; Martin, J.-Y.; Benschop, T.

    2016-05-01

    Cooled IR detectors are used in a wide range of applications. Most of the time, the cryocooler is one of the components dimensioning the lifetime of the system, and cooler reliability is thus one of its most important parameters. This parameter has to increase to answer market needs. To do this, data for identifying the weakest element determining cooler reliability have to be collected, yet data collected in the field are hardly usable due to lack of information. A method for identifying reliability improvements therefore has to be set up that can be used even without field returns. This paper describes the method followed by Thales Cryogénie SAS to reach such a result. First, a database was built from extensive expert analyses of RM2 failures occurring in accelerated ageing. Failure modes were then identified and corrective actions carried out. Besides this, the functions of the cooler were organized hierarchically with regard to their potential to increase its efficiency. Specific changes were introduced on the functions most likely to impact efficiency. The link between efficiency and reliability is described in this paper. The work on the two axes - weak spots for cooler reliability and efficiency - permitted us to increase the MTTF of the RM2 cooler drastically. Huge improvements in RM2 reliability are now proven by both field returns and reliability monitoring. These figures are discussed in the paper.

  6. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

    Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and likely even more are applied in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus using improved data and enhanced assumptions, on model outcomes and thus, ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  7. Selective critique of risk assessments with recommendations for improving methodology and practise

    International Nuclear Information System (INIS)

    Aven, Terje

    2011-01-01

    Risk assessments are often criticised for defending activities that could harm the environment and human health. The risk assessments produce numbers which are used to prove that the risk associated with the activity is acceptable. In this way, risk assessments seem to be a tool generally serving business. Government agencies have based their regulations on the use of risk assessment, and the prevailing practise is supported by the regulations. In this paper, we look more closely into this critique. Are risk assessments being misused, or are risk assessments simply not a suitable tool for guiding decision-making in the face of risks and uncertainties? Is the use of risk assessments not serving public interests? We argue that risk assessments may provide useful decision support, but the quality of the risk assessments and the associated risk assessment processes needs to be improved. In this paper, three main improvement areas (success factors) are identified and discussed: (1) the scientific basis of the risk assessments needs to be strengthened; (2) the risk assessments need to provide a much broader risk picture than is typically the case today, with separate uncertainty analyses carried out to extend the traditional probabilistic-based analyses; and (3) the cautionary and precautionary principles need to be seen as rational risk management approaches, whose application would, to a large extent, be based on risk and uncertainty assessments.

  8. Improving Emergency Department radiology transportation time: a successful implementation of lean methodology.

    Science.gov (United States)

    Hitti, Eveline A; El-Eid, Ghada R; Tamim, Hani; Saleh, Rana; Saliba, Miriam; Naffaa, Lena

    2017-09-05

    Emergency Department overcrowding has become a global problem and a growing safety and quality concern. Radiology and laboratory turnaround time, ED boarding and increased ED visits are some of the factors that contribute to ED overcrowding. Lean methods have been used in the ED to address multiple flow challenges, from improving door-to-doctor time to reducing length of stay. The objective of this study is to determine the effectiveness of using Lean management methods on improving Emergency Department transportation times for plain radiography. We performed a before and after study at an academic urban Emergency Department with 49,000 annual visits after implementing a Lean driven intervention. The primary outcome was mean radiology transportation turnaround time (TAT). Secondary outcomes included overall study turnaround time from order processing to preliminary report time as well as ED length of stay. All ED patients undergoing plain radiography 6 months pre-intervention were compared to all ED patients undergoing plain radiography 6 months post-intervention after a 1 month washout period. Post intervention there was a statistically significant decrease in the mean transportation TAT (mean ± SD: 9.87 min ± 15.05 versus 22.89 min ± 22.05, respectively, p-value <0.0001). In addition, it was found that 71.6% of patients in the post-intervention period had transportation TAT ≤ 10 min, as compared to 32.3% in the pre-intervention period, p-value <0.0001, with narrower interquartile ranges in the post-intervention period. Similarly, the "study processing to preliminary report time" and the length of stay were lower in the post-intervention as compared to the pre-intervention period (52.50 min ± 35.43 versus 54.04 min ± 34.72, p-value = 0.02 and 3.65 h ± 5.17 versus 4.57 h ± 10.43, p < 0.0001, respectively), in spite of an increase in the time it took to release a preliminary report in the post-intervention period. Using Lean change management
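
    A before/after comparison of this kind is typically tested with an unequal-variance t-test. The sketch below reproduces the shape of the analysis on synthetic, gamma-distributed turnaround times chosen to resemble the reported means and standard deviations; it is an illustration, not the study's data or code.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Hypothetical transportation TAT samples (minutes); the study reports
        # 22.89 +/- 22.05 min pre-intervention and 9.87 +/- 15.05 min post.
        pre = rng.gamma(shape=1.1, scale=21.0, size=500)
        post = rng.gamma(shape=0.9, scale=11.0, size=500)

        t, p = stats.ttest_ind(pre, post, equal_var=False)   # Welch's t-test
        print(f"pre {pre.mean():.1f} min vs post {post.mean():.1f} min, p = {p:.2e}")
        print("share of post TATs <= 10 min:", np.round((post <= 10).mean(), 2))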

  9. A study of calculation methodology and experimental measurements of the kinetic parameters for source driven subcritical systems

    International Nuclear Information System (INIS)

    Lee, Seung Min

    2009-01-01

    This work presents a theoretical study of reactor kinetics focusing on the methodology of calculation and the experimental measurement of the so-called kinetic parameters. A comparison between the methodology based on Dulla's formalism and the classical method is made. The objective is to exhibit the dependence of the parameters on the subcriticality level and the perturbation. Two different slab-type systems were considered: a thermal one and a fast one, both with homogeneous media. A one-group diffusion model was used for the fast reactor and a two-group diffusion model for the thermal system, considering, in both cases, only one precursor family. The solutions were obtained using the expansion method. Descriptions of the main experimental methods for measuring the kinetic parameters are also presented in order to raise the question of the compatibility of these methods in the subcritical region. (author)

  10. An Improved Cambridge Filter Pad Extraction Methodology to Obtain More Accurate Water and “Tar” Values: In Situ Cambridge Filter Pad Extraction Methodology

    Directory of Open Access Journals (Sweden)

    Ghosh David

    2014-07-01

    Full Text Available Previous investigations by others and internal investigations at Philip Morris International (PMI have shown that the standard trapping and extraction procedure used for conventional cigarettes, defined in the International Standard ISO 4387 (Cigarettes -- Determination of total and nicotine-free dry particulate matter using a routine analytical smoking machine, is not suitable for high-water content aerosols. Errors occur because of water losses during the opening of the Cambridge filter pad holder to remove the filter pad as well as during the manual handling of the filter pad, and because the commercially available filter pad holder, which is constructed out of plastic, may adsorb water. This results in inaccurate values for the water content, and erroneous and overestimated values for Nicotine Free Dry Particulate Matter (NFDPM. A modified 44 mm Cambridge filter pad holder and extraction equipment which supports in situ extraction methodology has been developed and tested. The principle of the in situ extraction methodology is to avoid any of the above mentioned water losses by extracting the loaded filter pad while kept in the Cambridge filter pad holder which is hermetically sealed by two caps. This is achieved by flushing the extraction solvent numerous times through the hermetically sealed Cambridge filter pad holder by means of an in situ extractor. The in situ methodology showed a significantly more complete water recovery, resulting in more accurate NFDPM values for high-water content aerosols compared to the standard ISO methodology. The work presented in this publication demonstrates that the in situ extraction methodology applies to a wider range of smoking products and smoking regimens, whereas the standard ISO methodology only applies to a limited range of smoking products and smoking regimens, e.g., conventional cigarettes smoked under ISO smoking regimen. In cases where a comparison of yields between the PMI HTP and

  11. Improved and Reproducible Flow Cytometry Methodology for Nuclei Isolation from Single Root Meristem

    Directory of Open Access Journals (Sweden)

    Thaís Cristina Ribeiro Silva

    2010-01-01

    Full Text Available Root meristems have increasingly been the target of cell cycle studies by flow cytometric DNA content quantification. Moreover, roots can be an alternative source of nuclear suspensions when leaves become unfeasible, and for chromosome analysis and sorting. In the present paper, a protocol for intact nuclei isolation from a single root meristem was developed. This procedure was based on excision of the meristematic region using a prototypical slide, followed by short enzymatic digestion and mechanical isolation of the nuclei during homogenization with a hand mixer. These parameters were optimized to achieve better results. Satisfactory amounts of nuclei were extracted and analyzed by flow cytometry, producing histograms with reduced background noise and CVs between 3.2 and 4.1%. This improved and reproducible technique was shown to be rapid, inexpensive, and simple for nuclear extraction from a single root tip, and can be adapted for other plants and purposes.

  12. Protocol for using mixed methods and process improvement methodologies to explore primary care receptionist work.

    Science.gov (United States)

    Litchfield, Ian; Gale, Nicola; Burrows, Michael; Greenfield, Sheila

    2016-11-16

    The need to cope with an increasingly ageing and multimorbid population has seen a shift towards preventive health and effective management of chronic disease. This places general practice at the forefront of health service provision with an increased demand that impacts on all members of the practice team. As these pressures grow, systems become more complex and tasks delegated across a broader range of staff groups. These include receptionists who play an essential role in the successful functioning of the surgery and are a major influence on patient satisfaction. However, they do so without formal recognition of the clinical implications of their work or with any requirements for training and qualifications. Our work consists of three phases. The first will survey receptionists using the validated Work Design Questionnaire to help us understand more precisely the parameters of their role; the second involves the use of iterative focus groups to help define the systems and processes within which they work. The third and final phase will produce recommendations to increase the efficiency and safety of the key practice processes involving receptionists and identify the areas and where receptionists require targeted support. In doing so, we aim to increase job satisfaction of receptionists, improve practice efficiency and produce better outcomes for patients. Our work will be disseminated using conferences, workshops, trade journals, electronic media and through a series of publications in the peer reviewed literature. At the very least, our work will serve to prompt discussion on the clinical role of receptionists and assess the advantages of using value streams in conjunction with related tools for process improvement.

  13. Sources of particulate matter components in the Athabasca oil sands region: investigation through a comparison of trace element measurement methodologies

    Science.gov (United States)

    Phillips-Smith, Catherine; Jeong, Cheol-Heon; Healy, Robert M.; Dabek-Zlotorzynska, Ewa; Celo, Valbona; Brook, Jeffrey R.; Evans, Greg

    2017-08-01

    The province of Alberta, Canada, is home to three oil sands regions which, combined, contain the third largest deposit of oil in the world. Of these, the Athabasca oil sands region is the largest. As part of Environment and Climate Change Canada's program in support of the Joint Canada-Alberta Implementation Plan for Oil Sands Monitoring program, concentrations of trace elements in PM2.5 (particulate matter smaller than 2.5 µm in diameter) were measured through two campaigns that involved different methodologies: a long-term filter campaign and a short-term intensive campaign. In the long-term campaign, 24 h filter samples were collected once every 6 days over a 2-year period (December 2010-November 2012) at three air monitoring stations in the regional municipality of Wood Buffalo. For the intensive campaign (August 2013), hourly measurements were made with an online instrument at one air monitoring station; daily filter samples were also collected. The hourly and 24 h filter data were analyzed individually using positive matrix factorization. Seven emission sources of PM2.5 trace elements were thereby identified: two types of upgrader emissions, soil, haul road dust, biomass burning, and two sources of mixed origin. The upgrader emissions, soil, and haul road dust sources were identified through both the methodologies and both methodologies identified a mixed source, but these exhibited more differences than similarities. The second upgrader emissions and biomass burning sources were only resolved by the hourly and filter methodologies, respectively. The similarity of the receptor modeling results from the two methodologies provided reassurance as to the identity of the sources. Overall, much of the PM2.5-related trace elements were found to be anthropogenic, or at least to be aerosolized through anthropogenic activities. These emissions may in part explain the previously reported higher levels of trace elements in snow, water, and biota samples collected
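
    As a hedged stand-in for the positive matrix factorization used here, the sketch below factorizes a synthetic sample-by-element concentration matrix with scikit-learn's NMF. True PMF additionally weights residuals by per-measurement uncertainty, which plain NMF does not; all dimensions and data are illustrative assumptions.

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(2)
        n_samples, n_elements, n_sources = 200, 15, 7

        F_true = rng.random((n_sources, n_elements))   # source profiles
        G_true = rng.random((n_samples, n_sources))    # source contributions
        X = G_true @ F_true + 0.01 * rng.random((n_samples, n_elements))

        # Non-negative factorization X ~ G @ F
        model = NMF(n_components=n_sources, init="nndsvda", max_iter=500,
                    random_state=0)
        G = model.fit_transform(X)     # contribution time series per source
        F = model.components_          # trace-element profile per source
        print("reconstruction error:", round(model.reconstruction_err_, 3))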

  14. Improvements in Elimination of Loudspeaker Distortion in Acoustic Measurements

    DEFF Research Database (Denmark)

    Agerkvist, Finn T.; Torras Rosell, Antoni; McWalter, Richard Ian

    2015-01-01

    This paper investigates the influence of nonlinear components that contaminate the linear response of acoustic transducers, and presents improved methods for eliminating the influence of nonlinearities in acoustic measurements. The method is evaluated with pure sinusoidal signals as well as swept...

  15. Improved optimum condition for recovery and measurement of 210 ...

    African Journals Online (AJOL)

    The aim of this study was to determine the optimum conditions for deposition of 210Po and to evaluate the accuracy and precision of the results for its determination in environmental samples, in order to improve the technique for measurement of polonium-210 (210Po) in environmental samples. The optimization of five factors (volume ...

  16. Towards improving security measures in Nigeria University Libraries ...

    African Journals Online (AJOL)

    A questionnaire designed by the researchers titled “Towards Improving Security Measures in Nigerian University Libraries (TISMINUL)” was used to collect the needed data. The questionnaire was designed in two parts. Part one was to gather information on the size of collection, frequency of stock taking and book loss.

  17. Measures to improve the quality of hotel services

    Directory of Open Access Journals (Sweden)

    Anca MADAR

    2017-07-01

    Full Text Available This article aims to exemplify how, starting from the evaluation of customer satisfaction with service quality, a hotel unit's management can apply different measures and strategies to improve it. To achieve the target, a marketing research survey was conducted on a sample of 120 customers of Hotel „Kronwell” at the end of 2013. After analysing the customers' responses, a series of measures were taken to improve the quality of the services offered by this hotel; then, at the end of 2015, a new survey was conducted, based on the same questionnaire. The results of this research highlight the increase in customer satisfaction as a result of improving the quality of the hotel's services, supported by growth in net profit and turnover and a decrease in the number of employees.

  18. 'Intelligent' triggering methodology for improved detectability of wavelength modulation diode laser absorption spectrometry applied to window-equipped graphite furnaces

    International Nuclear Information System (INIS)

    Gustafsson, Joergen; Axner, Ove

    2003-01-01

    The wavelength modulation diode laser absorption spectrometry (WM-DLAS) technique suffers from limited detectability when window-equipped sample compartments are used, because of multiple reflections between components in the optical system (so-called etalon effects). The problem is particularly severe when the technique is used with a window-equipped graphite furnace (GF) as atomizer, since the heating of the furnace induces drifts in the thickness of the windows and thereby also in the background signals. This paper presents a new detection methodology for WM-DLAS applied to a window-equipped GF in which the influence of the background signals from the windows is significantly reduced. The new technique, which is based upon the finding that the WM-DLAS background signals from a window-equipped GF are reproducible over a considerable period of time, consists of a novel 'intelligent' triggering procedure in which the GF is triggered at a user-chosen 'position' in the reproducible drift-cycle of the WM-DLAS background signal. The new methodology also makes use of 'higher-than-normal' detection harmonics, i.e. 4f or 6f, since these have previously been shown to have a higher signal-to-background ratio than 2f-detection when the background signals originate from thin etalons. The results show that this new combined background-drift-reducing methodology improves the limit of detection of the WM-DLAS technique used with a window-equipped GF by several orders of magnitude as compared to ordinary 2f-detection, resulting in a limit of detection for a window-equipped GF that is similar to that of an open GF

  19. Lipofection of insulin-producing RINm5F cells: methodological improvements.

    Science.gov (United States)

    Barbu, Andreea; Welsh, Nils

    2007-01-01

    Cationic lipid/DNA-complexes have been widely used as gene transfer vectors because they are less toxic and immunogenic than viral vectors. The aim of the present study was to improve and characterize lipofection of an insulin-producing cell line. We compared the transfection efficiency of seven commercially available lipid formulations (Lipotaxi, SuperFect, Fugene, TransFast, Dosper, GenePORTER and LipofectAMINE) by flow cytometry analysis of GFP-expression. In addition, we have determined the influences of centrifugation, serum and a nuclear localization signal peptide on the lipofection efficiency. We observed that two lipid formulations, GenePORTER and LipofectAMINE, were able to promote efficient gene transfer in RINm5F cells. However, GenePORTER exhibited the important advantage of being able to transfect cells in the presence of serum and with less cytotoxicity than LipofectAMINE. LipofectAMINE-induced RINm5F cell death could partially be counteracted by TPA, forskolin or fumonisin beta(1). Finally, both centrifugation and a nuclear localization signal peptide increased transfection efficiency.

  20. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    Science.gov (United States)

    Jonny; Nasution, Januar

    2013-06-01

    Value stream mapping is a tool which is needed to let the business leader of XYZ Hospital see what is actually happening in the business processes that have caused longer lead times for self-produced medicines in its pharmacy unit. This problem has triggered many complaints filed by patients. After deploying this tool, the team found that, in processing the medicines, the pharmacy unit does not have any storage or capsule-packing tools, and this condition has caused much wasted time in the process. Therefore, the team proposed that the business leader procure the required tools in order to shorten the process. This research resulted in a lead time shortened from 45 minutes to 30 minutes, as required by the government through the Indonesian health ministry, with the %VA (value-added activity) or Process Cycle Efficiency (PCE) increased from 66% to 68% (considered lean because it is above the required 30%). This result proves that process effectiveness has been increased by the improvement.
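
    Process Cycle Efficiency is simply the value-added share of total lead time, as the short sketch below shows. Note that the value-added minutes used here are back-computed from the reported 66% and 68% figures rather than taken from the paper.

        # Process Cycle Efficiency (PCE) as used in value stream mapping:
        # PCE = value-added time / total lead time.
        def pce(value_added_min, lead_time_min):
            return value_added_min / lead_time_min

        print(f"before: {pce(29.7, 45.0):.0%}")   # ~66% of a 45-minute lead time
        print(f"after:  {pce(20.4, 30.0):.0%}")   # ~68% of a 30-minute lead time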

  1. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    International Nuclear Information System (INIS)

    Jonny; Nasution, Januar

    2013-01-01

    Value stream mapping is a tool which is needed to let the business leader of XYZ Hospital see what is actually happening in the business processes that have caused longer lead times for self-produced medicines in its pharmacy unit. This problem has triggered many complaints filed by patients. After deploying this tool, the team found that, in processing the medicines, the pharmacy unit does not have any storage or capsule-packing tools, and this condition has caused much wasted time in the process. Therefore, the team proposed that the business leader procure the required tools in order to shorten the process. This research resulted in a lead time shortened from 45 minutes to 30 minutes, as required by the government through the Indonesian health ministry, with the %VA (value-added activity) or Process Cycle Efficiency (PCE) increased from 66% to 68% (considered lean because it is above the required 30%). This result proves that process effectiveness has been increased by the improvement.

  2. Organisms for biofuel production: natural bioresources and methodologies for improving their biosynthetic potentials.

    Science.gov (United States)

    Hu, Guangrong; Ji, Shiqi; Yu, Yanchong; Wang, Shi'an; Zhou, Gongke; Li, Fuli

    2015-01-01

    In order to relieve the pressure of energy supply and environment contamination that humans are facing, there are now intensive worldwide efforts to explore natural bioresources for production of energy storage compounds, such as lipids, alcohols, hydrocarbons, and polysaccharides. Around the world, many plants have been evaluated and developed as feedstock for bioenergy production, among which several crops have successfully achieved industrialization. Microalgae are another group of photosynthetic autotroph of interest due to their superior growth rates, relatively high photosynthetic conversion efficiencies, and vast metabolic capabilities. Heterotrophic microorganisms, such as yeast and bacteria, can utilize carbohydrates from lignocellulosic biomass directly or after pretreatment and enzymatic hydrolysis to produce liquid biofuels such as ethanol and butanol. Although finding a suitable organism for biofuel production is not easy, many naturally occurring organisms with good traits have recently been obtained. This review mainly focuses on the new organism resources discovered in the last 5 years for production of transport fuels (biodiesel, gasoline, jet fuel, and alkanes) and hydrogen, and available methods to improve natural organisms as platforms for the production of biofuels.

  3. Measuring and improving the public perceptions on nuclear energy

    International Nuclear Information System (INIS)

    Choi, Young Sung

    2001-01-01

    The purpose of this paper is to measure the public's perception of the risks and benefits of nuclear power and to find ways to improve those perceptions. Latent Class Analysis is adopted for the perception measures, which quantify people's perceptions and reveal the perception structure. The measures resulting from Latent Class Analysis show that women perceive the risks to be greater and the benefits to be smaller than men do. Moreover, there is a tendency for higher education levels to be associated with lower perceived risk and higher perceived benefit. The perception of the risks and benefits also depends on the different channels through which people get information about nuclear energy. Comparing seven different information channels, the most effective ways of communicating with people to improve the risk and benefit perception of nuclear energy are found to be visits to nuclear plants and education through regular schooling. Information dissemination through the mass media is only effective for the benefit perception

  4. IMPROVING SEMI-GLOBAL MATCHING: COST AGGREGATION AND CONFIDENCE MEASURE

    Directory of Open Access Journals (Sweden)

    P. d’Angelo

    2016-06-01

    Full Text Available Digital elevation models are one of the basic products that can be generated from remotely sensed imagery. The Semi-Global Matching (SGM) algorithm is a robust and practical algorithm for dense image matching. The connection between SGM and belief propagation was recently developed, and based on that, improvements such as a correction for over-counting the data term and a new confidence measure have been proposed. Later the MGM algorithm was proposed; it aims at improving the regularization step of SGM, but has so far only been evaluated on the Middlebury stereo benchmark. This paper evaluates these proposed improvements on the ISPRS satellite stereo benchmark, using a Pleiades triplet and a Cartosat-1 stereo pair. The over-counting correction slightly improves matching density, at the expense of adding a few outliers. The MGM cost aggregation leads to a slight increase in accuracy.
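
    For reference, the core SGM recurrence that both the over-counting correction and MGM modify is easy to state in code. The sketch below aggregates matching costs along a single left-to-right scanline with the usual P1/P2 penalties; it is a didactic fragment (the penalty values are arbitrary assumptions), not the implementations evaluated in the paper.

        import numpy as np

        def aggregate_scanline(cost, P1=10.0, P2=120.0):
            """Left-to-right SGM cost aggregation along one scanline.

            cost: (width, ndisp) matching-cost slice. The recurrence is
            L(p,d) = C(p,d) + min(L(p-1,d), L(p-1,d+/-1)+P1, min_k L(p-1,k)+P2)
                     - min_k L(p-1,k).
            """
            w, ndisp = cost.shape
            L = np.empty_like(cost, dtype=float)
            L[0] = cost[0]
            for x in range(1, w):
                prev = L[x - 1]
                m = prev.min()
                shifted_lo = np.concatenate(([np.inf], prev[:-1]))  # L(p-1, d-1)
                shifted_hi = np.concatenate((prev[1:], [np.inf]))   # L(p-1, d+1)
                L[x] = cost[x] + np.minimum.reduce(
                    [prev, shifted_lo + P1, shifted_hi + P1,
                     np.full(ndisp, m + P2)]
                ) - m
            return L

        # Toy example: random costs for a 10-pixel scanline with 8 disparities
        print(aggregate_scanline(np.random.default_rng(0).random((10, 8))).shape)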

  5. Use of PFMEA methodology as a competitive advantage for the analysis of improvements in an experimental procedure

    Directory of Open Access Journals (Sweden)

    Fernando Coelho

    2015-12-01

    Full Text Available The methodology of Failure Modes and Effects Analysis (FMEA), utilized by industries to investigate potential failures, contributes to ensuring the robustness of the design and the manufacturing process even before production starts. Thus, there is a reduced likelihood of errors and a higher level of efficiency and effectiveness at high productivity. This occurs through the elimination or reduction of production problems. In this context, this study is based on the structured application of PFMEA (Process Failure Mode Effects Analysis), associated with other quality tools, in a simulation of the assembly of an electro-pneumatic system. The study was performed at the Experimental Laboratory of the Botucatu Technology Faculty (FATEC), with the support of five undergraduate students from the Industrial Production Technology Course. The methodology applied contributed to the forecasting of 24 potential failures and improvement opportunities and the investigation of their causes, proving to be a standard applicable to any productive process, with a gain in efficiency and effectiveness. The final strategy was therefore to evaluate and minimize the potential failures, to reduce production costs and to increase the performance of the process.
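
    The prioritization at the heart of a PFMEA is the Risk Priority Number, RPN = severity x occurrence x detection, each scored 1-10. The sketch below ranks a few invented electro-pneumatic failure modes this way; the entries and scores are illustrative, not the study's worksheet.

        # RPN ranking as used in (P)FMEA; all entries are hypothetical.
        failure_modes = [
            # (description, severity, occurrence, detection)
            ("air leak at push-in fitting",         6, 5, 3),
            ("valve wired to wrong solenoid port",  8, 2, 4),
            ("undersized tubing causes slow cycle", 4, 6, 2),
        ]

        ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
        for desc, s, o, d in ranked:
            print(f"RPN {s * o * d:3d}  {desc}")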

  6. Nutrients interaction investigation to improve Monascus purpureus FTC5391 growth rate using Response Surface Methodology and Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Mohamad, R.

    2013-01-01

    Full Text Available Aims: Two vital factors, certain environmental conditions and nutrients as a source of energy, are entailed for the successful growth and reproduction of microorganisms. Manipulation of nutritional requirements is the simplest and most effectual strategy to stimulate and enhance the activity of microorganisms. Methodology and Results: In this study, response surface methodology (RSM) and an artificial neural network (ANN) were employed to optimize the carbon and nitrogen sources in order to improve the growth rate of Monascus purpureus FTC5391, a new local isolate. The best models for optimization of the growth rate were a multilayer full feed-forward incremental back propagation network and a modified response surface model using backward elimination. The optimum condition for cell mass production was: sucrose 2.5%, yeast extract 0.045%, casamino acid 0.275%, sodium nitrate 0.48%, potato starch 0.045%, dextrose 1%, potassium nitrate 0.57%. The experimental cell mass production using this optimal condition was 21 mg/plate/12 days, which was 2.2-fold higher than under the standard condition (sucrose 5%, yeast extract 0.15%, casamino acid 0.25%, sodium nitrate 0.3%, potato starch 0.2%, dextrose 1%, potassium nitrate 0.3%). Conclusion, significance and impact of study: The results of RSM and ANN showed that all carbon and nitrogen sources tested had a significant effect on the growth rate (P-value < 0.05). In addition, the use of RSM and ANN alongside each other provided a proper growth prediction model.

  7. Methodology for determining the effectiveness of implementing innovations and proposals for improving production in oil industry branches

    Energy Technology Data Exchange (ETDEWEB)

    Luzin, V I; Logachev, V M

    1980-01-01

    A specialized method for determining the economic effectiveness of new technological inventions and efficiency measures in the national economy has been applied from 1977 to the present. An analogous project has also been developed in order to consider and examine specific elements of the oil industry. This project incorporates a specialized methodology for examining the principal factors behind oil production and extraction associated with scientific-technical progress. The approach applies technical and efficiency proposals and considers new inventions during the planning stages in order to calculate the economic effectiveness of these new inputs. The principal methodological premise for the calculation of annual economic effectiveness during the planning stages is based upon the economic stimulation inspired by new inventions and efficiency improvements. A formula is provided for conducting such calculations. Examples are provided to illustrate how the annual economic effectiveness of a depulsator (a device used to improve the quality of separation of oil-gas mixtures while at the same time reducing oil loss) is calculated. The authors offer a detailed examination of the methods used to accurately reflect the economic effectiveness of new technologies within the spheres of planning and calculating indicators for enterprises and production organizations in the oil industry, both for individual branches and for the entire industry.

  8. Methodology of the Auditing Measures to Civil Airport Security and Protection

    Directory of Open Access Journals (Sweden)

    Ján Kolesár

    2016-10-01

Full Text Available Airports, similarly to other companies, are certified in compliance with International Organization for Standardization (ISO) standards for products and services (the ISO 9000 series on quality management), so as to coordinate the technical side of standardization and normalization at an international scale. In order for airports to meet the norms and certification requirements set by ISO, they are liable to undergo strict quality audits conducted, as a rule, by an independent auditing organization. The focus of the audits is primarily on airport operation economics and security. The article is an analysis of the methodology of the airport security audit processes and activities. Within the framework of planning, the sequence of steps is described in line with the principles and procedures of the Security Management System (SMS) and the standards established by ISO. The methodology for conducting an airport security audit is developed in compliance with the national programme and international legislative standards (Annex 17) applicable to the protection of civil aviation against acts of unlawful interference.

  9. Reassessment of pH reference values with improved methodology for the evaluation of ionic strength

    International Nuclear Information System (INIS)

    Lito, M.J. Guiomar H.M.; Camoes, M. Filomena G.F.C.

    2005-01-01

The conflict between pH as an empirical number in routine control and the pH value regarded as conveying information about the effective concentration, or activity, of hydrogen ions, a_H, has caused much confusion. There are, however, reasons to conclude that the overwhelming amount of thermodynamic data is not sufficiently accurate, either due to ignorance of metrological concepts or due to insufficiently specified measurement processes for fundamental chemical quantities such as pH. The commonly used seven reference buffer solutions to which primary pH values have been conventionally assigned represent a selection from a more extensive list recommended by NBS (now NIST) in 1962. From then onwards, conventions concerning the Debye-Hueckel model of electrolyte solutions and ionic strength have been revised and the pH(S) values reassessed in conformity, but only for these seven reference buffer solutions. The others have so far remained unchanged, blocking harmonisation of the conventionally assigned pH(S) values. In this work, ionic strength is calculated through complete equations derived from the acidity constants. Concentrations of the various species involved in the conventional assignment of pH, and their corresponding activity coefficients, are therefore more rigorously known. The process proves particularly useful for polyprotic acids with overlapping acidity constants, where the ratio of successive constants is less than 10³. As a consequence, conventionally assigned pH values of reference buffer solutions are recalculated and corrections are introduced as appropriate.
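The core of such a reassessment is a calculation of ionic strength from the complete speciation, followed by activity coefficients and the conventional hydrogen-ion activity. A minimal sketch for a phosphate-like buffer using the Debye-Hueckel model (all numerical values are illustrative, not the paper's data):

```python
# Sketch of the ionic-strength / activity-coefficient calculation behind
# conventional pH(S) assignment, for an equimolal phosphate-like buffer.
# All numbers (constants, molalities, Ka) are illustrative only.
import math

A, B, a0 = 0.5108, 0.3287, 4.0     # Debye-Hueckel parameters at 25 C
Ka2 = 10**-7.20                    # thermodynamic acidity constant (illustrative)
m_h2po4 = m_hpo4 = 0.025           # buffer molalities, mol/kg
m_na = m_h2po4 + 2 * m_hpo4        # charge-balancing counter-ion

def lg_gamma(z, I):
    """Debye-Hueckel log10 activity coefficient for an ion of charge z."""
    return -A * z**2 * math.sqrt(I) / (1 + B * a0 * math.sqrt(I))

# ionic strength from the complete speciation: I = 0.5 * sum(m_i * z_i^2)
I = 0.5 * (m_na * 1 + m_h2po4 * 1 + m_hpo4 * 4)

# conventional hydrogen-ion activity from the acidity-constant expression
aH = Ka2 * m_h2po4 * 10**lg_gamma(1, I) / (m_hpo4 * 10**lg_gamma(2, I))
print(f"I = {I:.3f} mol/kg, conventional pH(S) = {-math.log10(aH):.3f}")
# For polyprotic acids with overlapping constants, the speciation itself
# depends on aH, so this calculation would be iterated to self-consistency.
```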

  10. Measures to Improve Diagnostic Safety in Clinical Practice.

    Science.gov (United States)

    Singh, Hardeep; Graber, Mark L; Hofer, Timothy P

    2016-10-20

Timely and accurate diagnosis is foundational to good clinical practice and an essential first step to achieving optimal patient outcomes. However, a recent Institute of Medicine report concluded that most of us will experience at least one diagnostic error in our lifetime. The report argues for efforts to improve the reliability of the diagnostic process through better measurement of diagnostic performance. The diagnostic process is a dynamic team-based activity that involves uncertainty, plays out over time, and requires effective communication and collaboration among multiple clinicians, diagnostic services, and the patient. Thus, it poses special challenges for measurement. In this paper, we discuss how the need to develop measures to improve diagnostic performance could move forward at a time when the scientific foundation needed to inform measurement is still evolving. We highlight challenges and opportunities for developing potential measures of "diagnostic safety" related to clinical diagnostic errors and associated preventable diagnostic harm. In doing so, we propose a starter set of measurement concepts for initial consideration that seem reasonably related to diagnostic safety and call for these to be studied and further refined. This would enable safe diagnosis to become an organizational priority and facilitate quality improvement. Health-care systems should consider measurement and evaluation of diagnostic performance as essential to timely and accurate diagnosis and to the reduction of preventable diagnostic harm.

  11. TL sensitivity constancy of quartz upon UV+(β,γ) irradiation cycle: An improvement on dating methodology

    International Nuclear Information System (INIS)

    Souza, J.H.

    1985-01-01

Thermoluminescence (TL) sensitivity of natural quartz (beaches, paleobeaches and fixed dunes) to beta and gamma rays has been studied in the temperature range 250-400 °C, before and after bleaching by solar irradiation. A first TL glow growth curve was obtained through beta irradiation of 'as found' samples; a second growth curve was obtained by bleaching, through solar irradiation, the TL acquired either from an excitation gamma dose or from environmental radiation, and then re-irradiating the samples with beta rays. Experimental results showing the constancy of sensitivity for doses up to about 10 krad are the basis for a proposal to improve TL dating methodology by expanding its present limits. (Author)

  12. Improved automatic filtering methodology for an optimal pharmacokinetic modelling of DCE-MR images of the prostate

    Energy Technology Data Exchange (ETDEWEB)

    Vazquez Martinez, V.; Bosch Roig, I.; Sanz Requena, R.

    2016-07-01

In Dynamic Contrast-Enhanced Magnetic Resonance (DCE-MR) studies with high temporal resolution, images are quite noisy due to the complicated trade-off between temporal and spatial resolution. For this reason, the temporal curves extracted from the images present remarkable noise levels; consequently, the pharmacokinetic parameters calculated from the curves by least-squares fitting, as well as the arterial phase (a useful marker in tumour diagnosis which appears in curves with a high arterial contribution), are affected. In order to overcome these limitations, an automatic filtering method was previously developed by our group. In this work, an advanced automatic filtering methodology is presented to further improve noise reduction of the temporal curves, in order to obtain more accurate kinetic parameters and a proper modelling of the arterial phase. (Author)
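A sketch of the underlying idea follows: smooth the noisy concentration-time curve, then fit a pharmacokinetic model by least squares. The standard Tofts model and a Savitzky-Golay filter are used here as illustrative stand-ins; they are not necessarily the filter or model of the cited work, and all data are synthetic:

```python
# Smooth a noisy DCE-MR concentration-time curve, then least-squares fit a
# pharmacokinetic model. Tofts model + Savitzky-Golay filter are stand-ins.
import numpy as np
from scipy.optimize import curve_fit
from scipy.signal import savgol_filter

t = np.linspace(0, 300, 150)                    # s, 2 s temporal resolution
aif = 5.0 * (t / 40.0) * np.exp(1 - t / 40.0)   # toy arterial input function

def tofts(t, ktrans, ve):
    """Standard Tofts model: Ct = ktrans * AIF convolved with exp(-ktrans/ve * t)."""
    dt = t[1] - t[0]
    kernel = np.exp(-(ktrans / ve) * t)
    return ktrans * np.convolve(aif, kernel)[: len(t)] * dt

rng = np.random.default_rng(1)
noisy = tofts(t, 0.12 / 60, 0.30) + rng.normal(0, 0.05, t.size)

smoothed = savgol_filter(noisy, window_length=11, polyorder=3)
popt, _ = curve_fit(tofts, t, smoothed, p0=[0.05 / 60, 0.2],
                    bounds=([0, 0.01], [0.1, 1.0]))
print(f"Ktrans = {popt[0] * 60:.3f} /min, ve = {popt[1]:.2f}")
```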

  13. Effectiveness of ventilation improvements as a protective measure against radon

    International Nuclear Information System (INIS)

    Hoving, P.; Arvela, H.

    1993-01-01

Radon reduction rates achieved by ventilation improvement measures vary considerably. In 70% of the cases studied, further mitigation was needed to reach a level of 400 Bq/m³. Ventilation measures in crawl spaces and basements have resulted in reduction rates of up to 90%, though more typically 30-70%. Installing new mechanical systems in dwellings has resulted in 20-80% reduction rates. If fan use or fan efficiency is increased, radon levels can be reduced as much as when new systems are installed. Increasing the fresh-air supply through vents or window gaps reduces radon concentrations by 10-40%. Low ventilation rates, measured after mitigation using the passive perfluorocarbon tracer gas method, seem to be accompanied by correspondingly low radon reduction rates. Multiple-zone tracer gas measurements were conducted in order to reveal radon entry from the soil and radon transport between zones. (orig.). (3 refs., 3 figs., 2 tabs.)

  14. Addressing the “It Is Just Placebo” Pitfall in CAM: Methodology of a Project to Develop Patient-Reported Measures of Nonspecific Factors in Healing

    Directory of Open Access Journals (Sweden)

    Carol M. Greco

    2013-01-01

Full Text Available CAM therapies are often dismissed as "no better than placebo"; however, this belief may be overcome through careful analysis of nonspecific factors in healing. To improve trial methodology, we propose that CAM (and conventional) RCTs should evaluate and adjust for the effects of intrapersonal, interpersonal, and environmental factors on outcomes. However, measurement of these is challenging, and there are no brief, precise instruments that are suitable for widespread use in trials and clinical settings. This paper describes the methodology of a project to develop a set of patient-reported instruments that will quantify the nonspecific or "placebo" effects that are in fact specific and active ingredients in healing. The project uses the rigorous instrument-development methodology of the NIH-PROMIS initiative. The methods include (1) integration of patients' and clinicians' opinions with the existing literature; (2) development of relevant items; (3) calibration of items on large samples; (4) classical test theory and modern psychometric methods to select the most useful items; (5) development of computerized adaptive tests (CATs) that maximize information while minimizing patient burden; and (6) initial validation studies. The instruments will have the potential to revolutionize clinical trials in both CAM and conventional medicine by quantifying contextual factors that contribute to healing.

  15. Methodological considerations for researchers and practitioners using pedometers to measure physical (ambulatory) activity.

    Science.gov (United States)

    Tudor-Locke, C E; Myers, A M

    2001-03-01

Researchers and practitioners require guidelines for using electronic pedometers to objectively quantify physical activity (specifically ambulatory activity) for research and surveillance as well as clinical and program applications. Methodological considerations include choice of metric and length of monitoring frame as well as different data recording and collection procedures. A systematic review of 32 empirical studies suggests we can expect 12,000-16,000 steps/day for 8-10-year-old children (lower for girls than boys); 7,000-13,000 steps/day for relatively healthy, younger adults (lower for women than men); 6,000-8,500 steps/day for healthy older adults; and 3,500-5,500 steps/day for individuals living with disabilities and chronic illnesses. These preliminary recommendations should be modified and refined as evidence and experience using pedometers accumulate.

  16. A Novel Methodology for Measurements of an LED's Heat Dissipation Factor

    Science.gov (United States)

    Jou, R.-Y.; Haung, J.-H.

    2015-12-01

    Heat generation is an inevitable byproduct with high-power light-emitting diode (LED) lighting. The increase in junction temperature that accompanies the heat generation sharply degrades the optical output of the LED and has a significant negative influence on the reliability and durability of the LED. For these reasons, the heat dissipation factor, Kh, is an important factor in modeling and thermal design of LED installations. In this study, a methodology is proposed and experiments are conducted to determine LED heat dissipation factors. Experiments are conducted for two different brands of LED. The average heat dissipation factor of the Edixeon LED is 0.69, and is 0.60 for the OSRAM LED. By using the developed test method and comparing the results to the calculated luminous fluxes using theoretical equations, the interdependence of optical, electrical, and thermal powers can be predicted with a reasonable accuracy. The difference between the theoretical and experimental values is less than 9 %.
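Under the usual definition, the heat dissipation factor is the fraction of electrical input power converted to heat rather than emitted as light, Kh = (P_electrical − P_optical) / P_electrical. A minimal sketch (drive values and radiant flux are illustrative, not the measured data):

```python
# Sketch of the heat dissipation factor: the fraction of electrical input
# power dissipated as heat rather than emitted as radiant flux.
def heat_dissipation_factor(v_forward, i_forward, radiant_flux):
    """Kh from electrical input (V, A) and measured radiant flux (W)."""
    p_el = v_forward * i_forward
    return (p_el - radiant_flux) / p_el

# e.g. a 1 W class LED driven at 3.2 V, 350 mA, emitting 0.35 W optically
print(f"Kh = {heat_dissipation_factor(3.2, 0.350, 0.35):.2f}")
```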

  17. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    Misra, M.K.; Menon, Saritha P.; Thirugnana Murthy, D.

    2013-01-01

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost for electronic systems. Many studies have shown that up to 70% of the design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences - both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion Based Verification Methodology for HDL designs, taking as case studies, the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)

  18. Improved Measurement of Electron-antineutrino Disappearance at Daya Bay

    International Nuclear Information System (INIS)

    Dwyer, D.A.

    2013-01-01

With 2.5× the previously reported exposure, the Daya Bay experiment has improved the measurement of the neutrino mixing parameter sin²2θ₁₃ = 0.089±0.010(stat)±0.005(syst). Reactor anti-neutrinos were produced by six 2.9 GWₜₕ commercial power reactors, and measured by six 20-ton target-mass detectors of identical design. A total of 234,217 anti-neutrino candidates were detected in 127 days of exposure. An anti-neutrino rate of 0.944±0.007(stat)±0.003(syst) was measured by three detectors at a flux-weighted average distance of 1648 m from the reactors, relative to two detectors at 470 m and one detector at 576 m. Detector design and depth underground limited the background to 5±0.3% (far detectors) and 2±0.2% (near detectors) of the candidate signals. The improved precision confirms the initial measurement of reactor anti-neutrino disappearance, and continues to be the most precise measurement of θ₁₃.
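The measured deficit follows the two-flavour survival probability P ≈ 1 − sin²2θ₁₃·sin²(1.267 Δm² L/E). A minimal sketch using the published mixing angle and an assumed effective mass splitting of 2.5×10⁻³ eV² (the splitting is not quoted in the abstract):

```python
# Two-flavour reactor anti-neutrino survival probability, illustrating the
# baseline dependence of the deficit. dm2 (eV^2) is an assumed value.
import math

def survival(L_m, E_MeV, sin2_2theta13=0.089, dm2=2.5e-3):
    """Survival probability at baseline L (m) and neutrino energy E (MeV)."""
    return 1.0 - sin2_2theta13 * math.sin(1.267 * dm2 * L_m / E_MeV) ** 2

for L in (470, 576, 1648):          # near and far flux-weighted baselines (m)
    print(f"L = {L:4d} m: P = {survival(L, 4.0):.3f}")   # ~4 MeV typical energy
```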

  19. Improved Measurement of Electron-antineutrino Disappearance at Daya Bay

    Energy Technology Data Exchange (ETDEWEB)

    Dwyer, D.A. [Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA (United States); Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2013-02-15

With 2.5× the previously reported exposure, the Daya Bay experiment has improved the measurement of the neutrino mixing parameter sin²2θ₁₃ = 0.089±0.010(stat)±0.005(syst). Reactor anti-neutrinos were produced by six 2.9 GWₜₕ commercial power reactors, and measured by six 20-ton target-mass detectors of identical design. A total of 234,217 anti-neutrino candidates were detected in 127 days of exposure. An anti-neutrino rate of 0.944±0.007(stat)±0.003(syst) was measured by three detectors at a flux-weighted average distance of 1648 m from the reactors, relative to two detectors at 470 m and one detector at 576 m. Detector design and depth underground limited the background to 5±0.3% (far detectors) and 2±0.2% (near detectors) of the candidate signals. The improved precision confirms the initial measurement of reactor anti-neutrino disappearance, and continues to be the most precise measurement of θ₁₃.

  20. Integrating patient satisfaction into performance measurement to meet improvement challenges.

    Science.gov (United States)

    Smith, J E; Fisher, D L; Endorf-Olson, J J

    2000-05-01

    A Value Compass has been proposed to guide health care data collection. The "compass corners" represent the four types of data needed to meet health care customer expectations: appropriate clinical outcomes, improved functional status, patient satisfaction, and appropriate costs. Collection of all four types of data is necessary to select processes in need of improvement, guide improvement teams, and monitor the success of improvement efforts. INTEGRATED DATA AT BRYANLGH: BryanLGH Medical Center in Lincoln, Nebraska, has adopted multiple performance measurement systems to collect clinical outcome, financial, and patient satisfaction data into integrated databases. Data integration allows quality professionals at BryanLGH to identify quality issues from multiple perspectives and track the interrelated effects of improvement efforts. A CASE EXAMPLE: Data from the fourth quarter of 1997 indicated the need to improve processes related to cesarean section (C-section) deliveries. An interdisciplinary team was formed, which focused on educating nurses, physicians, and the community about labor support measures. Physicians were given their own rates of C-section deliveries. The C-section rate decreased from 27% to 19%, but per-case cost increased. PickerPLUS+ results indicated that BryanLGH obstetric patients reported fewer problems with receiving information than the Picker norm, but they reported more problems with the involvement of family members and friends. The data collected so far have indicated a decrease in the C-section rate and a need to continue to work on cost and psychosocial issues. A complete analysis of results was facilitated by integrated performance management systems. Successes have been easily tracked over time, and the need for further work on related processes has been clearly identified.

  1. Standardization of the methodology used for fuel pressure drop evaluation to improve hydraulic calculation of heterogeneous cores

    International Nuclear Information System (INIS)

    Le Borgne, E.; Mattei, A.; Rome, M.; Rodriguez, J.M.

    2004-01-01

    The determination of hydraulic characteristics for fuel subassembly components is dependent on the hypotheses and the methodology considered. The results of hydraulic compatibility calculations using input data from different sources may thus be difficult to analyse, and their reliability will consequently be reduced. Electricite de France (EDF) and Commissariat a l'Energie Atomique (CEA) have initiated a common program aiming at controlling the consequences of such a situation, increasing the reliability of the values used in the hydraulic compatibility calculations, and proposing a standardization of the operating procedures. In a first step, this program is based on the measurements performed in the CEA HERMES P facility. Extension of this program is expected to the equivalent experimental facilities for which sufficient information will be made available. (author)

  2. Automated microscopic characterization of metallic ores with image analysis: a key to improve ore processing. I: test of the methodology

    International Nuclear Information System (INIS)

    Berrezueta, E.; Castroviejo, R.

    2007-01-01

Ore microscopy has traditionally been an important support for the control of ore processing, but the volume of present-day processes is beyond the reach of human operators. Automation is therefore compulsory, but its development through digital image analysis (DIA) is limited by various problems, such as the similarity in reflectance values of some important ores, their anisotropism, and the performance of instruments and methods. The results presented show that automated identification and quantification by DIA are possible through multiband (RGB) determinations with a research-grade 3CCD video camera on a reflected-light microscope. These results were obtained by systematic measurement of selected ores accounting for most of the industrial applications. Polarized light is avoided, so the effects of anisotropism can be neglected. Quality control at various stages and statistical analysis are important, as is the application of complementary criteria (e.g. metallogenetic). The sequential methodology is described and illustrated through practical examples. (Author)

  3. Measuring subjective meaning structures by the laddering method: Theoretical considerations and methodological problems

    DEFF Research Database (Denmark)

    Grunert, Klaus G.; Grunert, Suzanne C.

    1995-01-01

Starting from a general model of measuring cognitive structures for predicting consumer behaviour, we discuss laddering as a possible method to obtain estimates of consumption-relevant cognitive structures which will have predictive validity. Four criteria for valid measurement are derived and applied ...

  4. Translating patient reported outcome measures: methodological issues explored using cognitive interviewing with three rheumatoid arthritis measures in six European languages

    NARCIS (Netherlands)

    Hewlett, Sarah E.; Nicklin, Joanna; Bode, Christina; Carmona, Loretto; Dures, Emma; Engelbrecht, Matthias; Hagel, Sofia; Kirwan, John R.; Molto, Anna; Redondo, Marta; Gossec, Laure

    2016-01-01

Objective. Cross-cultural translation of patient-reported outcome measures (PROMs) is a lengthy process, often performed professionally. Cognitive interviewing assesses patient comprehension of PROMs. The objective was to evaluate the usefulness of cognitive interviewing to assess translations and ...

  5. Methodological challenges in measurements of functional ability in gerontological research. A review

    DEFF Research Database (Denmark)

    Avlund, Kirsten

    1997-01-01

This article addresses two important challenges in the measurement of functional ability in gerontological research: the first challenge is to connect measurements to a theoretical frame of reference which enhances our understanding and interpretation of the collected data; the second relates to methodological criteria such as the measurement procedure, validity, discriminatory power, and responsiveness. In measures of functional ability it is recommended: 1) always to consider the theoretical frame of reference as part of the validation process (e.g., the theory of "The Disablement Process"); 2) always to assess whether the included activities ...

  6. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    Science.gov (United States)

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and the collaborative nature of ...

  7. From theory to 'measurement' in complex interventions: Methodological lessons from the development of an e-health normalisation instrument

    Directory of Open Access Journals (Sweden)

    Finch Tracy L

    2012-05-01

Full Text Available Abstract Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. Methods: A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes ...

  8. An Improvement in Biodiesel Production from Waste Cooking Oil by Applying Thought Multi-Response Surface Methodology Using Desirability Functions

    Directory of Open Access Journals (Sweden)

    Marina Corral Bobadilla

    2017-01-01

Full Text Available The exhaustion of natural resources has increased petroleum prices, and the environmental impact of oil has stimulated the search for alternative sources of energy such as biodiesel. Waste cooking oil is a potential replacement for vegetable oils in the production of biodiesel. Biodiesel is synthesized by direct transesterification of vegetable oils, a process controlled by several inputs or process variables, including the dosage of catalyst, process temperature, mixing speed, mixing time, and the humidity and impurities of the waste cooking oil studied in this case. Yield, turbidity, density, viscosity and higher heating value are considered as outputs. This paper used multi-response surface methodology (MRS) with desirability functions to find the best combination of input variables for the transesterification reactions to improve the production of biodiesel. Several biodiesel optimization scenarios have been proposed, based on a desire to improve the biodiesel yield and the higher heating value while decreasing the viscosity, density and turbidity. The results demonstrated that, although the waste cooking oil was collected from various sources, the dosage of catalyst is one of the most important variables in the yield of biodiesel production, whereas the viscosity obtained was similar in all of the biodiesel samples studied.
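Desirability-based multi-response optimization combines the individual responses into a single score to be maximized. A minimal sketch of Derringer-type desirability functions (targets, limits and the sample point are illustrative, not the study's values):

```python
# Sketch of desirability functions: maximize yield and heating value,
# minimize viscosity, density and turbidity; combine by geometric mean.
import numpy as np

def d_max(y, lo, hi):   # larger-is-better response
    return np.clip((y - lo) / (hi - lo), 0, 1)

def d_min(y, lo, hi):   # smaller-is-better response
    return np.clip((hi - y) / (hi - lo), 0, 1)

def overall_desirability(yield_pct, hhv, visc, dens, turb):
    ds = [
        d_max(yield_pct, 80, 100),    # % yield
        d_max(hhv, 37, 41),           # higher heating value, MJ/kg
        d_min(visc, 3.5, 6.0),        # kinematic viscosity, mm^2/s
        d_min(dens, 860, 900),        # density, kg/m^3
        d_min(turb, 0, 50),           # turbidity, NTU
    ]
    return np.prod(ds) ** (1 / len(ds))   # geometric mean of desirabilities

print(f"D = {overall_desirability(92, 39.5, 4.2, 878, 12):.3f}")
# An optimizer would search the input space (catalyst dosage, temperature,
# mixing speed/time, ...) for the settings that maximize D.
```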

  9. Using Measured Plane-of-Array Data Directly in Photovoltaic Modeling: Methodology and Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine; Freestate, David; Riley, Cameron; Hobbs, William

    2016-11-01

    Measured plane-of-array (POA) irradiance may provide a lower-cost alternative to standard irradiance component data for photovoltaic (PV) system performance modeling without loss of accuracy. Previous work has shown that transposition models typically used by PV models to calculate POA irradiance from horizontal data introduce error into the POA irradiance estimates, and that measured POA data can correlate better to measured performance data. However, popular PV modeling tools historically have not directly used input POA data. This paper introduces a new capability in NREL's System Advisor Model (SAM) to directly use POA data in PV modeling, and compares SAM results from both POA irradiance and irradiance components inputs against measured performance data for eight operating PV systems.
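The error that direct POA input avoids enters in the transposition step. A minimal sketch using the simple isotropic-sky model (SAM itself offers more sophisticated transposition models; all values are illustrative):

```python
# Isotropic-sky transposition: estimate plane-of-array irradiance from
# horizontal components. Measured POA input bypasses this step entirely.
import math

def poa_isotropic(dni, dhi, ghi, tilt_deg, aoi_deg, albedo=0.2):
    """Plane-of-array irradiance (W/m^2) from irradiance components."""
    tilt, aoi = math.radians(tilt_deg), math.radians(aoi_deg)
    beam   = dni * max(math.cos(aoi), 0.0)            # direct on the plane
    sky    = dhi * (1 + math.cos(tilt)) / 2           # isotropic sky diffuse
    ground = ghi * albedo * (1 - math.cos(tilt)) / 2  # ground-reflected
    return beam + sky + ground

# e.g. clear mid-morning: DNI 800, DHI 100, GHI 650 W/m^2, 25 deg tilt, 35 deg AOI
print(f"modeled POA = {poa_isotropic(800, 100, 650, 25, 35):.0f} W/m^2")
# A measured POA value at the same instant would feed the PV model directly,
# avoiding the transposition model error this step introduces.
```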

  10. Feasibility, strategy, methodology, and analysis of probe measurements in plasma under high gas pressure

    Science.gov (United States)

    Demidov, V. I.; Koepke, M. E.; Kurlyandskaya, I. P.; Malkov, M. A.

    2018-02-01

This paper reviews existing theories for interpreting probe measurements of electron distribution functions (EDFs) at high gas pressure, when collisions of electrons with atoms and/or molecules near the probe are pervasive. An explanation of whether or not the measurements are realizable and reliable, an enumeration of the most common sources of measurement error, and an outline of proper probe-experiment design elements that inherently limit or avoid error are presented. Additionally, we describe recently expanded plasma-condition compatibility for EDF measurement, including in applications of large wall probe plasma diagnostics. This summary of the authors' experiences, gained over decades of practicing and developing probe diagnostics, is intended to inform, guide, suggest, and detail the advantages and disadvantages of probe application in plasma research.
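A common starting point for such measurements is the Druyvesteyn relation, in which the EDF is proportional to the second derivative of the probe current with respect to bias. A minimal sketch on synthetic Maxwellian data (illustrative only; the reviewed high-pressure theories address precisely the collisional distortion of this second derivative that the simple relation neglects):

```python
# Druyvesteyn relation: f(eps) ~ sqrt(eps) * d^2I/dV^2, eps = e*(Vp - V).
# Synthetic Maxwellian I-V data with Te = 2 eV.
import numpy as np

Te, Vp = 2.0, 0.0                        # electron temperature (eV), plasma potential (V)
V = np.linspace(-10, 0, 500)             # probe bias relative to plasma potential
I = np.exp((V - Vp) / Te)                # retarding-region electron current (arb. units)

d2I = np.gradient(np.gradient(I, V), V)  # numerical second derivative
eps = Vp - V                             # electron energy in eV
edf = np.sqrt(eps) * d2I                 # unnormalized EDF

print(f"EDF peak near {eps[np.argmax(edf)]:.2f} eV (expect ~Te/2 = {Te / 2:.1f} eV)")
```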

  11. Accuracy requirements on operational measurements in nuclear power plants with regard to balance methodology

    International Nuclear Information System (INIS)

    Holecek, C.

    1986-01-01

Accurate in-service measurement is necessary for the power balancing of nuclear power plants, i.e., the determination of fuel consumption, electric power generation, heat delivery and the degree of fuel power utilization. The only possible method of determining the total energy consumed from the fuel is a balance of the primary coolant circuit, because for the purposes of power balancing it is not possible to measure directly the amount of power generated from the nuclear fuel. Relations are presented for the calculation of the basic indices of the power balance. It is stated that for the purposes of power balancing and analyses, the precision of the measuring instruments at the input and output of the balancing circuits is of primary importance, followed by the precision of measuring instruments inside the balancing circuits and meters of auxiliary parameters. (Z.M.). 7 refs., 1 tab

  12. Pilot testing a methodology to measure the marginal increase in economic impact of rural tourism sites

    Science.gov (United States)

    April Evans; Hans Vogelsong

    2008-01-01

Rural tourism is a rapidly expanding industry which holds some promise of improving the economy in small towns and farming regions. However, rural communities have limited funding available for promotional efforts. To understand whether limited funds are effective in producing the desired economic impacts, it is important that rural communities evaluate their promotional efforts ...

  13. Inclusive Assessment: Toward a Socially-Just Methodology for Measuring Institution-Wide Engagement

    Science.gov (United States)

    Getto, Guiseppe; McCunney, Dennis

    2015-01-01

    Institutions are increasingly being called upon to collect large amounts of data to demonstrate community impact. At institutions with strong and wide-reaching public engagement/service missions, this expectation is even greater--both for quality improvement and for demonstrating regional transformation. Despite these expectations, the…

  14. Contact Thermocouple Methodology and Evaluation for Temperature Measurement in the Laboratory

    Science.gov (United States)

    Brewer, Ethan J.; Pawlik, Ralph J.; Krause, David L.

    2013-01-01

    Laboratory testing of advanced aerospace components very often requires highly accurate temperature measurement and control devices, as well as methods to precisely analyze and predict the performance of such components. Analysis of test articles depends on accurate measurements of temperature across the specimen. Where possible, this task is accomplished using many thermocouples welded directly to the test specimen, which can produce results with great precision. However, it is known that thermocouple spot welds can initiate deleterious cracks in some materials, prohibiting the use of welded thermocouples. Such is the case for the nickel-based superalloy MarM-247, which is used in the high temperature, high pressure heater heads for the Advanced Stirling Converter component of the Advanced Stirling Radioisotope Generator space power system. To overcome this limitation, a method was developed that uses small diameter contact thermocouples to measure the temperature of heater head test articles with the same level of accuracy as welded thermocouples. This paper includes a brief introduction and a background describing the circumstances that compelled the development of the contact thermocouple measurement method. Next, the paper describes studies performed on contact thermocouple readings to determine the accuracy of results. It continues on to describe in detail the developed measurement method and the evaluation of results produced. A further study that evaluates the performance of different measurement output devices is also described. Finally, a brief conclusion and summary of results is provided.

  15. Comparison of efficiency of distance measurement methodologies in mango (Mangifera indica) progenies based on physicochemical descriptors.

    Science.gov (United States)

    Alves, E O S; Cerqueira-Silva, C B M; Souza, A M; Santos, C A F; Lima Neto, F P; Corrêa, R X

    2012-03-14

We investigated seven distance measures in a set of observations of physicochemical variables of mango (Mangifera indica) submitted to multivariate analyses (distance, projection and grouping). To estimate the distance measurements, five mango progenies (25 genotypes in total) were analyzed using six fruit physicochemical descriptors (fruit weight, equatorial diameter, longitudinal diameter, total soluble solids in °Brix, total titratable acidity, and pH). The distance measurements were compared by the Spearman correlation test, projection in two-dimensional space and grouping efficiency. The Spearman correlation coefficients between the seven distance measurements were, except for Mahalanobis' generalized distance (0.41 ≤ rs ≤ 0.63), high and significant (rs ≥ 0.91; P < 0.001). Regardless of the origin of the distance matrix, the unweighted pair-group method with arithmetic mean (UPGMA) proved to be the most adequate grouping method. The various distance measurements and grouping methods gave different values for distortion (-116.5 ≤ D ≤ 74.5), cophenetic correlation (0.26 ≤ rc ≤ 0.76) and stress (-1.9 ≤ S ≤ 58.9). The choice of distance measurement and analysis method thus influences the results of the multivariate analyses.
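A minimal sketch of the comparison pipeline: several distance measures on the standardized descriptors, UPGMA clustering, and cophenetic correlation as one of the grouping-efficiency criteria (the data are synthetic, not the mango measurements):

```python
# Compare distance measures by UPGMA cophenetic correlation on a synthetic
# 25 genotypes x 6 descriptors matrix.
import numpy as np
from scipy.cluster.hierarchy import cophenet, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(42)
X = rng.normal(size=(25, 6))               # 25 genotypes x 6 fruit descriptors
X = (X - X.mean(0)) / X.std(0)             # standardize the descriptors

for metric in ("euclidean", "cityblock", "chebyshev", "mahalanobis"):
    D = pdist(X, metric=metric)
    Z = linkage(D, method="average")       # UPGMA
    r_coph, _ = cophenet(Z, D)
    print(f"{metric:12s} cophenetic r = {r_coph:.2f}")
```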

  16. Methodology and measurement of radiation interception by quantum sensor of the oil palm plantation

    Directory of Open Access Journals (Sweden)

    Johari Endan

    2005-09-01

Full Text Available Interception of light by a canopy is a fundamental requirement for crop growth and is important for biomass production and plant growth modeling. Solar radiation is an important parameter for photosynthesis and evapotranspiration. These two phenomena depend not only on the intensity of radiation but also on the distribution of intercepted radiation within the canopy. In this study, two operational methods for estimating the amount of photosynthetically active radiation (PAR) intercepted by an oil palm canopy are presented. LICOR radiation sensors, model LI-190SA and model LI-191SA, were used for PAR measurement above and below the canopy. We developed two methods, namely the "Triangular" method and the "Circular" method, for PAR measurement. Results show that both methods were suitable for oil palm PAR measurement. The triangular method is recommended for PAR measurements with respect to the whole plantation, and the circular method is recommended for specific purposes, such as growth analysis or growth modeling of the oil palm. However, practical considerations such as equipment availability, the purpose of the measurement, the age of the palm, and the number of measuring points to be sampled should be taken into account in the selection of a suitable method for a particular study. The results indicate that the interception of radiation was affected by spatial variation, and that radiation transmission decreased towards the frond tips.
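Whichever sampling layout is used, both methods estimate the same quantity: the fraction of incident PAR intercepted by the canopy, from paired above- and below-canopy readings. A minimal sketch (sensor readings are illustrative):

```python
# Fractional PAR interception from one above-canopy reading and several
# below-canopy line-sensor readings (umol m^-2 s^-1, illustrative values).
def fraction_intercepted(par_above, par_below_readings):
    """Mean fractional PAR interception of the canopy."""
    mean_below = sum(par_below_readings) / len(par_below_readings)
    return 1.0 - mean_below / par_above

# below-canopy positions arranged in either the triangular or circular layout
print(f"f = {fraction_intercepted(1800, [420, 510, 380, 460]):.2f}")
```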

  17. An improved in situ measurement of offset phase shift towards quantitative damping-measurement with AFM

    International Nuclear Information System (INIS)

    Minary-Jolandan, Majid; Yu Minfeng

    2008-01-01

An improved approach is introduced for damping measurement with the atomic force microscope (AFM): the in situ measurement of the offset phase shift needed for determining the intrinsic mechanical damping in nanoscale materials. The offset phase shift is defined and measured at a point of zero contact force according to the deflection part of the AFM force plot. It is shown that the offset phase shift so defined is independent of the type of sample material, which was varied from hard to relatively soft materials in this study. This improved approach allows self-calibrated, quantitative damping measurement with AFM. The ability of dynamic mechanical analysis to measure damping in isolated one-dimensional nanostructures, e.g. individual multiwalled carbon nanotubes, was demonstrated.

  18. Sources of particulate matter components in the Athabasca oil sands region: investigation through a comparison of trace element measurement methodologies

    Directory of Open Access Journals (Sweden)

    C. Phillips-Smith

    2017-08-01

Full Text Available The province of Alberta, Canada, is home to three oil sands regions which, combined, contain the third largest deposit of oil in the world. Of these, the Athabasca oil sands region is the largest. As part of Environment and Climate Change Canada's program in support of the Joint Canada-Alberta Implementation Plan for Oil Sands Monitoring, concentrations of trace elements in PM2.5 (particulate matter smaller than 2.5 µm in diameter) were measured through two campaigns that involved different methodologies: a long-term filter campaign and a short-term intensive campaign. In the long-term campaign, 24 h filter samples were collected once every 6 days over a 2-year period (December 2010–November 2012) at three air monitoring stations in the regional municipality of Wood Buffalo. For the intensive campaign (August 2013), hourly measurements were made with an online instrument at one air monitoring station; daily filter samples were also collected. The hourly and 24 h filter data were analyzed individually using positive matrix factorization. Seven emission sources of PM2.5 trace elements were thereby identified: two types of upgrader emissions, soil, haul road dust, biomass burning, and two sources of mixed origin. The upgrader emissions, soil, and haul road dust sources were identified by both methodologies, and both methodologies identified a mixed source, but these exhibited more differences than similarities. The second upgrader emissions source and the biomass burning source were only resolved by the hourly and filter methodologies, respectively. The similarity of the receptor modeling results from the two methodologies provided reassurance as to the identity of the sources. Overall, much of the PM2.5-related trace elements were found to be anthropogenic, or at least to be aerosolized through anthropogenic activities. These emissions may in part explain the previously reported higher levels of trace elements in snow.
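A minimal sketch of the receptor-modelling step follows. The study used positive matrix factorization (PMF); scikit-learn's NMF is shown here as a freely available stand-in built on the same non-negativity idea (PMF additionally weights residuals by measurement uncertainty). The concentration matrix is synthetic:

```python
# Factor a samples x elements concentration matrix into non-negative source
# contributions and source profiles, as a stand-in for PMF.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(7)
true_profiles = rng.random((7, 15))        # 7 sources x 15 trace elements
contributions = rng.random((500, 7))       # 500 samples x 7 sources
X = contributions @ true_profiles + 0.01 * rng.random((500, 15))

model = NMF(n_components=7, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)                 # source contributions per sample
F = model.components_                      # source profiles (element signatures)
print("reconstruction error:", round(model.reconstruction_err_, 3))
# The profiles in F are then matched to known signatures (soil, haul road
# dust, upgrader emissions, ...) to assign physical identities to the factors.
```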

  19. Efficient solution methodology for calibrating the hemodynamic model using functional Magnetic Resonance Imaging (fMRI) measurements

    KAUST Repository

    Zambri, Brian

    2015-11-05

    Our aim is to propose a numerical strategy for retrieving accurately and efficiently the biophysiological parameters as well as the external stimulus characteristics corresponding to the hemodynamic mathematical model that describes changes in blood flow and blood oxygenation during brain activation. The proposed method employs the TNM-CKF method developed in [1], but in a prediction/correction framework. We present numerical results using both real and synthetic functional Magnetic Resonance Imaging (fMRI) measurements to highlight the performance characteristics of this computational methodology. © 2015 IEEE.

  20. Efficient solution methodology for calibrating the hemodynamic model using functional Magnetic Resonance Imaging (fMRI) measurements

    KAUST Repository

    Zambri, Brian; Djellouli, Rabia; Laleg-Kirati, Taous-Meriem

    2015-01-01

    Our aim is to propose a numerical strategy for retrieving accurately and efficiently the biophysiological parameters as well as the external stimulus characteristics corresponding to the hemodynamic mathematical model that describes changes in blood flow and blood oxygenation during brain activation. The proposed method employs the TNM-CKF method developed in [1], but in a prediction/correction framework. We present numerical results using both real and synthetic functional Magnetic Resonance Imaging (fMRI) measurements to highlight the performance characteristics of this computational methodology. © 2015 IEEE.

  1. Metabolic tumour volumes measured at staging in lymphoma: methodological evaluation on phantom experiments and patients

    International Nuclear Information System (INIS)

    Meignan, Michel; Sasanelli, Myriam; Itti, Emmanuel; Casasnovas, Rene Olivier; Luminari, Stefano; Fioroni, Federica; Coriani, Chiara; Masset, Helene; Gobbi, Paolo G.; Merli, Francesco; Versari, Annibale

    2014-01-01

The presence of a bulky tumour at staging on CT is an independent prognostic factor in malignant lymphomas. However, its prognostic value is limited in diffuse disease. Total metabolic tumour volume (TMTV) determined on ¹⁸F-FDG PET/CT could give a better evaluation of the total tumour burden and may help patient stratification. Different methods of TMTV measurement, established in phantoms simulating lymphoma tumours, were investigated and validated in 40 patients with Hodgkin lymphoma and diffuse large B-cell lymphoma. Data were processed by two nuclear medicine physicians in Reggio Emilia and Creteil. Nineteen phantoms filled with ¹⁸F-saline were scanned; these comprised spherical or irregular volumes from 0.5 to 650 cm³ with tumour-to-background ratios from 1.65 to 40. Volumes were measured with different SUVmax thresholds. In patients, TMTV was measured on PET at staging by two methods: volumes of individual lesions were measured using a fixed 41% SUVmax threshold (TMTV41) and a variable, visually adjusted SUVmax threshold (TMTVvar). In phantoms, the 41% threshold gave the best concordance between measured and actual volumes. Interobserver agreement was almost perfect. In patients, the agreement between the reviewers for the TMTV41 measurement was substantial (ρc = 0.986, CI 0.97-0.99) and the difference between the means was not significant (212 ± 218 cm³ for Creteil vs. 206 ± 219 cm³ for Reggio Emilia, P = 0.65). By contrast, the agreement was poor for TMTVvar. There was a significant direct correlation between TMTV41 and normalized LDH (r = 0.652, CI 0.42-0.8, P < 0.001); however, high TMTV41 could also be found in patients with stage 1/2 or nonbulky tumour. Measurement of baseline TMTV in lymphoma using a fixed 41% SUVmax threshold is reproducible and correlates with the other parameters for tumour mass evaluation. It should be evaluated in prospective studies. (orig.)
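A minimal sketch of the fixed-threshold measurement: within each lesion volume of interest, voxels at or above 41% of the lesion SUVmax are counted and converted to volume, and TMTV41 is the sum over lesions (the PET sub-volume below is synthetic):

```python
# Fixed-threshold metabolic tumour volume of one lesion VOI.
import numpy as np

def lesion_mtv(suv_voi, voxel_volume_cm3, threshold_frac=0.41):
    """MTV (cm^3) of one lesion VOI at a fixed fraction of its SUVmax."""
    mask = suv_voi >= threshold_frac * suv_voi.max()
    return mask.sum() * voxel_volume_cm3

rng = np.random.default_rng(3)
voi = 1.0 + rng.random((20, 20, 20))    # background SUV ~1-2 (synthetic)
voi[8:12, 8:12, 8:12] = 10.0            # hot lesion core
voxel_vol = 0.4 ** 3                    # cm^3 (4 mm isotropic voxels)

print(f"MTV = {lesion_mtv(voi, voxel_vol):.1f} cm^3")
# TMTV41 would sum this quantity over every lesion segmented at staging.
```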

  2. Improving Oncology Quality Measurement in Accountable Care: Filling Gaps with Cross-Cutting Measures.

    Science.gov (United States)

    Valuck, Tom; Blaisdell, David; Dugan, Donna P; Westrich, Kimberly; Dubois, Robert W; Miller, Robert S; McClellan, Mark

    2017-02-01

Payment for health care services, including oncology services, is shifting from volume-based fee-for-service to value-based accountable care. The objective of accountable care is to support providers with flexibility and resources to reform care delivery, accompanied by accountability for maintaining or improving outcomes while lowering costs. These changes depend on health care payers, systems, physicians, and patients having meaningful measures to assess care delivery and outcomes and to balance financial incentives for lowering costs while providing greater value. Gaps in accountable care measure sets may cause missed signals of problems in care and missed opportunities for improvement. Measures to balance financial incentives may be particularly important for oncology, where high-cost and increasingly targeted diagnostics and therapeutics intersect with the highly complex and heterogeneous needs and preferences of cancer patients. Moreover, the concept of value in cancer care, defined as the measure of outcomes achieved per costs incurred, is rarely incorporated into performance measurement. This article analyzes gaps in oncology measures in accountable care, discusses challenging measurement issues, and offers strategies for improving oncology measurement. Discern Health analyzed gaps in accountable care measure sets for 10 cancer conditions that were selected based on incidence and prevalence; impact on cost and mortality; a diverse range of high-cost diagnostic procedures and treatment modalities (e.g., genomic tumor testing, molecularly targeted therapies, and stereotactic radiotherapy); and disparities or performance gaps in patient care. We identified gaps by comparing accountable care set measures with high-priority measurement opportunities derived from practice guidelines developed by the National Comprehensive Cancer Network and other oncology specialty societies. We found significant gaps in accountable care measure sets across all 10 conditions.

  3. Applying Lean Six Sigma methodologies to improve efficiency, timeliness of care, and quality of care in an internal medicine residency clinic.

    Science.gov (United States)

    Fischman, Daniel

    2010-01-01

    Patients' connectedness to their providers has been shown to influence the success of preventive health and disease management programs. Lean Six Sigma methodologies were employed to study workflow processes, patient-physician familiarity, and appointment compliance to improve continuity of care in an internal medicine residency clinic. We used a rapid-cycle test to evaluate proposed improvements to the baseline-identified factors impeding efficient clinic visits. Time-study, no-show, and patient-physician familiarity data were collected to evaluate the effect of interventions to improve clinic efficiency and continuity of medical care. Forty-seven patients were seen in each of the intervention and control groups. The wait duration between the end of triage and the resident-patient encounter was statistically shorter for the intervention group. Trends toward shorter wait times for medical assistant triage and total encounter were also seen in the intervention group. On all measures of connectedness, both the physicians and patients in the intervention group showed a statistically significant increased familiarity with each other. This study shows that incremental changes in workflow processes in a residency clinic can have a significant impact on practice efficiency and adherence to scheduled visits for preventive health care and chronic disease management. This project used a structured "Plan-Do-Study-Act" approach.

  4. Improved FPGA controlled artificial vascular system for plethysmographic measurements

    Directory of Open Access Journals (Sweden)

    Laqua Daniel

    2016-09-01

Full Text Available The fetal oxygen saturation is an important parameter for determining the health status of a fetus, but until now it has mostly been acquired invasively. Transabdominal fetal pulse oximetry is a promising approach to measuring it non-invasively and continuously. The fetal pulse curve has to be extracted from the mixed signal of mother and fetus to determine the fetal oxygen saturation. For this purpose efficient algorithms are necessary, and they have to be evaluated under constant and reproducible test conditions. This paper presents an improved version of a phantom which can generate artificial pulse waves in a synthetic tissue phantom. The tissue phantom consists of several layers that mimic the different optical properties of the fetal and maternal tissue layers. Additionally, an artificial vascular system and a dome, which mimics the curvature of the belly of a pregnant woman, are incorporated. To obtain data on the pulse waves, several measurement methods are included to help understand the behavior of the signals gained from the pulse waves. Besides pressure sensors and a transmissive method, we integrated a capacitive approach that makes use of the so-called "Pin Oscillator" method. Apart from the enhancements in the tissue phantom and the measurements, we also improved the blood substitute used, which reproduces the different absorption characteristics of fetal and maternal blood. The results show that the phantom can generate pulse waves similar to natural ones. Furthermore, the phantom represents a reference that can be used to evaluate the algorithms for transabdominal fetal pulse oximetry.

  5. Measurements of air kerma index in computed tomography: a comparison among methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Alonso, T. C.; Mourao, A. P.; Da Silva, T. A., E-mail: alonso@cdtn.br [Universidade Federal de Minas Gerais, Programa de Ciencia y Tecnicas Nucleares, Av. Pres. Antonio Carlos 6627, Pampulha, 31270-901 Belo Horizonte, Minas Gerais (Brazil)

    2016-10-15

Computed tomography (CT) has become the most important and widely used technique for diagnostic purposes. As CT exams impart high doses to patients in comparison with other radiological techniques, reliable dosimetry is required. Dosimetry in CT is done in terms of the air kerma index, in air or in a phantom, measured by a pencil ionization chamber during a single X-ray tube rotation. In this work, a comparison was made among CT dosimetric quantities measured by an UNFORS pencil ionization chamber, MTS-N RADOS thermoluminescent dosimeters and GAFCHROMIC XR-CT radiochromic film. The three dosimetric systems were calibrated in X-ray reference radiations in a calibration laboratory. CT dosimetric quantities were then measured in a GE Medical Systems Bright Speed CT scanner using a PMMA trunk phantom, and the three dosimetric techniques were compared. (Author)
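A minimal sketch of the phantom dosimetry the systems were compared on, under the usual definitions: the pencil-chamber kerma-length product divided by the nominal beam width gives the air kerma index, and centre and periphery values combine into the weighted index (all numbers are illustrative):

```python
# Air kerma index from a 100 mm pencil-chamber reading, and the weighted
# index C_w from centre and periphery positions of a PMMA phantom.
def air_kerma_index(kerma_length_mGy_cm, n_slices, slice_mm):
    """C_100 (mGy): kerma-length product divided by nominal beam width n*T."""
    return kerma_length_mGy_cm / (n_slices * slice_mm / 10.0)

c_centre = air_kerma_index(12.0, 4, 5.0)   # central hole of the phantom
c_periph = air_kerma_index(18.0, 4, 5.0)   # mean over peripheral holes
c_w = (c_centre + 2.0 * c_periph) / 3.0    # weighted air kerma index
print(f"C_w = {c_w:.1f} mGy")
```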

  6. A new methodology for measuring time correlations and excite states of atoms and nuclei

    International Nuclear Information System (INIS)

    Cavalcante, M.A.

    1989-01-01

A system for measuring the time correlation of events in physical phenomena in the range 10⁻⁷ to 10⁵ s is proposed, and its results are presented. The system is based on a sequential time scale controlled by a precision quartz oscillator; the zero time of observation is set by means of a JK flip-flop operated by a negative pulse transition in coincidence with the pulse from the detector that marks the time zero of the event (the precedent pulse). This electronic system (named the digital chronoanalyzer) was used in the measurement of excited states of nuclei as well as in the determination of time fluctuations in physical phenomena, such as the time lag in a halogen Geiger counter and the measurement of the 60 keV excited state of ²³⁷Np. (author)

  7. Methodology of measurement of thermal neutron time decay constant in Canberra 35+ MCA system

    Energy Technology Data Exchange (ETDEWEB)

    Drozdowicz, K.; Gabanska, B.; Igielski, A.; Krynicka, E.; Woznicka, U. [Institute of Nuclear Physics, Cracow (Poland)

    1993-12-31

A method for measuring the thermal neutron time decay constant in small bounded media is presented. A 14 MeV pulsed neutron generator is the neutron source. The system for recording the die-away curve of thermal neutrons consists of a ³He detector and a multichannel time analyzer based on the Canberra 35+ analyzer with the MCS 7880 multiscaler module (microsecond range). Optimum parameters for the measuring system are considered. Experimental verification of the dead time of the instrumentation system is made, and a count-loss correction is incorporated into the data treatment. Attention is paid to evaluating with high accuracy the fundamental-mode decay constant of the registered decay curve. A new procedure for determining the decay constant by multiple recording of the die-away curve is presented, and results of test measurements are shown. (author). 11 refs, 12 figs, 4 tabs.
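A minimal sketch of the described data treatment: correct the die-away curve for dead-time losses, then fit the fundamental decay mode on the asymptotic part of the curve (all numbers are synthetic, and a non-paralyzable dead-time model is assumed):

```python
# Dead-time correction of a die-away curve followed by a least-squares fit
# of the fundamental decay mode. Synthetic data; non-paralyzable dead time.
import numpy as np
from scipy.optimize import curve_fit

tau = 1.0e-6                               # system dead time, s (assumed)
t = np.arange(0, 400e-6, 2e-6)             # channel centre times, s
rng = np.random.default_rng(5)

true_rate = 2.0e5 * np.exp(-t / 8.0e-5) + 1.0e3    # counts/s: decay + background
observed = true_rate / (1 + true_rate * tau)       # dead-time losses
observed = rng.normal(observed, np.sqrt(observed)) # toy counting noise

corrected = observed / (1 - observed * tau)        # count-loss correction

def model(t, A, lam, bg):
    return A * np.exp(-lam * t) + bg

sel = t > 1.0e-4                           # asymptotic part: higher spatial
popt, _ = curve_fit(model, t[sel],         # modes have already died away
                    corrected[sel], p0=[1e5, 1.0e4, 1e3])
print(f"fundamental decay constant = {popt[1]:.3e} 1/s")
```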

  8. Measurements of air kerma index in computed tomography: a comparison among methodologies

    International Nuclear Information System (INIS)

    Alonso, T. C.; Mourao, A. P.; Da Silva, T. A.

    2016-10-01

    Computed tomography (CT) has become the most important and widely used technique for diagnosis purpose. As CT exams impart high doses to patients in comparison to other radiologist techniques, reliable dosimetry is required. Dosimetry in CT is done in terms of air kerma index in air or in a phantom measured by a pencil ionization chamber under a single X-ray tube rotation. In this work, a comparison among CT dosimetric quantities measured by an UNFORS pencil ionization chamber, MTS-N RADOS thermoluminescent dosimeters and GAFCHROMIC XR-CT radiochromic film was done. The three dosimetric systems were properly calibrated in X-ray reference radiations in a calibration laboratory. CT dosimetric quantities were measured in CT Bright Speed GE Medical Systems Inc., scanner in a PMMA trunk phantom and a comparison among the three dosimetric techniques was done. (Author)

  9. Methodology of measurement of thermal neutron time decay constant in Canberra 35+ MCA system

    International Nuclear Information System (INIS)

    Drozdowicz, K.; Gabanska, B.; Igielski, A.; Krynicka, E.; Woznicka, U.

    1993-01-01

A method for measuring the thermal neutron time decay constant in small bounded media is presented. A 14 MeV pulsed neutron generator is the neutron source. The system for recording the die-away curve of thermal neutrons consists of a ³He detector and a multichannel time analyzer based on the Canberra 35+ analyzer with the MCS 7880 multiscaler module (microsecond range). Optimum parameters for the measuring system are considered. Experimental verification of the dead time of the instrumentation system is made, and a count-loss correction is incorporated into the data treatment. Attention is paid to evaluating with high accuracy the fundamental-mode decay constant of the registered decay curve. A new procedure for determining the decay constant by multiple recording of the die-away curve is presented, and results of test measurements are shown. (author). 11 refs, 12 figs, 4 tabs

  10. Methodology of measurement of thermal neutron time decay constant in Canberra 35+ MCA system

    Energy Technology Data Exchange (ETDEWEB)

    Drozdowicz, K; Gabanska, B; Igielski, A; Krynicka, E; Woznicka, U [Institute of Nuclear Physics, Cracow (Poland)

    1994-12-31

A method for measuring the thermal neutron time decay constant in small bounded media is presented. A 14 MeV pulsed neutron generator is the neutron source. The system for recording the die-away curve of thermal neutrons consists of a ³He detector and a multichannel time analyzer based on the Canberra 35+ analyzer with the MCS 7880 multiscaler module (microsecond range). Optimum parameters for the measuring system are considered. Experimental verification of the dead time of the instrumentation system is made, and a count-loss correction is incorporated into the data treatment. Attention is paid to evaluating with high accuracy the fundamental-mode decay constant of the registered decay curve. A new procedure for determining the decay constant by multiple recording of the die-away curve is presented, and results of test measurements are shown. (author). 11 refs, 12 figs, 4 tabs.

  11. Trends in Child Poverty Using an Improved Measure of Poverty.

    Science.gov (United States)

    Wimer, Christopher; Nam, JaeHyun; Waldfogel, Jane; Fox, Liana

    2016-04-01

    The official measure of poverty has been used to assess trends in children's poverty rates for many decades. But because of flaws in official poverty statistics, these basic trends have the potential to be misleading. We use an augmented Current Population Survey data set that calculates an improved measure of poverty to reexamine child poverty rates between 1967 and 2012. This measure, the Anchored Supplemental Poverty Measure, is based partially on the US Census Bureau and Bureau of Labor Statistics' new Supplemental Poverty Measure. We focus on 3 age groups of children, those aged 0 to 5, 6 to 11, and 12 to 17 years. Young children have the highest poverty rates, both historically and today. However, among all age groups, long-term poverty trends have been more favorable than official statistics would suggest. This is entirely due to the effect of counting resources from government policies and programs, which have reduced poverty rates substantially for children of all ages. However, despite this progress, considerable disparities in the risk of poverty continue to exist by education level and family structure. Copyright © 2016 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  12. Precision Measurement and Improvement of e+, e- Storage Rings

    International Nuclear Information System (INIS)

    Yan, Y.T.; Cai, Y.; Colocho, W.; Decker, F-J.; Seeman, J.; Sullivan, M.; Turner, J.; Wienands, U.; Woodley, M.; Yocky, G.

    2006-01-01

Through horizontal and vertical excitations, we have been able to make precision measurements of linear geometric optics parameters with Model-Independent Analysis (MIA). We have also been able to build a computer model that matches the real accelerator in linear geometric optics through an SVD-enhanced least-squares fitting process. Recently, with the addition of longitudinal excitation, we can build a computer virtual machine that matches the real accelerator in linear optics, including dispersion, without additional fitting variables. With this optics-matched virtual machine, we can find solutions that adjust selected normal and skew quadrupoles for machine optics improvement. This approach has made major contributions to improving PEP-II optics and luminosity. Examples from its application to the PEP-II machines are presented
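The SVD-enhanced least-squares step mentioned above can be sketched generically; the response matrix, fit variables, and cutoff below are placeholders, not the paper's actual fitting setup:

```python
# Generic sketch of an SVD-based least-squares fit of lattice parameters to
# measured orbit/optics data. A (rows: measurements, columns: fit variables)
# is typically ill-conditioned, so near-singular directions are truncated.
import numpy as np

def svd_lstsq(A, b, rel_cutoff=1e-6):
    """Solve min ||A x - b|| via truncated SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > rel_cutoff * s[0]          # discard near-singular directions
    return Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])
```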

  13. Can ensemble condition in a hall be improved and measured?

    DEFF Research Database (Denmark)

    Gade, Anders Christian

    1988-01-01

In collaboration with the Danish Broadcasting Corporation an extensive series of experiments has been carried out in The Danish Radio Concert Hall with the practical purpose of trying to improve the ensemble conditions on the platform for the resident symphony orchestra. First, a series ... of the ceiling reflectors; and (c) changing the position of the orchestra on the platform. These variables were then tested in full scale experiments in the hall including subjective evaluation by the orchestra in order to verify their effects under practical conditions. New objective parameters, which showed very high correlations with the subjective data, also made it possible to compare the improvements with conditions as recently measured in famous European Halls. Besides providing the needed results, the experiments also shed some light on how musicians change their criteria for judging acoustic...

  14. Quality improvement in neurology: dementia management quality measures.

    Science.gov (United States)

    Odenheimer, Germaine; Borson, Soo; Sanders, Amy E; Swain-Eng, Rebecca J; Kyomen, Helen H; Tierney, Samantha; Gitlin, Laura; Forciea, Mary Ann; Absher, John; Shega, Joseph; Johnson, Jerry

    2014-03-01

    Professional and advocacy organizations have long urged that dementia should be recognized and properly diagnosed. With the passage of the National Alzheimer's Project Act in 2011, an Advisory Council for Alzheimer's Research, Care, and Services was convened to advise the Department of Health and Human Services. In May 2012, the Council produced the first National Plan to address Alzheimer's disease, and prominent in its recommendations is a call for quality measures suitable for evaluating and tracking dementia care in clinical settings. Although other efforts have been made to set dementia care quality standards, such as those pioneered by RAND in its series Assessing Care of Vulnerable Elders (ACOVE), practitioners, healthcare systems, and insurers have not widely embraced implementation. This executive summary (full manuscript available at www.neurology.org) reports on a new measurement set for dementia management developed by an interdisciplinary Dementia Measures Work Group (DWG) representing the major national organizations and advocacy organizations concerned with the care of individuals with dementia. The American Academy of Neurology (AAN), the American Geriatrics Society, the American Medical Directors Association, the American Psychiatric Association, and the American Medical Association-convened Physician Consortium for Performance Improvement led this effort. The ACOVE measures and the measurement set described here apply to individuals whose dementia has already been identified and properly diagnosed. Although similar in concept to ACOVE, the DWG measurement set differs in several important ways; it includes all stages of dementia in a single measure set, calls for the use of functional staging in planning care, prompts the use of validated instruments in patient and caregiver assessment and intervention, highlights the relevance of using palliative care concepts to guide care before the advanced stages of illness, and provides evidence-based support

  15. A model to improve efficiency and effectiveness of safeguards measures

    International Nuclear Information System (INIS)

    D'Amato, Eduardo; Llacer, Carlos; Vicens, Hugo

    2001-01-01

Full text: The main purpose of our current studies is to analyse the measures to be adopted to integrate the traditional safeguards measures with those stated in the Additional Protocol (AP). A simplified nuclear fuel cycle model is considered to draw conclusions on the application of integrated safeguards measures. This paper includes a briefing describing the historical review that gave birth to the AP and proposes a model to help the control bodies in the decision-making process. In May 1997, the Board of Governors approved the Model Additional Protocol (MAP), which aimed at strengthening the effectiveness and improving the efficiency of safeguards measures. For States under a comprehensive safeguards agreement, the measures adopted provide credible assurance of the absence of undeclared nuclear material and activities. In September 1999, the governments of Argentina and Brazil formally announced in the Board of Governors that both countries would start preliminary consultations on an adapted MAP applied to the Agreement between the Republic of Argentina, the Federative Republic of Brazil, the Brazilian-Argentine Agency for Accounting and Control of Nuclear Materials and the International Atomic Energy Agency for the Application of Safeguards (Quadripartite Agreement/INFCIRC 435). In December 1999, a first draft of the above-mentioned document was provided as a starting point for discussion. During the year 2000 some modifications to the original draft took place. These were the initial steps in a process aimed at reaching, in each country, conditions adequate for adhering to the AP in the future. With the future AP implementation in mind, the safeguards officers of the Regulatory Body of Argentina (ARN) began to think about the future simultaneous application of the two types of safeguards measures, the traditional and the non-traditional ones, which should converge into an integrated system. By traditional safeguards it is understood quantitative

  16. High frequency measurement of P- and S-wave velocities on crystalline rock massif surface - methodology of measurement

    Science.gov (United States)

    Vilhelm, Jan; Slavík, Lubomír

    2014-05-01

For the purpose of non-destructive monitoring of rock properties in underground excavations it is possible to perform repeated high-accuracy P- and S-wave velocity measurements. This contribution deals with preliminary results gained during the preparation of a micro-seismic long-term monitoring system. The field velocity measurements were made by the pulse-transmission technique directly on the rock outcrop (granite) in the Bedrichov gallery (northern Bohemia). The gallery at the experimental site was excavated using a TBM (Tunnel Boring Machine) and is used for drinking water supply, which is conveyed in a pipe. The requirement for a stable, automatically operated measuring system led to the use of piezoceramic transducers both as seismic source and as receiver. The length of the measuring base on the gallery wall ranged from 0.5 to 3 meters. Different transducer coupling possibilities were tested, namely with regard to the repeatability of the velocity determination. The arrangement of the measuring system on the surface of the rock massif gives the S-transducers better sensitivity for P-wave measurement than the P-transducers; similarly, P-transducers were found more suitable for S-wave velocity determination than S-transducers. The frequency-dependent attenuation of the fresh rock massif limits the frequency content of the registered seismic signals. It was found that at source-receiver distances of 0.5 m and more, frequency components above 40 kHz are significantly attenuated; therefore 100 kHz transducers are most suitable for the excitation of the seismic wave. The limited frequency range should also be taken into account when choosing the shape of the electric impulse used to excite the piezoceramic transducer. A spike pulse generates a broad-band seismic signal, short in the time domain; however, its energy after low-pass filtration in the rock is significantly lower than the energy of a seismic signal generated by a square-wave pulse. Acknowledgments: This work was partially
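The pulse-transmission velocity computation itself is simple enough to sketch; the travel-time picking strategy below (cross-correlation) is one common choice and an assumption here, not necessarily the authors' picking method:

```python
# Illustrative sketch of pulse-transmission velocity measurement: travel time
# is picked as the lag maximizing the cross-correlation between the
# excitation pulse and the received signal; velocity = base length / time.
import numpy as np

def travel_time(src, rec, fs_hz):
    """Lag (s) maximizing cross-correlation of received vs. source signal."""
    xc = np.correlate(rec, src, mode="full")
    lag = np.argmax(xc) - (len(src) - 1)
    return max(lag, 0) / fs_hz

def wave_velocity(base_m, src, rec, fs_hz):
    t = travel_time(src, rec, fs_hz)
    return base_m / t if t > 0 else float("nan")
```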

  17. Improved GPS-based Satellite Relative Navigation Using Femtosecond Laser Relative Distance Measurements

    Directory of Open Access Journals (Sweden)

    Hyungjik Oh

    2016-03-01

Full Text Available This study developed an approach for improving Carrier-phase Differential Global Positioning System (CDGPS) based real-time satellite relative navigation by applying laser baseline measurement data. Robustness against the space operational environment was considered, and a Synthetic Wavelength Interferometer (SWI) algorithm based on a femtosecond laser measurement model was developed. The phase differences between two laser wavelengths were combined to measure precise distance. The generated laser data were used to improve the estimation accuracy of the float ambiguity of the CDGPS data. Real-time relative navigation simulations were performed using an extended Kalman filter algorithm. The combined GPS and laser relative navigation accuracy was compared with GPS-only relative navigation solutions to determine the impact of laser data on relative navigation. In numerical simulations, the success rate of integer ambiguity resolution increased when laser data were added to GPS data. The relative navigation errors also improved five-fold and two-fold relative to the GPS-only error, for 250 m and 5 km initial relative distances, respectively. The methodology developed in this study is suitable for application to future satellite formation-flying missions.
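The synthetic-wavelength idea behind the SWI ranging step can be illustrated in a few lines; the wavelengths, the round-trip assumption, and the function names below are illustrative, not the study's actual parameters:

```python
# Sketch of synthetic-wavelength ranging: two optical wavelengths lam1, lam2
# form a synthetic wavelength LAM = lam1*lam2/|lam1 - lam2|; the measured
# phase difference dphi resolves range within one LAM far less ambiguously
# than either optical phase alone.
import math

def synthetic_wavelength(lam1_m, lam2_m):
    return lam1_m * lam2_m / abs(lam1_m - lam2_m)

def range_from_phase(dphi_rad, lam_synth_m, n_cycles=0):
    """Distance for a given integer cycle count, assuming a reflective
    (round-trip) path, hence the factor 1/2."""
    return (n_cycles + dphi_rad / (2.0 * math.pi)) * lam_synth_m / 2.0

LAM = synthetic_wavelength(1550e-9, 1550.1e-9)  # ~24 mm synthetic wavelength
```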

  18. METHODOLOGICAL APPROACH FOR MEASURING PRIORITY DBPS IN REVERSE OSMOSIS CONCENTRATED DRINKING WATER

    Science.gov (United States)

    Many disinfection by-products (DBPs) are formed when drinking water is chlorinated, but only a few are routinely measured or regulated. Various studies have revealed a plethora of DBPs for which sensitive and quantitative analytical methods have always been a major limiting facto...

  19. A new integrative methodology for desertification studies based on magnetic and short-lived radioisotope measurements

    International Nuclear Information System (INIS)

    Oldfield, F.; Higgitt, S.R.; Maher, B.A.; Appleby, P.G.; Scoullos, M.

    1986-01-01

    The use of mineral magnetic measurements and short-lived radioisotope studies with 210 Pb and 137 Cs is discussed within the ecosystem watershed conceptual framework. Used in conjunction with geomorphological, sedimentological, palaeoecological and geochemical techniques, these methods can form the core of an integrated multidisciplinary study of desertification and erosion processes on all relevant temporal and spatial scales. 30 refs.; 4 figs

  20. Cerebral blood measurements in cerebral vascular disease: methodological and clinical aspects

    International Nuclear Information System (INIS)

    Fieschi, C.; Lenzi, G.L.

    1982-01-01

This paper is devoted mainly to studies performed on acute cerebral vascular disease using invasive techniques for the measurement of regional cerebral blood flow (rCBF). The principles of the rCBF method are outlined and the following techniques are described in detail: the xenon-133 inhalation method, the xenon-133 intravenous method and emission tomography methods. (C.F.)

  1. Complete methodology on generating realistic wind speed profiles based on measurements

    DEFF Research Database (Denmark)

    Gavriluta, Catalin; Spataru, Sergiu; Mosincat, Ioan

    2012-01-01

... wind modelling for medium and large time scales is poorly treated in the present literature. This paper presents methods for generating realistic wind speed profiles based on real measurements. The wind speed profile is divided into a low-frequency component (describing long term variations...

  2. A methodology to measure the effectiveness of academic recruitment and turnover

    DEFF Research Database (Denmark)

    Abramo, Giovanni; D’Angelo, Ciriaco Andrea; Rosati, Francesco

    2016-01-01

We propose a method to measure the effectiveness of the recruitment and turnover of professors, in terms of their research performance. The method presented is applied to the case of Italian universities over the period 2008–2012. The work then analyses the correlation between the indicators of eff...

  3. Automated landmark extraction for orthodontic measurement of faces using the 3-camera photogrammetry methodology.

    Science.gov (United States)

    Deli, Roberto; Di Gioia, Eliana; Galantucci, Luigi Maria; Percoco, Gianluca

    2010-01-01

The aim was to set up a three-dimensional photogrammetric scanning system for precise landmark measurements, without any physical contact, using a low-cost and noninvasive digital photogrammetric solution, to support several needs in clinical orthodontics and/or surgical diagnosis. Thirty coded targets were applied directly onto the subject's face on the soft tissue landmarks, and then 3 simultaneous photos were acquired using photogrammetry under room lighting conditions. For comparison, a dummy head was digitized both with the photogrammetric technique and with the laser scanner Minolta Vivid 910i (Konica Minolta, Tokyo, Japan). The precision of the landmark measurement ranged between 0.017 and 0.029 mm. The system automatically measures the spatial position of face landmarks, from which distances and angles can be obtained. The facial measurements were compared with those made using laser scanning and a manual caliper. The adopted method gives higher precision than the others (0.022-mm mean value on points and 0.038-mm mean value on linear distances on a dummy head), is simple, and can be used easily as a standard routine. The study demonstrated the validity of photogrammetry for accurate digitization of human face landmarks. This research points out the potential of this low-cost photogrammetry approach for medical digitization.

  4. Difference in blood pressure measurements between arms: methodological and clinical implications.

    Science.gov (United States)

    Clark, Christopher E

    2015-01-01

Differences in blood pressure measurements between arms are commonly encountered in clinical practice. If such differences are not excluded they can delay the diagnosis of hypertension and can lead to poorer control of blood pressure levels. Differences in blood pressure measurements between arms are associated cross-sectionally with other signs of vascular disease such as peripheral arterial disease or cerebrovascular disease. Differences are also associated prospectively with increased cardiovascular mortality and morbidity and all-cause mortality. The number of publications on the inter-arm difference is rising year on year, indicating a growing interest in the phenomenon. The prevalence of an inter-arm difference varies widely between reports and is correlated with the underlying cardiovascular risk of the population studied. Prevalence is also sensitive to the method of measurement used. This review discusses the prevalence of an inter-arm difference in different populations and addresses current best practice for the detection and measurement of a difference. The evidence for clinical and vascular associations of an inter-arm difference is presented in considering the emerging role of an inter-arm blood pressure difference as a novel risk factor for increased cardiovascular morbidity and mortality. Competing aetiological explanations for an inter-arm difference are explored, and gaps in our current understanding of this sign, along with areas in need of further research, are considered.

  5. Minimizing measurement uncertainties of coniferous needle-leaf optical properties, part I: methodological review

    NARCIS (Netherlands)

    Yanez Rausell, L.; Schaepman, M.E.; Clevers, J.G.P.W.; Malenovsky, Z.

    2014-01-01

    Optical properties (OPs) of non-flat narrow plant leaves, i.e., coniferous needles, are extensively used by the remote sensing community, in particular for calibration and validation of radiative transfer models at leaf and canopy level. Optical measurements of such small living elements are,

  6. Improvements to measuring water flux in the vadose zone.

    Science.gov (United States)

    Masarik, Kevin C; Norman, John M; Brye, Kristofor R; Baker, John M

    2004-01-01

    Evaluating the impact of land use practices on ground water quality has been difficult because few techniques are capable of monitoring the quality and quantity of soil water flow below the root zone without disturbing the soil profile and affecting natural flow processes. A recently introduced method, known as equilibrium tension lysimetry, was a major improvement but it was not a true equilibrium since it still required manual intervention to maintain proper lysimeter suction. We addressed this issue by developing an automated equilibrium tension lysimeter (AETL) system that continuously matches lysimeter tension to soil-water matric potential of the surrounding soil. The soil-water matric potential of the bulk soil is measured with a heat-dissipation sensor, and a small DC pump is used to apply suction to a lysimeter. The improved automated approach reported here was tested in the field for a 12-mo period. Powered by a small 12-V rechargeable battery, the AETLs were able to continuously match lysimeter suction to soil-water matric potential for 2-wk periods with minimal human attention, along with the added benefit of collecting continuous soil-water matric potential data. We also demonstrated, in the laboratory, methods for continuous measurement of water depth in the AETL, a capability that quantifies drainage on a 10-min interval, making it a true water-flux meter. Equilibrium tension lysimeters have already been demonstrated to be a reliable method of measuring drainage flux, and the further improvements have created a more effective device for studying water drainage and chemical leaching through the soil matrix.
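The matching logic of the automated equilibrium tension lysimeter lends itself to a short control-loop sketch. The sensor and pump interfaces and the deadband value below are invented for illustration; the paper's hardware and thresholds are not specified in this abstract:

```python
# Hypothetical sketch of the AETL control idea: keep lysimeter suction
# matched to the soil-water matric potential read from a heat-dissipation
# sensor by toggling a small DC pump. All interfaces are placeholders.
DEADBAND_KPA = 0.5   # assumed tolerance before the pump is toggled

def control_step(read_soil_suction_kpa, read_lysimeter_suction_kpa,
                 pump_on, pump_off):
    """One pass of the matching loop (magnitudes of suction, in kPa)."""
    target = read_soil_suction_kpa()      # bulk-soil matric suction
    actual = read_lysimeter_suction_kpa() # current lysimeter suction
    if actual < target - DEADBAND_KPA:    # suction too weak -> run pump
        pump_on()
    elif actual > target + DEADBAND_KPA:  # suction too strong -> stop/vent
        pump_off()

# e.g. call control_step(...) every few seconds from a battery-powered logger
```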

  7. Metabolic tumour volumes measured at staging in lymphoma: methodological evaluation on phantom experiments and patients

    Energy Technology Data Exchange (ETDEWEB)

    Meignan, Michel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Paris-Est University, Service de Medecine Nucleaire, EAC CNRS 7054, Hopital Henri Mondor AP-HP, Creteil (France); Sasanelli, Myriam; Itti, Emmanuel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Casasnovas, Rene Olivier [CHU Le Bocage, Department of Hematology, Dijon (France); Luminari, Stefano [University of Modena and Reggio Emilia, Department of Diagnostic, Clinic and Public Health Medicine, Modena (Italy); Fioroni, Federica [Santa Maria Nuova Hospital-IRCCS, Department of Medical Physics, Reggio Emilia (Italy); Coriani, Chiara [Santa Maria Nuova Hospital-IRCCS, Department of Radiology, Reggio Emilia (Italy); Masset, Helene [Henri Mondor Hospital, Department of Radiophysics, Creteil (France); Gobbi, Paolo G. [University of Pavia, Department of Internal Medicine and Gastroenterology, Fondazione IRCCS Policlinico San Matteo, Pavia (Italy); Merli, Francesco [Santa Maria Nuova Hospital-IRCCS, Department of Hematology, Reggio Emilia (Italy); Versari, Annibale [Santa Maria Nuova Hospital-IRCCS, Department of Nuclear Medicine, Reggio Emilia (Italy)

    2014-06-15

The presence of a bulky tumour at staging on CT is an independent prognostic factor in malignant lymphomas. However, its prognostic value is limited in diffuse disease. Total metabolic tumour volume (TMTV) determined on {sup 18}F-FDG PET/CT could give a better evaluation of the total tumour burden and may help patient stratification. Different methods of TMTV measurement established in phantoms simulating lymphoma tumours were investigated and validated in 40 patients with Hodgkin lymphoma and diffuse large B-cell lymphoma. Data were processed by two nuclear medicine physicians in Reggio Emilia and Creteil. Nineteen phantoms filled with {sup 18}F-saline were scanned; these comprised spherical or irregular volumes from 0.5 to 650 cm{sup 3} with tumour-to-background ratios from 1.65 to 40. Volumes were measured with different SUVmax thresholds. In patients, TMTV was measured on PET at staging by two methods: volumes of individual lesions were measured using a fixed 41% SUVmax threshold (TMTV{sub 41}) and a variable, visually adjusted SUVmax threshold (TMTV{sub var}). In phantoms, the 41% threshold gave the best concordance between measured and actual volumes, and interobserver agreement was almost perfect. In patients, the agreement between the reviewers for the TMTV{sub 41} measurement was substantial (ρ{sub c} = 0.986, CI 0.97 - 0.99) and the difference between the means was not significant (212 ± 218 cm{sup 3} for Creteil vs. 206 ± 219 cm{sup 3} for Reggio Emilia, P = 0.65). By contrast, the agreement was poor for TMTV{sub var}. There was a significant direct correlation between TMTV{sub 41} and normalized LDH (r = 0.652, CI 0.42 - 0.8, P < 0.001). Higher disease stages and bulky tumour were associated with higher TMTV{sub 41}, but high TMTV{sub 41} could be found in patients with stage 1/2 or nonbulky tumour. Measurement of baseline TMTV in lymphoma using a fixed 41% SUVmax threshold is reproducible and correlates with the other parameters for tumour mass evaluation
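The fixed-threshold TMTV computation is straightforward to sketch. Array layout, segmentation, and function names below are assumptions for illustration; they do not reproduce the authors' software:

```python
# Minimal sketch of the fixed-threshold TMTV idea: each segmented lesion is
# thresholded at 41% of its own SUVmax, and the supra-threshold voxel
# volumes are summed over all lesions.
import numpy as np

def tmtv41(suv, lesion_masks, voxel_volume_cm3):
    """suv: 3-D SUV array; lesion_masks: list of boolean arrays (one per
    lesion); returns total metabolic tumour volume in cm^3."""
    total = 0.0
    for mask in lesion_masks:
        lesion_suv = suv[mask]
        thresh = 0.41 * lesion_suv.max()
        total += np.count_nonzero(lesion_suv >= thresh) * voxel_volume_cm3
    return total
```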

  8. Methodological Considerations and Comparisons of Measurement Results for Extracellular Proteolytic Enzyme Activities in Seawater

    Directory of Open Access Journals (Sweden)

    Yumiko Obayashi

    2017-10-01

Full Text Available Microbial extracellular hydrolytic enzymes that degrade organic matter in aquatic ecosystems play key roles in the biogeochemical carbon cycle. To provide linkages between hydrolytic enzyme activities and genomic or metabolomic studies in aquatic environments, reliable measurements are required for many samples at one time. Extracellular proteases are one of the most important classes of enzymes in aquatic microbial ecosystems, and protease activities in seawater are commonly measured using fluorogenic model substrates. Here, we examined several concerns for measurements of extracellular protease activities (aminopeptidases, trypsin-type, and chymotrypsin-type activities) in seawater. Low-protein-binding 96-well microplates read with a fluorometric microplate reader produced reliable enzymatic activity readings, while regular polystyrene microplates produced readings with significant underestimation, especially for trypsin-type proteases. From the results of kinetic experiments, this underestimation was attributed to the adsorption of both enzymes and substrates onto the microplate. We also examined solvent type and concentration in the working solution of oligopeptide-analog fluorogenic substrates, using dimethyl sulfoxide (DMSO) and 2-methoxyethanol (MTXE). The results showed that both 2% DMSO and 2% MTXE (final concentrations of solvent in the mixture of seawater sample and substrate working solution) provide similarly reliable data for most of the tested substrates, except for some substrates which did not dissolve completely under these assay conditions. Sample containers are also important for maintaining the level of enzyme activity in natural seawater samples. In small polypropylene containers (e.g., standard 50-mL centrifuge tubes), protease activities in seawater samples decreased rapidly, causing underestimation of natural activities, especially for trypsin-type and chymotrypsin-type proteases. In

  9. Improvement of fire protection measures for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

Improvements of fire protection measures for nuclear power plants were performed on the following items: development of a fire hazard analysis method, and application of the developed Fire Dynamics Tools (FDT{sup S}) to actual plants. With regard to fire tests for fire data acquisition, a cable fire test and a High Energy Arcing Fault (HEAF) fire test were performed. Implementation of fire hazard analysis codes and simulation covered the following items: the fire analysis codes FDS, SYLVIA and CFAST were implemented in order to analyze fire progression phenomena, and a trial simulation of the HEAF accident at Onagawa NPP during the Tohoku earthquake was carried out. (author)

  10. Improvement of fire protection measures for nuclear power plants

    International Nuclear Information System (INIS)

    2012-01-01

Improvements of fire protection measures for nuclear power plants were performed on the following items: development of a fire hazard analysis method, and application of the developed Fire Dynamics Tool to actual plants. With regard to fire tests for fire data acquisition, a cable fire test and an oil fire test were performed. Implementation of fire hazard analysis codes and simulation covered the following items: the fire analysis codes FDS, SYLVIA and CFAST were implemented in order to analyze fire progression phenomena, and a trial fire hazard simulation of the metal-clad switchgear fire (HEAF accident) at Onagawa NPP during the Tohoku earthquake was carried out. (author)

  11. Building, measuring and improving public confidence in the nuclear regulator

    International Nuclear Information System (INIS)

    2006-01-01

    An important factor for public confidence in the nuclear regulator is the general public trust of the government and its representatives, which is clearly not the same in all countries. Likewise, cultural differences between countries can be considerable, and similar means of communication between government authorities and the public may not be universally effective. Nevertheless, this workshop identified a number of common principles for the communication of nuclear regulatory decisions that can be recommended to all regulators. They have been cited in particular for their ability to help build, measure and/or improve overall public confidence in the nuclear regulator. (author)

  12. Calculating first-order sensitivity measures: A benchmark of some recent methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Gatelli, D. [Joint Research Centre, European Commission, TP361, Institute of the Protection and Security of the Citizen, Via E. Fermi 2749, 21027 Ispra (Italy)], E-mail: Debora.gatelli@jrc.it; Kucherenko, S. [Imperial College London, London (United Kingdom); Ratto, M.; Tarantola, S. [Joint Research Centre, European Commission, TP361, Institute of the Protection and Security of the Citizen, Via E. Fermi 2749, 21027 Ispra (Italy)

    2009-07-15

This work compares three different global sensitivity analysis techniques, namely state-dependent parameter (SDP) modelling, random balance designs, and the improved formulas for the Sobol' sensitivity indices. These techniques are not yet commonly known in the literature. Strengths and weaknesses of each technique in terms of efficiency and computational cost are highlighted, thus enabling the user to choose the most suitable method depending on the computational model analysed. Two test functions proposed in the literature are considered. Computational costs and convergence rates for each function are compared and discussed.
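For orientation, a generic Monte Carlo pick-and-freeze estimator of a first-order Sobol' index is sketched below on the Ishigami test function. This is a standard baseline estimator, not one of the specific improved formulas benchmarked in the paper:

```python
# Generic first-order Sobol' index estimator (Saltelli-style pick-and-freeze)
# demonstrated on the Ishigami function with inputs uniform on [-pi, pi].
import numpy as np

def ishigami(X, a=7.0, b=0.1):
    return (np.sin(X[:, 0]) + a * np.sin(X[:, 1])**2
            + b * X[:, 2]**4 * np.sin(X[:, 0]))

def first_order_sobol(f, d, n, rng):
    A = rng.uniform(-np.pi, np.pi, (n, d))
    B = rng.uniform(-np.pi, np.pi, (n, d))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]               # freeze all inputs except x_i
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S

S = first_order_sobol(ishigami, 3, 100_000, np.random.default_rng(0))
```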

  13. Current status and methodological aspects on the measurement of glomerular filtration rate

    International Nuclear Information System (INIS)

    Froissart, M.; Hignette, C.; Kolar, P.; Prigent, A.; Paillard, M.

    1995-01-01

Determination of the glomerular filtration rate (GFR) contributes to our understanding of kidney physiology and pathophysiology. Moreover, determination of GFR is of clinical importance in assessing the diagnosis and the progression of renal disease. The purpose of this article is to review the technical performance and results of GFR measurements, including the classical inulin clearance technique and more recent alternative clearance techniques using radioisotope-labelled filtration markers, bolus infusion and spontaneous bladder emptying. Some simplified techniques avoiding urine collection are also described. We conclude that estimation of GFR from renal, and in some cases plasma, clearances is accurate and more convenient than the classical inulin clearance technique. Such measurements of GFR should be included both in clinical practice and in clinical research. (authors). 80 refs., 5 figs., 1 tab
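The clearance arithmetic underlying all of these techniques is standard and worth stating explicitly; the numbers below are illustrative, not data from the review:

```python
# Worked sketch of the standard renal clearance formula:
# clearance = (urine concentration x urine flow) / plasma concentration,
# expressed in mL/min for a freely filtered marker such as inulin.
def renal_clearance(u_mg_per_ml, v_ml_per_min, p_mg_per_ml):
    """GFR estimate from a urinary clearance measurement."""
    return u_mg_per_ml * v_ml_per_min / p_mg_per_ml

# e.g. U = 30 mg/mL, V = 1.2 mL/min, P = 0.3 mg/mL -> GFR = 120 mL/min
gfr = renal_clearance(30.0, 1.2, 0.3)
```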

  14. Critical experiments, measurements, and analyses to establish a crack arrest methodology for nuclear pressure vessel steels

    International Nuclear Information System (INIS)

    Hahn, G.T.

    1977-01-01

Substantial progress was made in three important areas: crack propagation and arrest theory, two-dimensional dynamic crack propagation analyses, and a laboratory test method for the material property data base. The major findings were as follows: Measurements of run-arrest events lent support to the dynamic, energy conservation theory of crack arrest. A two-dimensional, dynamic, finite-difference analysis, including inertia forces and thermal gradients, was developed. The analysis was successfully applied to run-arrest events in DCB (double-cantilever-beam) and SEN (single-edge-notched) test pieces. A simplified procedure for measuring K{sub D} and K{sub Im} values with ordinary and duplex DCB specimens was demonstrated. The procedure employs a dynamic analysis of the crack length at arrest and requires no special instrumentation. The new method was applied to 'duplex' specimens to measure the large K{sub D} values displayed by A533B steel above the nil-ductility temperature. K{sub D} crack velocity curves and K{sub Im} values of two heats of A533B steel and the corresponding values of the plane-strain fracture toughness associated with static initiation (K{sub Ic}), dynamic initiation (K{sub Id}), and the static stress intensity at crack arrest (K{sub Ia}) were measured. Possible relations among these toughness indices are identified. During the past year the principal investigators of the participating groups reached agreement on a crack arrest theory appropriate for the pressure vessel problem. 7 figures

  15. A Performance Measurement and Implementation Methodology in a Department of Defense CIM (Computer Integrated Manufacturing) Environment

    Science.gov (United States)

    1988-01-24

vanes. The new facility is currently being called the Engine Blade/Vane Facility (EB/VF). There are three primary goals in automating this proc... ...earlier, the search led primarily into the areas of CIM Justification, Automation Strategies, Performance Measurement, and Integration issues. Of... ...of living, has been steadily eroding. One dangerous trend that has developed in keenly competitive world markets, says Rohan [33], has been for U.S

  16. Bone mineral content measurement in small infants by single-photon absorptiometry: current methodologic issues

    International Nuclear Information System (INIS)

    Steichen, J.J.; Asch, P.A.; Tsang, R.C.

    1988-01-01

    Single-photon absorptiometry (SPA), developed in 1963 and adapted for infants by Steichen et al. in 1976, is an important tool to quantitate bone mineralization in infants. Studies of infants in which SPA was used include studies of fetal bone mineralization and postnatal bone mineralization in very low birth weight infants. The SPA technique has also been used as a research tool to investigate longitudinal bone mineralization and to study the effect of nutrition and disease processes such as rickets or osteopenia of prematurity. At present, it has little direct clinical application for diagnosing bone disease in single patients. The bones most often used to measure bone mineral content (BMC) are the radius, the ulna, and, less often, the humerus. The radius appears to be preferred as a suitable bone to measure BMC in infants. It is easily accessible; anatomic reference points are easily palpated and have a constant relationship to the radial mid-shaft site; soft tissue does not affect either palpation of anatomic reference points or BMC quantitation in vivo. The peripheral location of the radius minimizes body radiation exposure. Trabecular and cortical bone can be measured separately. Extensive background studies exist on radial BMC in small infants. Most important, the radius has a relatively long zone of constant BMC. Finally, SPA for BMC in the radius has a high degree of precision and accuracy. 61 references

  17. Manipulating measurement scales in medical statistical analysis and data mining: A review of methodologies

    Directory of Open Access Journals (Sweden)

    Hamid Reza Marateb

    2014-01-01

Full Text Available Background: Selecting the correct statistical test and data mining method depends highly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are studied in detail, and statistical comparison, modeling, and data mining methods are discussed using several medical examples. We present two ordinal-variable clustering examples, ordinal variables being more challenging in analysis, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed by two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold standard groups of malignant and benign cases that had been identified by clinical tests. Results: The sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively. Their specificity was comparable. Conclusion: By using an appropriate clustering algorithm based on the measurement scale of the variables in the study, high performance is granted. Moreover, descriptive and inferential statistics, as well as the modeling approach, must be selected based on the scale of the variables.

  18. Manipulating measurement scales in medical statistical analysis and data mining: A review of methodologies

    Science.gov (United States)

    Marateb, Hamid Reza; Mansourian, Marjan; Adibi, Peyman; Farina, Dario

    2014-01-01

Background: Selecting the correct statistical test and data mining method depends highly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are studied in detail, and statistical comparison, modeling, and data mining methods are discussed using several medical examples. We present two ordinal-variable clustering examples, ordinal variables being more challenging in analysis, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed by two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold standard groups of malignant and benign cases that had been identified by clinical tests. Results: The sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively. Their specificity was comparable. Conclusion: By using an appropriate clustering algorithm based on the measurement scale of the variables in the study, high performance is granted. Moreover, descriptive and inferential statistics, as well as the modeling approach, must be selected based on the scale of the variables. PMID:24672565

  19. Statistical inference with quantum measurements: methodologies for nitrogen vacancy centers in diamond

    Science.gov (United States)

    Hincks, Ian; Granade, Christopher; Cory, David G.

    2018-01-01

    The analysis of photon count data from the standard nitrogen vacancy (NV) measurement process is treated as a statistical inference problem. This has applications toward gaining better and more rigorous error bars for tasks such as parameter estimation (e.g. magnetometry), tomography, and randomized benchmarking. We start by providing a summary of the standard phenomenological model of the NV optical process in terms of Lindblad jump operators. This model is used to derive random variables describing emitted photons during measurement, to which finite visibility, dark counts, and imperfect state preparation are added. NV spin-state measurement is then stated as an abstract statistical inference problem consisting of an underlying biased coin obstructed by three Poisson rates. Relevant frequentist and Bayesian estimators are provided, discussed, and quantitatively compared. We show numerically that the risk of the maximum likelihood estimator is well approximated by the Cramér-Rao bound, for which we provide a simple formula. Of the estimators, we in particular promote the Bayes estimator, owing to its slightly better risk performance, and straightforward error propagation into more complex experiments. This is illustrated on experimental data, where quantum Hamiltonian learning is performed and cross-validated in a fully Bayesian setting, and compared to a more traditional weighted least squares fit.
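The core inference problem described above admits a compact closed-form maximum-likelihood estimate when the reference rates are known; the sketch below illustrates this simplified case with invented calibration values, not the full Bayesian treatment of the paper:

```python
# Illustrative sketch: photon counts n ~ Poisson(p*alpha + (1-p)*beta), where
# p is the underlying spin-state ("coin") probability and alpha, beta are
# bright/dark reference rates (assumed calibrated; values here are invented).
import numpy as np

def mle_p(counts, alpha, beta):
    """Closed-form MLE of p for known rates, clipped to [0, 1]; follows
    from d/dp log L = 0 for i.i.d. Poisson counts: p = (nbar - beta)/(alpha - beta)."""
    nbar = np.mean(counts)
    return float(np.clip((nbar - beta) / (alpha - beta), 0.0, 1.0))

rng = np.random.default_rng(1)
alpha, beta, p_true = 120.0, 80.0, 0.3       # photons per shot, true p
counts = rng.poisson(p_true * alpha + (1 - p_true) * beta, size=1000)
p_hat = mle_p(counts, alpha, beta)
```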

Dual photon absorptiometry measurement of the lumbar bone mineral content. Methodology - Reproducibility - Normal values

    International Nuclear Information System (INIS)

    Braillon, P.; Duboeuf, F.; Delmas, P.D.; Meunier, P.J.

    1987-01-01

Measurements were made with a DPA apparatus (Novo Lab 22a) on different phantoms and on volunteers in an attempt to evaluate the precision of the system. The reproducibility was found to be in the range of 0.98 to 4.10% for in vitro measurements, depending on the geometry of the phantoms used, and in the range of 1.6 to 2.94% for volunteers after repositioning. Secondly, the BMD in the lumbar spine of normal women and normal men was estimated. In control females, BMD is well fitted by a cubic regression on age. The maximum value of BMD is found in this case at age 31.5, and the maximum rate of bone loss occurs at age 57. Total bone loss between age 31.5 and old age is about 32%. In control males, results are more scattered and are represented by a simple linear regression. The average mineral loss between 30 and 80 years is 11.5% in this area of measurement [fr]

  1. Measurement of leukocyte rheology in vascular disease: clinical rationale and methodology. International Society of Clinical Hemorheology.

    Science.gov (United States)

    Wautier, J L; Schmid-Schönbein, G W; Nash, G B

    1999-01-01

The measurement of leukocyte rheology in vascular disease is a recent development with a wide range of new opportunities. The International Society of Clinical Hemorheology has asked an expert panel to propose guidelines for the investigation of leukocyte rheology in clinical situations. This article first discusses the mechanical, adhesive and related functional properties of leukocytes (especially neutrophils) which influence their circulation, and establishes the rationale for clinically related measurements of the parameters which describe them. It is concluded that quantitation of leukocyte adhesion molecules and of their endothelial receptors may assist understanding of leukocyte behaviour in vascular disease, along with measurements of the flow resistance of leukocytes, free radical production, degranulation and gene expression. For instance, vascular cell adhesion molecule (VCAM-1) is abnormally present on endothelial cells in atherosclerosis, diabetes mellitus and inflammatory conditions. Soluble forms of intercellular adhesion molecule (ICAM-1) or VCAM can be found elevated in the blood of patients with rheumatoid arthritis or infectious disease. In the second part of the article, possible technical approaches are presented and possible avenues for leukocyte rheological investigations are discussed.

  2. Methodology for measurement of diesel particle size distributions from a city bus working in real traffic conditions

    International Nuclear Information System (INIS)

    Armas, O; Gómez, A; Mata, C

    2011-01-01

    The study of particulate matter (PM) and nitrogen oxides emissions of diesel engines is nowadays a necessary step towards pollutant emission reduction. For a complete evaluation of PM emissions and its size characterization, one of the most challenging goals is to adapt the available techniques and the data acquisition procedures to the measurement and to propose a methodology for the interpretation of instantaneous particle size distributions (PSD) of combustion-derived particles produced by a vehicle during real driving conditions. In this work, PSD from the exhaust gas of a city bus operated in real driving conditions with passengers have been measured. For the study, the bus was equipped with a rotating disk diluter coupled to an air supply thermal conditioner (with an evaporating tube), the latter being connected to a TSI Engine Exhaust Particle Sizer spectrometer. The main objective of this work has been to propose an alternative procedure for evaluating the influence of several transient sequences on PSD emitted by a city bus used in real driving conditions with passengers. The transitions studied were those derived from the combination of four possible sequences or categories during real driving conditions: idle, acceleration, deceleration with fuel consumption and deceleration without fuel consumption. The analysis methodology used in this work proved to be a useful tool for a better understanding of the phenomena related to the determination of PSD emitted by a city bus during real driving conditions with passengers

  3. Comparison of noise power spectrum methodologies in measurements by using various electronic portal imaging devices in radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Son, Soon Yong [Dept. of Radiological Technology, Wonkwang Health Science University, Iksan (Korea, Republic of); Choi, Kwan Woo [Dept. of Radiology, Asan Medical Center, Seoul (Korea, Republic of); Jeong, Hoi Woun [Dept. of Radiological Technology, Baekseok Culture University College, Cheonan (Korea, Republic of); Kwon, Kyung Tae [Dep. of Radiological Technology, Dongnam Health University, Suwon (Korea, Republic of); Kim, Ki Won [Dept. of Radiology, Kyung Hee University Hospital at Gang-dong, Seoul (Korea, Republic of); Lee, Young Ah; Son, Jin Hyun; Min, Jung Whan [Shingu University College, Sungnam (Korea, Republic of)

    2016-03-15

The noise power spectrum (NPS) is one of the most general methods for measuring the noise amplitude and the quality of an image acquired from a uniform radiation field. The purpose of this study was to compare different NPS methodologies using megavoltage X-ray energies. The NPS evaluation methods used in diagnostic radiology were applied to therapy imaging following the International Electrotechnical Commission standard (IEC 62220-1). Various radiation therapy (RT) devices, such as TrueBeam{sup TM} (Varian), BEAMVIEW{sup PLUS} (Siemens), iViewGT (Elekta) and Clinac iX (Varian), were used. In order to select the region of interest (ROI) for the NPS, we considered the following four factors: the overlapping impact, the non-overlapping impact, the flatness and the penumbra. As for the NPS results, iViewGT (Elekta) showed the highest noise amplitude, compared to BEAMVIEW{sup PLUS} (Siemens), TrueBeam{sup TM} (Varian) with flattening filter, Clinac iX aS1000 (Varian) and TrueBeam{sup TM} (Varian) flattening filter free. The present study revealed that various factors could be employed to produce megavoltage imaging (MVI) NPS and to serve as a baseline standard for NPS methodology control in MVI.
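A generic 2-D NPS estimator in the spirit of IEC 62220-1 is sketched below; ROI preparation, detrending, and normalization details vary between implementations and the paper's exact processing is not reproduced:

```python
# Generic 2-D NPS estimator: ensemble-average the squared FFT magnitude of
# mean-subtracted ROIs extracted from uniform-field images, scaled by the
# pixel area over the ROI size.
import numpy as np

def nps_2d(rois, px_mm):
    """rois: array (n_roi, N, N) of uniform-image patches; px_mm: pixel
    pitch in mm. Returns the 2-D NPS in units of signal^2 * mm^2."""
    n, N, _ = rois.shape
    acc = np.zeros((N, N))
    for roi in rois:
        acc += np.abs(np.fft.fft2(roi - roi.mean()))**2
    return acc / n * (px_mm * px_mm) / (N * N)
```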

  4. Fiber-Optic Temperature and Pressure Sensors Applied to Radiofrequency Thermal Ablation in Liver Phantom: Methodology and Experimental Measurements

    Directory of Open Access Journals (Sweden)

    Daniele Tosi

    2015-01-01

Full Text Available Radiofrequency thermal ablation (RFA) is a procedure aimed at interventional cancer care and is applied to the treatment of small- and midsize tumors in lung, kidney, liver, and other tissues. RFA generates a selective high-temperature field in the tissue; temperature values and their persistence are directly related to the mortality rate of tumor cells. Temperature measurement in up to 3–5 points, using electrical thermocouples, belongs to the present clinical practice of RFA and is the foundation of a physical model of the ablation process. Fiber-optic sensors allow extending the detection of biophysical parameters to a vast plurality of sensing points, using miniature and noninvasive technologies that do not alter the RFA pattern. This work addresses the methodology for optical measurement of temperature distribution and pressure using four different fiber-optic technologies: fiber Bragg gratings (FBGs), linearly chirped FBGs (LCFBGs), a Rayleigh scattering-based distributed temperature system (DTS), and extrinsic Fabry-Perot interferometry (EFPI). For each instrument, the methodology for ex vivo sensing, as well as experimental results, is reported, leading to the application of fiber-optic technologies in vivo. The possibility of using a fiber-optic sensor network, in conjunction with a suitable ablation device, can enable smart ablation procedures in which ablation parameters are dynamically adjusted.

  5. Methodology for measurement of diesel particle size distributions from a city bus working in real traffic conditions

    Science.gov (United States)

    Armas, O.; Gómez, A.; Mata, C.

    2011-10-01

    The study of particulate matter (PM) and nitrogen oxides emissions of diesel engines is nowadays a necessary step towards pollutant emission reduction. For a complete evaluation of PM emissions and its size characterization, one of the most challenging goals is to adapt the available techniques and the data acquisition procedures to the measurement and to propose a methodology for the interpretation of instantaneous particle size distributions (PSD) of combustion-derived particles produced by a vehicle during real driving conditions. In this work, PSD from the exhaust gas of a city bus operated in real driving conditions with passengers have been measured. For the study, the bus was equipped with a rotating disk diluter coupled to an air supply thermal conditioner (with an evaporating tube), the latter being connected to a TSI Engine Exhaust Particle Sizer spectrometer. The main objective of this work has been to propose an alternative procedure for evaluating the influence of several transient sequences on PSD emitted by a city bus used in real driving conditions with passengers. The transitions studied were those derived from the combination of four possible sequences or categories during real driving conditions: idle, acceleration, deceleration with fuel consumption and deceleration without fuel consumption. The analysis methodology used in this work proved to be a useful tool for a better understanding of the phenomena related to the determination of PSD emitted by a city bus during real driving conditions with passengers.

  6. Health Data Entanglement and artificial intelligence-based analysis: a brand new methodology to improve the effectiveness of healthcare services

    OpenAIRE

    Capone, A.; Cicchetti, A.; Mennini, F. S.; Marcellusi, A.; Baio, G.; Favato, G.

    2016-01-01

Healthcare expenses will be the most relevant policy issue for most governments in the EU and in the USA. This expenditure can be associated with two major key categories: demographic and economic drivers. The factors driving healthcare expenditure have rarely been recognised, measured and fully understood. An improvement of health data generation and analysis is mandatory, and in order to tackle healthcare spending growth, it may be useful to design and implement an effective, advanced system to generate...

  7. Translation and linguistic validation of the Pediatric Patient-Reported Outcomes Measurement Information System measures into simplified Chinese using cognitive interviewing methodology.

    Science.gov (United States)

    Liu, Yanyan; Hinds, Pamela S; Wang, Jichuan; Correia, Helena; Du, Shizheng; Ding, Jian; Gao, Wen Jun; Yuan, Changrong

    2013-01-01

    The Pediatric Patient-Reported Outcomes Measurement Information System (PROMIS) measures were developed using modern measurement theory and tested in a variety of settings to assess the quality of life, function, and symptoms of children and adolescents experiencing a chronic illness and its treatment. Developed in English, this set of measures had not been translated into Chinese. The objective of this study was to develop the Chinese version of the Pediatric PROMIS measures (C-Ped-PROMIS), specifically 8 short forms, and to pretest the translated measures in children and adolescents through cognitive interviewing methodology. The C-Ped-PROMIS was developed following the standard Functional Assessment of Chronic Illness Therapy Translation Methodology. Bilingual teams from the United States and China reviewed the translation to develop a provisional version, which was then pretested with cognitive interview by probing 10 native Chinese-speaking children aged 8 to 17 years in China. The translation was finalized by the bilingual teams. Most items, response options, and instructions were well understood by the children, and some revisions were made to address patient's comments during the cognitive interview. The results indicated that the C-Ped-PROMIS items were semantically and conceptually equivalent to the original. Children aged 8 to 17 years in China were able to comprehend these measures and express their experience and feelings about illness or their life. The C-Ped-PROMIS is available for psychometric validation. Future work will be directed at translating the rest of the item banks, calibrating them and creating a Chinese final version of the short forms.

  8. An EPR methodology for measuring the London penetration depth for the ceramic superconductors

    Science.gov (United States)

    Rakvin, B.; Mahl, T. A.; Dalal, N. S.

    1990-01-01

The use of electron paramagnetic resonance (EPR) as a quick and easily accessible method for measuring the London penetration depth, λ, in the high-T{sub c} superconductors is discussed. The method utilizes the broadening, due to the emergence of the magnetic flux lattice, of the EPR signal of a free radical adsorbed on the surface of the sample. The second moment of the EPR signal below T{sub c} is fitted to the Brandt equation for a simple triangular lattice. The precision of this method compares quite favorably with those of more standard methods such as μ{sup +}SR, neutron scattering, and magnetic susceptibility.
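The inversion step implied above can be sketched with the commonly quoted Brandt result for an ideal triangular vortex lattice at intermediate fields, ⟨ΔB²⟩ ≈ 0.00371 Φ₀²/λ⁴; whether this is the precise form of "the Brandt equation" used by the authors is an assumption here:

```python
# Sketch (under the assumption above): invert the second moment of the
# field distribution to obtain the London penetration depth.
PHI0_T_M2 = 2.067833848e-15  # magnetic flux quantum (T*m^2)

def penetration_depth_m(second_moment_T2):
    """lambda from <dB^2> via <dB^2> = 0.00371 * Phi0^2 / lambda^4."""
    return (0.00371 * PHI0_T_M2**2 / second_moment_T2) ** 0.25

# e.g. a second moment of 1e-6 T^2 (~1 mT width) -> lambda of order 350 nm
lam = penetration_depth_m(1.0e-6)
```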

  9. Improvement of gel strength and melting point of fish gelatin by addition of coenhancers using response surface methodology.

    Science.gov (United States)

    Koli, Jayappa M; Basu, Subrata; Nayak, Binay B; Kannuchamy, Nagalakshmi; Gudipati, Venkateshwarlu

    2011-08-01

Fish gelatin is a potential alternative to mammalian gelatin. However, poor gel strength and a low melting point limit its applications. This study aimed at improving these properties by adding coenhancers in the ranges obtained from response surface methodology (RSM) using a Box-Behnken design. Three different coenhancers, MgSO₄, sucrose, and transglutaminase, were used as the independent variables for improving the gel strength and melting point of gelatin extracted from tiger-toothed croaker (Otolithes ruber). Addition of coenhancers at different combinations resulted in gel strength and melting point in the ranges of 150.5 to 240.5 g and 19.5 to 22.5 °C, respectively. The optimal concentrations of coenhancers for the predicted maximum gel strength (242.8 g) obtained by RSM were 0.23 M MgSO₄, 12.60% sucrose (w/v), and 5.92 mg/g transglutaminase; for the predicted maximum melting point (22.57 °C), the values were 0.24 M MgSO₄, 10.44% sucrose (w/v), and 5.72 mg/g transglutaminase. By adding coenhancers at these optimal concentrations in verification experiments, the gel strength and melting point were improved from 170 to 240.89 g and from 20.3 to 22.7 °C, respectively. These experimental values agreed well with the predicted values, demonstrating the fitness of the models. Results from the present study clearly revealed that the addition of coenhancers at a particular combination can improve the gel strength and melting point of fish gelatin and so enhance its range of applications. There is a growing interest in the use of fish gelatin as an alternative to mammalian gelatin. However, the poor gel strength and low melting point of fish gelatin have limited its commercial applications. The gel strength and melting point of fish gelatin can be increased by incorporation of coenhancers such as magnesium sulphate, sucrose, and transglutaminase. Results of this work help to produce fish gelatin suitable for a wide range of applications in the food industry. © 2011 Institute
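The RSM fitting step is generic enough to sketch. The design matrix and responses below are placeholders, not the paper's data; a standard second-order (quadratic) response surface is fitted by least squares to coded Box-Behnken points:

```python
# Hypothetical sketch of the RSM step: fit a quadratic response surface to
# Box-Behnken design points in three coded factors (e.g. MgSO4, sucrose,
# transglutaminase) and predict the response at candidate settings.
import numpy as np

def quad_features(X):
    """[1, x1..x3, x1^2..x3^2, x1x2, x1x3, x2x3] for each design point."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

def fit_rsm(X_coded, y):
    """Least-squares coefficients of the second-order response surface."""
    beta, *_ = np.linalg.lstsq(quad_features(X_coded), y, rcond=None)
    return beta

def predict(beta, X_coded):
    return quad_features(np.atleast_2d(X_coded)) @ beta
```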

  10. Combining tracer flux ratio methodology with low-flying aircraft measurements to estimate dairy farm CH4 emissions

    Science.gov (United States)

    Daube, C.; Conley, S.; Faloona, I. C.; Yacovitch, T. I.; Roscioli, J. R.; Morris, M.; Curry, J.; Arndt, C.; Herndon, S. C.

    2017-12-01

    Livestock activity, enteric fermentation of feed and anaerobic digestion of waste, contributes significantly to the methane budget of the United States (EPA, 2016). Studies question the reported magnitude of these methane sources (Miller et. al., 2013), calling for more detailed research of agricultural animals (Hristov, 2014). Tracer flux ratio is an attractive experimental method to bring to this problem because it does not rely on estimates of atmospheric dispersion. Collection of data occurred during one week at two dairy farms in central California (June, 2016). Each farm varied in size, layout, head count, and general operation. The tracer flux ratio method involves releasing ethane on-site with a known flow rate to serve as a tracer gas. Downwind mixed enhancements in ethane (from the tracer) and methane (from the dairy) were measured, and their ratio used to infer the unknown methane emission rate from the farm. An instrumented van drove transects downwind of each farm on public roads while tracer gases were released on-site, employing the tracer flux ratio methodology to assess simultaneous methane and tracer gas plumes. Flying circles around each farm, a small instrumented aircraft made measurements to perform a mass balance evaluation of methane gas. In the course of these two different methane quantification techniques, we were able to validate yet a third method: tracer flux ratio measured via aircraft. Ground-based tracer release rates were applied to the aircraft-observed methane-to-ethane ratios, yielding whole-site methane emission rates. Never before has the tracer flux ratio method been executed with aircraft measurements. Estimates from this new application closely resemble results from the standard ground-based technique to within their respective uncertainties. Incorporating this new dimension to the tracer flux ratio methodology provides additional context for local plume dynamics and validation of both ground and flight-based data.
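The tracer flux ratio arithmetic itself is compact; the sketch below uses a regression through the origin of the downwind enhancements, which is one common reduction and an assumption here (the teams' actual plume integration details are not reproduced):

```python
# Minimal sketch of the tracer flux ratio calculation: the unknown CH4
# emission rate is the released C2H6 rate scaled by the slope of the CH4
# enhancement against the C2H6 enhancement (background-subtracted mole
# fractions), corrected for the molar-mass ratio.
import numpy as np

def methane_rate(q_tracer_kg_h, d_ch4_ppb, d_c2h6_ppb):
    """Whole-site CH4 emission (kg/h); 16.04 and 30.07 g/mol are the molar
    masses of CH4 and C2H6."""
    slope = np.sum(d_ch4_ppb * d_c2h6_ppb) / np.sum(d_c2h6_ppb**2)
    return q_tracer_kg_h * slope * (16.04 / 30.07)
```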

  11. In situ measurement of heavy metals in water using portable EDXRF and APDC pre-concentration methodology

    International Nuclear Information System (INIS)

    Melquiades, Fabio L.; Parreira, Paulo S.; Appoloni, Carlos R.; Silva, Wislley D.; Lopes, Fabio

    2007-01-01

    With the objective of identifying and quantifying metals in water and obtaining results at the sampling site, an Energy Dispersive X-Ray Fluorescence (EDXRF) methodology with portable equipment was employed. This work presents metal concentration results for water samples from two points in the city of Londrina. The analyses were performed in situ, measuring in natura water and samples pre-concentrated on membranes. The setup consisted of a portable X-ray tube to excite the samples and a Si-PIN detector with standard data acquisition electronics to register the spectra. The samples were filtered through membranes to retain suspended particulate matter; the APDC precipitation methodology was then applied for sample pre-concentration, with subsequent filtering through membranes. For in natura samples, total iron concentrations of 254 ± 30 mg L⁻¹ in the Capivara River and 63 ± 9 mg L⁻¹ at Igapo Lake were found. For membrane measurements, the results for suspended particulate matter in the Capivara River were, in mg L⁻¹: 31.0 ± 2.5 (Fe), 0.17 ± 0.03 (Cu) and 0.93 ± 0.08 (Pb); dissolved iron was 0.038 ± 0.004. At Igapo Lake only Fe was quantified: 1.66 ± 0.19 mg L⁻¹ for suspended particulate iron and 0.79 ± 0.11 mg L⁻¹ for dissolved iron. In 4 h of fieldwork it was possible to filter 14 membranes and measure around 16 samples. The performance of the equipment was very good and the results are satisfactory for in situ measurements employing a portable instrument. (author)
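    The arithmetic behind membrane pre-concentration is worth making explicit: the analyte dispersed in a large filtered volume ends up deposited on a small membrane, so the spectrometer sees a far higher surface density, and the original water concentration is back-calculated from the recovered mass. A back-of-envelope sketch, with a hypothetical membrane mass and filtered volume (the 38 µg / 1 L pairing is chosen only so the result matches the dissolved-iron magnitude reported above, not taken from the paper):

    # Back-calculation behind membrane pre-concentration for EDXRF:
    # analyte mass recovered on the membrane divided by the water volume
    # filtered through it gives the original concentration.
    # Numbers are invented for illustration.

    def concentration_in_water(mass_on_membrane_ug, volume_filtered_L):
        """Water concentration (ug/L) back-calculated from the membrane."""
        return mass_on_membrane_ug / volume_filtered_L

    mass_fe_ug = 38.0   # Fe recovered on one membrane, quantified by EDXRF
    volume_L = 1.0      # water volume filtered through that membrane
    conc = concentration_in_water(mass_fe_ug, volume_L)
    print(f"dissolved Fe: {conc:.1f} ug/L ({conc/1000:.3f} mg/L)")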

  12. Improved measurement for mothers, newborns and children in the era of the Sustainable Development Goals.

    Science.gov (United States)

    Marchant, Tanya; Bryce, Jennifer; Victora, Cesar; Moran, Allisyn C; Claeson, Mariam; Requejo, Jennifer; Amouzou, Agbessi; Walker, Neff; Boerma, Ties; Grove, John

    2016-06-01

    An urgent priority in maternal, newborn and child health is to accelerate the scale-up of cost-effective essential interventions, especially during labor, the immediate postnatal period and for the treatment of serious infectious diseases and acute malnutrition. Tracking intervention coverage is a key activity to support scale-up, and in this paper we examine priorities in coverage measurement, distinguishing between essential interventions that can be measured now and those that require methodological development. We conceptualized a typology of indicators related to intervention coverage that distinguishes access to care from receipt of an intervention by the population in need. We then built on documented evidence on coverage measurement to determine the status of indicators for essential interventions and to identify areas for development. Contact indicators from pregnancy to childhood were identified as current indicators for immediate use, but indicators reflecting the quality of care provided during these contacts need development. At each contact point, some essential interventions can be measured now, but the need for development of indicators predominates around interventions at the time of birth and interventions to treat infections. Addressing this need requires improvements in routine facility-based data capture, methods for linking provider and community-based data, and improved guidance for effective coverage measurement that reflects the provision of high-quality care. Coverage indicators for some essential interventions can be measured accurately through household surveys and be used to track progress in maternal, newborn and child health. Other essential interventions currently rely on contact indicators as proxies for coverage, but urgent attention is needed to identify new measurement approaches that directly and reliably measure their effective coverage.
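    The distinction the authors draw between a contact indicator and effective coverage can be made concrete: effective coverage discounts the contact rate by the probability that the contact actually delivered the intervention at adequate quality. A toy sketch, with invented survey numbers and an invented quality measure (the indicator names here are illustrative, not the paper's proposed set):

    # Toy sketch of contact vs. effective coverage.
    # All indicator names and numbers are invented for illustration.

    def coverage(received, in_need):
        """Crude coverage: share of the population in need that was reached."""
        return received / in_need

    def effective_coverage(contact_rate, quality_rate):
        """Discount a contact indicator by the probability that the contact
        delivered the intervention at adequate quality."""
        return contact_rate * quality_rate

    anc4_contact = coverage(received=720, in_need=1000)  # 4+ antenatal visits
    quality = 0.62  # hypothetical share of visits meeting a quality standard
    print(f"contact coverage:   {anc4_contact:.0%}")
    print(f"effective coverage: {effective_coverage(anc4_contact, quality):.0%}")

    The gap between the two numbers is exactly the measurement problem the paper flags: household surveys capture the contact well, while the quality term currently lacks reliable data sources.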

  13. Methodologically controlled variations in laboratory and field pH measurements in waterlogged soils

    DEFF Research Database (Denmark)

    Elberling, Bo; Matthiesen, Henning

    2007-01-01

    ...artefacts is critical, though the study includes agricultural and forest soils for comparison. At a waterlogged site, laboratory results were compared with three different field methods: calomel pH probes inserted into the soil from pits, pH measurements of soil solution extracted from the soil, and pH profiles obtained with a solid-state pH electrode pushed into the soil from the surface. Comparisons between in situ and laboratory methods revealed differences of more than 1 pH unit. The content of dissolved ions in soil solution and field observations of O2 and CO2 concentrations were used in the speciation model PHREEQE in order to predict gas exchange processes. Changes in pH of soil solution following equilibration in the laboratory could be explained mainly by CO2 degassing. Only soil pH measured in situ, using either calomel or solid-state probes inserted directly into the soil, was not affected by gas exchange...
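    The CO2-degassing effect that the authors model with PHREEQE can be illustrated with first-order carbonate chemistry: for dilute water in equilibrium with a CO2 partial pressure, [H+] ≈ sqrt(KH·K1·pCO2), so letting a soil solution degas from an elevated in situ pCO2 down to atmospheric levels raises its pH. A rough back-of-envelope sketch (standard constants at 25 °C, an assumed soil pCO2, and emphatically not the authors' full speciation calculation):

    # Rough illustration of why CO2 degassing raises measured pH:
    # for dilute water in equilibrium with CO2, [H+] ~ sqrt(KH * K1 * pCO2).
    # A back-of-envelope sketch, not the authors' PHREEQE speciation.
    import math

    K_H = 3.3e-2   # mol L-1 atm-1, Henry's constant for CO2 at 25 C
    K_1 = 4.45e-7  # first dissociation constant of carbonic acid at 25 C

    def ph_at_pco2(pco2_atm):
        """Approximate pH of dilute water equilibrated with a CO2 pressure."""
        h = math.sqrt(K_H * K_1 * pco2_atm)
        return -math.log10(h)

    ph_soil = ph_at_pco2(0.05)    # elevated pCO2, assumed for a wet soil
    ph_lab = ph_at_pco2(4.0e-4)   # after degassing to atmospheric pCO2
    print(f"in situ pH ~ {ph_soil:.2f}, after degassing ~ {ph_lab:.2f}")
    print(f"shift: {ph_lab - ph_soil:+.2f} pH units")

    Even this crude model produces a shift of about one pH unit, the same order as the field-versus-laboratory differences reported above.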

  14. The organizational stress measure: an integrated methodology for assessing job-stress and targeting organizational interventions.

    Science.gov (United States)

    Spurgeon, Peter; Mazelan, Patti; Barwell, Fred

    2012-02-01

    This paper briefly describes the Organizational Stress Measure (OSM), which was developed over a decade ago and has evolved into a well-established practical method, not only for assessing wellbeing at work but also as a cost-effective strategy for tackling workplace stress. The OSM measures perceived organizational pressures and felt individual strains within the same instrument, and provides a rich and subtle picture of both the organizational culture and the personal perspectives of the constituent staff groups. Many types of organizational pressure may impact the wellbeing and potential effectiveness of staff, including skill shortages, ineffective strategic planning and poor leadership, and these frequently result in reduced performance, absenteeism, high turnover and poor staff morale. These pressures may increase the probability of some staff reacting negatively, and research with the OSM has shown that increased levels of strain in small clusters of staff may be a leading indicator of future organizational problems. One of the main benefits of using the OSM is the ability to identify 'hot-spots', where organizational pressures are triggering high levels of personal strain in susceptible clusters of staff. In this way, the OSM may act as an 'early warning alarm' for potential organizational problems.
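    The 'hot-spot' logic lends itself to a simple sketch: average strain scores within each staff cluster and flag clusters sitting well above the organization-wide mean. The scores, cluster names, and one-standard-deviation threshold below are all invented; the OSM's actual scales and scoring rules are those of the published instrument, not shown here.

    # Toy 'hot-spot' screen over strain scores grouped by staff cluster.
    # Scores, groupings and threshold are invented for illustration.
    from statistics import mean, stdev

    strain_by_cluster = {
        "ward_nurses": [3.8, 4.1, 4.4, 3.9, 4.2],
        "admin":       [2.1, 2.4, 2.0, 2.6],
        "junior_docs": [4.6, 4.9, 4.4, 4.8],
        "porters":     [2.8, 3.0, 2.5],
    }

    all_scores = [s for scores in strain_by_cluster.values() for s in scores]
    grand_mean, spread = mean(all_scores), stdev(all_scores)

    # Flag clusters more than one standard deviation above the overall mean.
    for cluster, scores in strain_by_cluster.items():
        m = mean(scores)
        flag = "HOT-SPOT" if m > grand_mean + spread else "ok"
        print(f"{cluster:12s} mean strain {m:.2f}  {flag}")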

  15. Practical appraisal of sustainable development-Methodologies for sustainability measurement at settlement level

    International Nuclear Information System (INIS)

    Moles, Richard; Foley, Walter; Morrissey, John; O'Regan, Bernadette

    2008-01-01

    This paper investigates the relationships between settlement size, functionality, geographic location and sustainable development. Analysis was carried out on a sample of 79 Irish settlements located in three regional clusters. Two methods were selected to model the level of sustainability achieved in settlements: Metabolism Accounting and Modelling of Material and Energy Flows (MA), and Sustainable Development Index Modelling. MA is a systematic assessment of the flows and stocks of material within a system defined in space and time. The metabolism of most settlements is essentially linear, with resources flowing through the urban system. The objective of this research on material and energy flows was to provide information that might aid the development of a more circular pattern of urban metabolism, vital to sustainable development. In addition to MA, a set of forty indicators was identified and developed, targeting important aspects of sustainable development: transport, environmental quality, equity and quality-of-life issues. Sustainability indices were derived through aggregation of the indicators to measure dimensions of sustainable development. Similar relationships between settlement attributes and sustainability were found with both methods, and the two measures were subsequently integrated into a single one. The analysis identified those attributes of settlements preventing, impeding or promoting progress towards sustainability.
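    The index-construction step, aggregating normalized indicators into dimension scores and then a composite, can be sketched as follows. The indicator names, values, min-max normalization, and equal weighting are illustrative choices, not the authors' exact scheme.

    # Sketch of sustainability-index aggregation: min-max normalize each
    # indicator, average within dimensions, then average the dimensions
    # into a composite. Indicators and values are illustrative only.

    def minmax(value, lo, hi, higher_is_better=True):
        """Scale an indicator to [0, 1] across the observed range."""
        score = (value - lo) / (hi - lo)
        return score if higher_is_better else 1.0 - score

    settlement = {  # (value, observed min, observed max, higher_is_better)
        "transport": {
            "bus_services_per_day": (12, 0, 40, True),
            "car_trips_share":      (0.72, 0.30, 0.95, False),
        },
        "environment": {
            "waste_recycled_share": (0.34, 0.05, 0.60, True),
        },
    }

    dim_scores = {
        dim: sum(minmax(*spec) for spec in inds.values()) / len(inds)
        for dim, inds in settlement.items()
    }
    composite = sum(dim_scores.values()) / len(dim_scores)
    print(dim_scores, f"composite index: {composite:.2f}")

    Running the same aggregation across all 79 settlements would then allow the index to be related to settlement size, functionality and location, as the paper does.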

  16. Theoretical and methodological approaches to the problem of students' health in algorithms of recreation measures.

    Directory of Open Access Journals (Sweden)

    Zaytzev V.P.

    2011-01-01

    Full Text Available This article discusses health and its basic constituents: physical, psychological and social. It describes the physical development of a person and their physical preparedness, physical form and fitness, physical activity and functional readiness. The opinions and views of scientists, teachers and physicians on defining the health of a person, including the student, are presented, and all of these aspects are considered from the standpoint of recreation measures. Definitions of recreation, physical recreation and other concepts of recreation systems are given. Historical information is provided both on definitions of health and recreation and on the participation of the higher educational establishments of physical culture of Ukraine, Russia and Poland working on this problem in defining health and recreation.

  17. Improving Scene Classifications with Combined Active/Passive Measurements

    Science.gov (United States)

    Hu, Y.; Rodier, S.; Vaughan, M.; McGill, M.

    The uncertainties in cloud and aerosol physical properties derived from passive instruments such as MODIS are not insignificant, and the uncertainty increases as optical depth decreases. Lidar observations do much better for thin clouds and aerosols. Unfortunately, space-based lidar measurements, such as the one onboard the CALIPSO satellite, are limited to the nadir view only and thus have limited spatial coverage. To produce climatologically meaningful thin-cloud and aerosol data products, it is necessary to combine the spatial coverage of MODIS with the highly sensitive CALIPSO lidar measurements. Can we improve the quality of cloud and aerosol remote sensing data products by extending the knowledge about thin clouds and aerosols learned from CALIPSO-type lidar measurements to a larger portion of the off-nadir, MODIS-like multi-spectral pixels? To answer this question, we studied collocated Cloud Physics Lidar (CPL) and MODIS Airborne Simulator (MAS) observations and established an effective data fusion technique that will be applied in the combined CALIPSO/MODIS cloud and aerosol product algorithms. This technique performs k-means and Kohonen self-organizing map cluster analyses on the entire swath of MAS data as well as on the combined CPL/MAS data along the nadir track. Interestingly, the clusters generated from the two approaches are almost identical, indicating that the MAS multi-spectral data may have already captured most of the cloud and aerosol scene types, such as cloud...
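    The cluster comparison described above can be sketched with scikit-learn: run k-means once on the multi-spectral channels alone and once on those channels augmented with a lidar-derived feature, then compare the two labelings. This sketch uses only k-means (not the Kohonen self-organizing map also mentioned), synthetic data in place of the MAS swath and CPL track, an arbitrary k=6, and the adjusted Rand index as one reasonable agreement metric.

    # Sketch of the scene-clustering comparison: k-means on multi-spectral
    # features alone vs. the same features plus a lidar-derived feature.
    # Synthetic data and k=6 stand in for the MAS swath and CPL track.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import adjusted_rand_score

    rng = np.random.default_rng(1)
    n_pixels, n_bands = 5000, 8
    spectral = rng.normal(size=(n_pixels, n_bands))      # MAS-like radiances
    lidar = (spectral[:, :2].sum(axis=1, keepdims=True)  # CPL-like feature,
             + rng.normal(scale=0.1, size=(n_pixels, 1)))  # tied to spectra

    km_spec = KMeans(n_clusters=6, n_init=10, random_state=0).fit(spectral)
    km_both = KMeans(n_clusters=6, n_init=10, random_state=0).fit(
        np.hstack([spectral, lidar]))

    # High agreement between the two labelings would suggest the spectral
    # data alone already separate the scene types the lidar sees -- the
    # paper's qualitative finding.
    print("cluster agreement (ARI):",
          adjusted_rand_score(km_spec.labels_, km_both.labels_))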