WorldWideScience

Sample records for measurement process development

  1. MEASUREMENT PROCESS OF SOFTWARE DEVELOPMENT PROJECTS FOR SUPPORTING STRATEGIC BUSINESS OBJECTIVES IN SOFTWARE DEVELOPING COMPANIES

    Directory of Open Access Journals (Sweden)

    Sandra Lais Pedroso

    2013-08-01

    Software developing companies work in a competitive market and are often challenged to make business decisions that affect their competitiveness. Models that assess the maturity and quality of software development processes, such as CMMI and MPS-BR, comprise process measurement systems (PMS). However, these models are not necessarily suitable for supporting business decisions or for achieving strategic goals. The objective of this work is to analyze how the PMS of software development projects could support business strategies in software developing companies. The results show that measurements produced by maturity-model PMS can be suited to help evaluate operating capabilities and support strategic business decisions.

  2. Developing a lean measurement system to enhance process improvement

    Directory of Open Access Journals (Sweden)

    Lewis P.

    2013-01-01

    A key ingredient to underpin process improvement is a robust, reliable, repeatable measurement system. Process improvement activity needs to be supported by accurate and precise data because effective decision making within process improvement activity demands the use of “hard” data. One of the oldest and most established process improvement methods is Deming’s Plan-Do-Check-Act (PDCA) cycle, which relies on the check phase, a measurement activity where data are gathered and evaluated. Recent expansions of the PDCA, such as the Six Sigma Define-Measure-Analyse-Improve-Control (DMAIC) methodology, place significant importance upon measurement. The DMAIC cycle incorporates the regimented requirement for the inclusion of measurement system analysis (MSA) in the breakthrough strategy. The call for MSA within the DMAIC cycle is to provide the improvement activity with a robust measurement system that will ensure a pertinent level of data during any validation process. The Lean methodology is heavily centred on the removal of the seven Mudas (wastes) from a manufacturing process: defects, overproduction, transportation, waiting, inventory, motion and processing. The application of lean, particularly within the manufacturing industry, has led to a perception that measurement is a waste within a manufacturing process because measurement processes identify defective products. The metrologists’ pursuit of measurement excellence could be construed as a hindrance by the “cost down” demands arising from the same organisation’s lean policy. So what possible benefits does enforcing the regimes of the lean and quality philosophies upon the measurement process have, and how does this ultimately enhance the process improvement activity? The key fundamental to embed with any process improvement is the removal of waste. The process improvement techniques embedded within lean and quality concepts are extremely powerful practices in the…

  3. Measuring Science Inquiry Skills in Youth Development Programs: The Science Process Skills Inventory

    Directory of Open Access Journals (Sweden)

    Mary E. Arnold

    2013-03-01

    In recent years there has been an increased emphasis on science learning in 4-H and other youth development programs. In an effort to increase science capacity in youth, it is easy to focus only on developing the concrete skills and knowledge that a trained scientist must possess. However, when science learning is presented in a youth-development setting, the context of the program also matters. This paper reports the development and testing of the Science Process Skills Inventory (SPSI) and its usefulness for measuring science inquiry skill development in youth development science programs. The results of the psychometric testing of the SPSI indicated the instrument is reliable and measures a cohesive construct called science process skills, as reflected in the 11 items that make up this group of skills. The 11 items themselves are based on the cycle of science inquiry, and represent the important steps of the complete inquiry process.

  4. Development of Industrial Process Diagnosis and Measurement Technology

    International Nuclear Information System (INIS)

    Jung, Sung Hee; Kim, Jong Bum; Moon, Jin Ho

    2010-04-01

    Section 1. Industrial Gamma CT Technology for Process Diagnosis: The project aims to develop an industrial process gamma tomography system for investigating structural and physical malfunctions and process media distribution by means of sealed gamma sources and radioactive materials. Section 2. Development of RI Hydraulic Detection Technology for Industrial Application: The objectives of this study are to develop technology for evaluating hydrological characteristics and for hydraulic detection using radioisotopes, and to analyze hydrodynamics and pollutant transport in water environments such as surface and subsurface water. Section 3. Development of RT-PAT System for Powder Process Diagnosis: The objective of this project is to develop a new radiation technology that improves the accuracy of moisture content determination in powder samples using a radiation source, through the so-called RT-PAT (Radiation Technology-Process Analytical Technology), a new concept of converging technology that combines radiation technology with process analytical technology.

  5. Development of industrial process diagnosis and measurement technology

    International Nuclear Information System (INIS)

    Jung, Sunghee; Kim, Jongbum; Moon, Jinho; Suh, Kyungsuk; Kim, Jongyun

    2012-04-01

    Section 1. Industrial Gamma CT Technology for Process Diagnosis: The project aims to develop an industrial process gamma tomography system for investigating structural and physical malfunctions and process media distribution by means of sealed gamma sources and radioactive materials. Section 2. Development of RI Hydraulic Detection Technology for Industrial Application: The objectives of this study are to develop technology for evaluating hydrological characteristics and for hydraulic detection using radioisotopes, and to analyze hydrodynamics and pollutant transport in water environments such as surface and subsurface water. Section 3. Development of RT-PAT System for Powder Process Diagnosis: The objective of this project is to develop a new radiation technology that improves the accuracy of moisture content determination in powder samples using a radiation source, through the so-called RT-PAT (Radiation Technology-Process Analytical Technology), a new concept of converging technology that combines radiation technology with process analytical technology.

  6. Process development

    Energy Technology Data Exchange (ETDEWEB)

    Schuegerl, K

    1984-01-01

    The item 'process development' comprises the production of acetone/butanol with C. acetobutylicum and the fermentation of potato waste. The target is to increase productivity by taking the following measures: optimization of media, on-line process analysis, reaction analysis, mathematical modelling and parameter identification, process simulation, development of a state estimator based on the on-line process analysis and the model, and optimization and adaptive control.

  7. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.

  8. Developing an instrument to measure consumers' multimedia usage in the purchase process

    NARCIS (Netherlands)

    Voorveld, H.A.M.; Bronner, F.E.; Neijens, P.C.; Smit, E.G.

    2013-01-01

    This article presents a new tool that measures consumers' multimedia behavior in the purchase process. Two variants of the tool were developed that differ in their starting point; one originates with media usage and the other with product purchase. The first variant starts with questions about

  9. Measuring health care process quality with software quality measures.

    Science.gov (United States)

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on specific diseases, clinics or clinical areas. Although they contain structure, process, or output type measures, there is no model that measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally and externally. To address these problems, a new model was developed from software quality measures. We adopted the ISO/IEC 9126 software quality standard for health care processes. Then, JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements were added to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model determines weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.

  10. Expert System Development on On-line Measurement of Sewage Treatment Based Process

    Directory of Open Access Journals (Sweden)

    Jianjun QIN

    2014-02-01

    This article puts forward a solution in which on-line automatic instrument measurement and an expert system are combined and optimized to cope with the complexity and strong dynamics of the sewage treatment process. First, the treatment process was modelled and integrated into the computer software environment. Second, a number of automatic water quality monitoring instruments and sensor probes were installed in the reaction tanks according to the needs of process changes and management; the acquired data can be displayed and recorded in real time. A human-machine integrated expert system featuring computer automation management was developed for the site in a one-off manner, realizing intelligent and unmanned management. The advantages it brings can compensate for the inexperience of on-site management personnel and resolve the contradiction between water quality dynamics and the difficulty of process adjustment.

  11. Measuring process and knowledge consistency

    DEFF Research Database (Denmark)

    Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders

    2007-01-01

    When implementing configuration systems, knowledge about products and processes is documented and replicated in the configuration system. This practice assumes that products are specified consistently, i.e., on the same rule base, and likewise for processes. However, consistency cannot be taken for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire…

  12. Development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements

    Science.gov (United States)

    Rey, Charles A.

    1991-03-01

    The development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements are discussed. Efforts were directed towards the following task areas: design and development of a High Temperature Acoustic Levitator (HAL) for containerless processing and property measurements at high temperatures; testing of the HAL module to establish this technology for use as a positioning device for microgravity uses; construction and evaluation of a brassboard hot wall Acoustic Levitation Furnace; construction and evaluation of a noncontact temperature measurement (NCTM) system based on AGEMA thermal imaging camera; construction of a prototype Division of Amplitude Polarimetric Pyrometer for NCTM of levitated specimens; evaluation of and recommendations for techniques to control contamination in containerless materials processing chambers; and evaluation of techniques for heating specimens to high temperatures for containerless materials experimentation.

  13. Development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements

    Science.gov (United States)

    Rey, Charles A.

    1991-01-01

    The development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements are discussed. Efforts were directed towards the following task areas: design and development of a High Temperature Acoustic Levitator (HAL) for containerless processing and property measurements at high temperatures; testing of the HAL module to establish this technology for use as a positioning device for microgravity uses; construction and evaluation of a brassboard hot wall Acoustic Levitation Furnace; construction and evaluation of a noncontact temperature measurement (NCTM) system based on AGEMA thermal imaging camera; construction of a prototype Division of Amplitude Polarimetric Pyrometer for NCTM of levitated specimens; evaluation of and recommendations for techniques to control contamination in containerless materials processing chambers; and evaluation of techniques for heating specimens to high temperatures for containerless materials experimentation.

  14. Green chemistry measures for process research and development

    Energy Technology Data Exchange (ETDEWEB)

    Constable, D.J.C.; Curzons, A.D.; Freitas dos Santos, L.M. (and others)

    2001-07-01

    A set of metrics has been developed which enables a simple assessment to be made of batch processes in terms of waste, energy usage, and chemistry efficiency. It is intended to raise awareness of green chemistry by providing a tool to assist chemists in monitoring progress in the reduction of environmental impact as they design new routes and modify processes. (author)
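
    The abstract does not list the specific metrics, but two of the most widely used green chemistry measures of the kind described are the E-factor (mass of waste per mass of product) and atom economy. The sketch below is purely illustrative, uses hypothetical numbers, and does not reproduce the authors' metric set.

    ```python
    # Illustrative green-chemistry metrics (hypothetical, not the authors' metric set).

    def e_factor(total_input_mass_kg: float, product_mass_kg: float) -> float:
        """E-factor: kilograms of waste generated per kilogram of product."""
        return (total_input_mass_kg - product_mass_kg) / product_mass_kg

    def atom_economy(product_mw: float, reactant_mws: list[float]) -> float:
        """Fraction of reactant mass (stoichiometric basis) ending up in the product."""
        return product_mw / sum(reactant_mws)

    # Hypothetical batch: 120 kg of raw materials and solvents yield 30 kg of product
    print(f"E-factor: {e_factor(120.0, 30.0):.1f} kg waste / kg product")   # 3.0
    # Hypothetical reaction: product MW 180 g/mol from reactants of MW 120 and 98 g/mol
    print(f"Atom economy: {atom_economy(180.0, [120.0, 98.0]):.0%}")        # ~83%
    ```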

  15. Collaborative measurement development as a tool in CBPR: measurement development and adaptation within the cultures of communities.

    Science.gov (United States)

    Gonzalez, John; Trickett, Edison J

    2014-09-01

    This paper describes the processes we engaged in to develop a measurement protocol used to assess the outcomes of a community-based suicide and alcohol abuse prevention project with two Alaska Native communities. While the literature on community-based participatory research (CBPR) is substantial regarding the importance of collaborations, few studies have reported on this collaboration in the process of developing measures to assess CBPR projects. We first tell the story of the processes around the standard issues of doing cross-cultural work on measurement development related to areas of equivalence. A second story highlights how community differences within the same cultural group can affect both the process and the content of culturally relevant measurement selection, adaptation, and development.

  16. Performance Measurement in Global Product Development

    DEFF Research Database (Denmark)

    Taylor, Thomas Paul; Ahmed-Kristensen, Saeema

    2013-01-01

    There is a requirement for the process to be monitored and measured relative to the business strategy of an organisation. It was found that performance measurement is a process that helps achieve sustainable business success, encouraging a learning culture within organisations. To this day, much of the research into how performance is measured has focussed on the process of product development. However, performance measurement related to global product development is relatively unexplored and a need for further research is evident. This paper contributes towards understanding how performance is measured…

  17. Empirical evaluation of the Process Overview Measure for assessing situation awareness in process plants.

    Science.gov (United States)

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-03-01

    The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) from monitoring process plants. A companion paper describes how the measure has been developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback of participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected on process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) from monitoring process plants in representative settings.

  18. The Development of Advanced Processing and Analysis Algorithms for Improved Neutron Multiplicity Measurements

    International Nuclear Information System (INIS)

    Santi, P.; Favalli, A.; Hauck, D.; Henzl, V.; Henzlova, D.; Ianakiev, K.; Iliev, M.; Swinhoe, M.; Croft, S.; Worrall, L.

    2015-01-01

    One of the most distinctive and informative signatures of special nuclear materials is the emission of correlated neutrons from either spontaneous or induced fission. Because the emission of correlated neutrons is a unique and unmistakable signature of nuclear materials, the ability to effectively detect, process, and analyze these emissions will continue to play a vital role in the non-proliferation, safeguards, and security missions. While currently deployed neutron measurement techniques based on 3He proportional counter technology, such as neutron coincidence and multiplicity counters currently used by the International Atomic Energy Agency, have proven to be effective over the past several decades for a wide range of measurement needs, a number of technical and practical limitations exist in continuing to apply this technique to future measurement needs. In many cases, those limitations exist within the algorithms that are used to process and analyze the detected signals from these counters that were initially developed approximately 20 years ago based on the technology and computing power that was available at that time. Over the past three years, an effort has been undertaken to address the general shortcomings in these algorithms by developing new algorithms that are based on fundamental physics principles that should lead to the development of more sensitive neutron non-destructive assay instrumentation. Through this effort, a number of advancements have been made in correcting incoming data for electronic dead time, connecting the two main types of analysis techniques used to quantify the data (Shift register analysis and Feynman variance to mean analysis), and in the underlying physical model, known as the point model, that is used to interpret the data in terms of the characteristic properties of the item being measured. The current status of the testing and evaluation of these advancements in correlated neutron analysis techniques will be discussed
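
    As a hedged illustration of one analysis technique named above, the sketch below computes the Feynman-Y (variance-to-mean excess) statistic from neutron counts recorded in fixed time gates; the gate counts are hypothetical and the sketch is not the advanced algorithm set described in the abstract.

    ```python
    import numpy as np

    def feynman_y(gate_counts: np.ndarray) -> float:
        """Feynman-Y = variance/mean - 1 of counts per time gate.
        Y is 0 for an uncorrelated (Poisson) source; correlated fission
        neutrons give Y > 0."""
        mean = gate_counts.mean()
        variance = gate_counts.var(ddof=1)  # sample variance
        return variance / mean - 1.0

    # Hypothetical counts per gate from a multiplicity counter
    counts = np.array([12, 15, 9, 14, 20, 11, 13, 17, 10, 16])
    print(f"Feynman-Y excess variance: {feynman_y(counts):.3f}")
    ```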

  19. Development of clinical process measures for pediatric burn care: Understanding variation in practice patterns.

    Science.gov (United States)

    Kazis, Lewis E; Sheridan, Robert L; Shapiro, Gabriel D; Lee, Austin F; Liang, Matthew H; Ryan, Colleen M; Schneider, Jeffrey C; Lydon, Martha; Soley-Bori, Marina; Sonis, Lily A; Dore, Emily C; Palmieri, Tina; Herndon, David; Meyer, Walter; Warner, Petra; Kagan, Richard; Stoddard, Frederick J; Murphy, Michael; Tompkins, Ronald G

    2018-04-01

    There has been little systematic examination of variation in pediatric burn care clinical practices and its effect on outcomes. As a first step, current clinical care processes need to be operationally defined. The highly specialized burn care units of the Shriners Hospitals for Children system present an opportunity to describe the processes of care. The aim of this study was to develop a set of process-based measures for pediatric burn care and examine adherence to them by providers in a cohort of pediatric burn patients. We conducted a systematic literature review to compile a set of process-based indicators. These measures were refined by an expert panel of burn care providers, yielding 36 process-based indicators in four clinical areas: initial evaluation and resuscitation, acute excisional surgery and critical care, psychosocial and pain control, and reconstruction and aftercare. We assessed variability in adherence to the indicators in a cohort of 1,076 children with burns at four regional pediatric burn programs in the Shriners Hospital system. The percentages of the cohort at each of the four sites were as follows: Boston, 20.8%; Cincinnati, 21.1%; Galveston, 36.0%; and Sacramento, 22.1%. The cohort included children who received care between 2006 and 2010. Adherence to the process indicators varied both across sites and by clinical area. Adherence was lowest for the clinical areas of acute excisional surgery and critical care, with a range of 35% to 48% across sites, followed by initial evaluation and resuscitation (range, 34%-60%). In contrast, the clinical areas of psychosocial and pain control and reconstruction and aftercare had relatively high adherence across sites, with ranges of 62% to 93% and 71% to 87%, respectively. Of the 36 process indicators, 89% differed significantly in adherence between clinical sites. The development of these process-based measures represents an important step in the assessment of clinical practice in pediatric burn care. Substantial variation was observed…

  20. Accreditation and the Development of Process Performance Measures

    DEFF Research Database (Denmark)

    Bie Bogh, Søren

    Accreditation is an external review process used to assess how well an organisation performs relative to established standards. Accreditation provides a framework for continuous quality improvement, and health services worldwide embrace accreditation and use it as a strategy to improve quality… on quality of care using nationwide quantitative designs aimed at detecting changes over time in hospital performance in relation to both voluntary (Study 1) and mandatory accreditation (Study 2). Further, a qualitative study (Study 3) was conducted to complement the findings in Study 2. To examine… was used to examine the mandatory accreditation programme. The quantitative study was a multilevel, longitudinal, stepped-wedge, nationwide study of process performance measures based on data from patients admitted for acute stroke, heart failure, ulcer, diabetes, breast cancer and lung cancer…

  1. Process for measuring residual stresses

    International Nuclear Information System (INIS)

    Elfinger, F.X.; Peiter, A.; Theiner, W.A.; Stuecker, E.

    1982-01-01

    No single process can at present solve all problems. The completely destructive processes have only a limited field of application, as the component cannot be reused; however, they are essential for the basic determination of stress distributions in research and development. Destructive and non-destructive processes are mainly used if investigations have to be carried out on original components. With increasing component size, the share of destructive tests becomes smaller. The main applications are quality assurance, testing of manufactured parts, and characterization of components. Among the non-destructive test procedures, the X-ray method has been developed furthest; it gives residual stresses on the surface and in surface layers near the edges. Further development is desirable both in evaluation and in measuring technique. Ultrasonic and magnetic crack detection processes are at present mainly used in research and development, and also in quality assurance. Because of the variable depth of penetration and the possibility of automation, they are gaining in importance. (orig./RW) [de

  2. An alternative method to achieve metrological confirmation in measurement process

    Science.gov (United States)

    Villeta, M.; Rubio, E. M.; Sanz, A.; Sevilla, L.

    2012-04-01

    A metrological confirmation process must be designed and implemented to ensure that the metrological characteristics of the measurement system meet the metrological requirements of the measurement process. The aim of this paper is to present an alternative to the traditional metrological requirements on the relationship between tolerance and measurement uncertainty for developing such confirmation processes. The proposed approach to metrological confirmation considers a given inspection task of the measurement process within the manufacturing system, and it is based on the Index of Contamination of the Capability (ICC). The metrological confirmation process is then developed taking into account producer risks and economic considerations on this index. As a consequence, depending on the capability of the manufacturing process, the measurement system will or will not be in an adequate state of metrological confirmation for the measurement process.
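
    For reference, the traditional metrological requirement that the authors contrast with is commonly expressed as a minimum ratio between the specification tolerance and the expanded measurement uncertainty; the sketch below illustrates that conventional check under an assumed 4:1 ratio and is not the ICC-based method proposed in the paper.

    ```python
    def metrologically_confirmed(tolerance: float, expanded_uncertainty: float,
                                 min_ratio: float = 4.0) -> bool:
        """Conventional rule of thumb: the tolerance band should be at least
        `min_ratio` times the expanded uncertainty U of the measurement system."""
        return tolerance / expanded_uncertainty >= min_ratio

    # Hypothetical inspection task: tolerance band 0.10 mm, U = 0.02 mm (ratio 5:1)
    print(metrologically_confirmed(tolerance=0.10, expanded_uncertainty=0.02))  # True
    ```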

  3. Software Process Validation: Quantitatively Measuring the Correspondence of a Process to a Model

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1997-01-01

    When process models and process executions diverge, something significant is happening. The authors have developed techniques for uncovering and measuring the discrepancies between models and executions, which they call process validation…
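
    A hedged sketch of the general idea of quantifying model/execution correspondence: treat the recorded event trace and a model-prescribed trace as sequences and use an edit distance as the discrepancy measure. The event names and traces below are hypothetical, and this illustrates the idea rather than Cook and Wolf's actual validation metrics.

    ```python
    def edit_distance(a: list[str], b: list[str]) -> int:
        """Levenshtein distance between two event sequences (rolling 1-D table)."""
        dp = list(range(len(b) + 1))
        for i, ea in enumerate(a, 1):
            prev, dp[0] = dp[0], i
            for j, eb in enumerate(b, 1):
                prev, dp[j] = dp[j], min(dp[j] + 1,           # delete ea
                                         dp[j - 1] + 1,       # insert eb
                                         prev + (ea != eb))   # substitute
        return dp[-1]

    # Hypothetical traces: model-prescribed events vs. actually executed events
    model_trace    = ["design", "review", "code", "test", "release"]
    executed_trace = ["design", "code", "test", "fix", "test", "release"]
    d = edit_distance(model_trace, executed_trace)
    print(f"discrepancy: {d} edits "
          f"(normalized {d / max(len(model_trace), len(executed_trace)):.2f})")
    ```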

  4. Validation of measured friction by process tests

    DEFF Research Database (Denmark)

    Eriksen, Morten; Henningsen, Poul; Tan, Xincai

    The objective of sub-task 3.3 is to evaluate, under actual process conditions, the friction formulations determined by simulative testing. For task 3.3 the following tests have been used according to the original project plan: 1. standard ring test and 2. double cup extrusion test. The task has, however, been extended to include a number of newly developed process tests: 3. forward rod extrusion test, 4. special ring test at low normal pressure, and 5. spike test (especially developed for warm and hot forging). Validation of the measured friction values in cold forming from sub-task 3.1 has been made with forward rod extrusion, and very good agreement was obtained between the measured friction values in simulative testing and process testing.

  5. Human Factors in Software Development Processes: Measuring System Quality

    DEFF Research Database (Denmark)

    Abrahão, Silvia; Baldassarre, Maria Teresa; Caivano, Danilo

    2016-01-01

    Software Engineering and Human-Computer Interaction look at the development process from different perspectives. They apparently use very different approaches, are inspired by different principles and address different needs. But they definitely have the same goal: to develop high quality software in the most effective way. The second edition of the workshop pays particular attention to the efforts of the two communities in enhancing system quality. The research question discussed is: who, what, where, when, why, and how should we evaluate?

  6. Automated measurement of pressure injury through image processing.

    Science.gov (United States)

    Li, Dan; Mathews, Carol

    2017-11-01

    To develop an image processing algorithm to automatically measure pressure injuries using electronic pressure injury images stored in nursing documentation. Photographing pressure injuries and storing the images in the electronic health record is standard practice in many hospitals. However, the manual measurement of pressure injury is time-consuming, challenging and subject to intra/inter-reader variability with complexities of the pressure injury and the clinical environment. A cross-sectional algorithm development study. A set of 32 pressure injury images were obtained from a western Pennsylvania hospital. First, we transformed the images from an RGB (i.e. red, green and blue) colour space to a YCbCr colour space to eliminate interference from varying light conditions and skin colours. Second, a probability map, generated by a skin colour Gaussian model, guided the pressure injury segmentation process using the Support Vector Machine classifier. Third, after segmentation, the reference ruler - included in each of the images - enabled perspective transformation and determination of pressure injury size. Finally, two nurses independently measured those 32 pressure injury images, and the intraclass correlation coefficient was calculated. An image processing algorithm was developed to automatically measure the size of pressure injuries. Both inter- and intra-rater analysis achieved a good level of reliability. Validation of the size measurement of the pressure injury (1) demonstrates that our image processing algorithm is a reliable approach to monitoring pressure injury progress through clinical pressure injury images and (2) offers new insight into pressure injury evaluation and documentation. Once our algorithm is further developed, clinicians can be provided with an objective, reliable and efficient computational tool for segmentation and measurement of pressure injuries. With this, clinicians will be able to more effectively monitor the healing process of pressure injuries.
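
    A minimal, hypothetical sketch of the measurement idea above (colour-space conversion, pixel classification, ruler-based scaling): it uses fixed YCbCr thresholds in place of the authors' Gaussian skin model and Support Vector Machine classifier, and every threshold and the mm-per-pixel scale are assumptions.

    ```python
    import numpy as np

    def rgb_to_ycbcr(rgb: np.ndarray) -> np.ndarray:
        """Convert an H x W x 3 RGB image (0-255) to YCbCr (ITU-R BT.601)."""
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        y  =       0.299    * r + 0.587    * g + 0.114    * b
        cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
        cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
        return np.stack([y, cb, cr], axis=-1)

    def wound_area_cm2(rgb: np.ndarray, mm_per_px: float) -> float:
        """Classify pixels with placeholder YCbCr thresholds and scale the pixel
        count by the ruler-derived resolution (thresholds are not clinical values)."""
        ycbcr = rgb_to_ycbcr(rgb.astype(np.float64))
        cb, cr = ycbcr[..., 1], ycbcr[..., 2]
        wound_mask = (cr > 150) & (cb < 120)      # hypothetical wound-tissue range
        area_mm2 = wound_mask.sum() * mm_per_px ** 2
        return area_mm2 / 100.0                   # mm^2 -> cm^2

    # Hypothetical usage: a loaded RGB photograph and a scale taken from the ruler
    image = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
    print(f"estimated area: {wound_area_cm2(image, mm_per_px=0.25):.1f} cm^2")
    ```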

  7. Hospital process orientation from an operations management perspective: development of a measurement tool and practical testing in three ophthalmic practices.

    Science.gov (United States)

    Gonçalves, Pedro D; Hagenbeek, Marie Louise; Vissers, Jan M H

    2013-11-13

    Although research interest in hospital process orientation (HPO) is growing, the development of a measurement tool to assess process orientation (PO) has not been very successful yet. To view a hospital as a series of processes organized around patients with a similar demand seems to be an attractive proposition, but it is hard to operationalize this idea in a measurement tool that can actually measure the level of PO. This research contributes to HPO from an operations management (OM) perspective by addressing the alignment, integration and coordination of activities within patient care processes. The objective of this study was to develop and practically test a new measurement tool for assessing the degree of PO within hospitals using existing tools. Through a literature search we identified a number of constructs to measure PO in hospital settings. These constructs were further operationalized, using an OM perspective. Based on five dimensions of an existing questionnaire a new HPO-measurement tool was developed to measure the degree of PO within hospitals on the basis of respondents' perception. The HPO-measurement tool was pre-tested in a non-participating hospital and discussed with experts in a focus group. The multicentre exploratory case study was conducted in the ophthalmic practices of three different types of Dutch hospitals. In total 26 employees from three disciplines participated. After filling in the questionnaire an interview was held with each participant to check the validity and the reliability of the measurement tool. The application of the HPO-measurement tool, analysis of the scores and interviews with the participants resulted in the possibility to identify differences of PO performance and the areas of improvement--from a PO point of view--within each hospital. The result of refinement of the items of the measurement tool after practical testing is a set of 41 items to assess the degree of PO from an OM perspective within hospitals. The

  8. Implementing a Process to Measure Return on Investment for Nursing Professional Development.

    Science.gov (United States)

    Garrison, Elisabeth; Beverage, Jodie

    Return on investment (ROI) is one way to quantify the value that nursing professional development brings to the organization. This article describes a process to begin tracking ROI for nursing professional development. Implementing a process of tracking nursing professional development practitioners' ROI increased awareness of the financial impact and effectiveness of the department.
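
    For reference, the underlying ROI arithmetic is the standard calculation shown below; the figures are hypothetical and the article's specific tracking process is not reproduced here.

    ```python
    def roi_percent(monetary_benefits: float, program_costs: float) -> float:
        """Standard ROI formula: net benefits relative to costs, as a percentage."""
        return (monetary_benefits - program_costs) / program_costs * 100.0

    # Hypothetical example: a $40,000 education program credited with $58,000
    # of benefits (e.g., reduced turnover and error-related costs)
    print(f"ROI: {roi_percent(58_000, 40_000):.0f}%")  # 45%
    ```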

  9. Measuring process performance within healthcare logistics - a decision tool for selecting measuring technologies

    DEFF Research Database (Denmark)

    Feibert, Diana Cordes; Jacobsen, Peter

    2015-01-01

    Performance measurement can support the organization in improving the efficiency and effectiveness of logistical healthcare processes. Selecting the most suitable technologies is important to ensure data validity. A case study of the hospital cleaning process at a public Danish hospital was conducted; monitoring tasks and ascertaining the quality of work are difficult in such a process. Based on principal-agent theory, a set of decision indicators has been developed, and a decision framework for assessing technologies to enable performance measurement has been proposed.

  10. Quark structure functions measured with the Drell-Yan process

    International Nuclear Information System (INIS)

    Garvey, G.T.

    1986-01-01

    The physics relevant to showing that the Drell-Yan process offers the possibility for measuring flavor specific quark momentum distributions of free hadrons and their possible modification in nuclei are presented. The case for flavor specific measurements via use of the Drell-Yan process is developed. 21 refs
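
    For context, a commonly quoted leading-order parton-model expression (not taken from this abstract) shows why the process is flavour specific: the dilepton cross section factorizes into charge-weighted products of quark and antiquark distributions,

    ```latex
    \frac{d^{2}\sigma}{dx_{1}\,dx_{2}}
      = \frac{4\pi\alpha^{2}}{9\,x_{1}x_{2}\,s}
        \sum_{q} e_{q}^{2}
        \left[\, q(x_{1})\,\bar{q}(x_{2}) + \bar{q}(x_{1})\,q(x_{2}) \,\right],
      \qquad M^{2} = x_{1}x_{2}\,s ,
    ```

    where x1 and x2 are the parton momentum fractions in the beam and target hadrons, e_q is the quark charge, and M is the dilepton invariant mass; each term carries its own e_q^2 weight, which is what makes flavour-specific extraction possible.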

  11. The consultation and relational empathy (CARE) measure: development and preliminary validation and reliability of an empathy-based consultation process measure.

    Science.gov (United States)

    Mercer, Stewart W; Maxwell, Margaret; Heaney, David; Watt, Graham Cm

    2004-12-01

    Empathy is a key aspect of the clinical encounter but there is a lack of patient-assessed measures suitable for general clinical settings. Our aim was to develop a consultation process measure based on a broad definition of empathy, which is meaningful to patients irrespective of their socio-economic background. Qualitative and quantitative approaches were used to develop and validate the new measure, which we have called the consultation and relational empathy (CARE) measure. Concurrent validity was assessed by correlational analysis against other validated measures in a series of three pilot studies in general practice (in areas of high or low socio-economic deprivation). Face and content validity was investigated by 43 interviews with patients from both types of areas, and by feedback from GPs and expert researchers in the field. The initial version of the new measure (pilot 1; high deprivation practice) correlated strongly (r = 0.85) with the Reynolds empathy measure (RES) and the Barrett-Lennard empathy subscale (BLESS) (r = 0.63), but had a highly skewed distribution (skew -1.879, kurtosis 3.563). Statistical analysis, and feedback from the 20 patients interviewed, the GPs and the expert researchers, led to a number of modifications. The revised, second version of the CARE measure, tested in an area of low deprivation (pilot 2) also correlated strongly with the established empathy measures (r = 0.84 versus RES and r = 0.77 versus BLESS) but had a less skewed distribution (skew -0.634, kurtosis -0.067). Internal reliability of the revised version was high (Cronbach's alpha 0.92). Patient feedback at interview (n = 13) led to only minor modification. The final version of the CARE measure, tested in pilot 3 (high deprivation practice) confirmed the validation with the other empathy measures (r = 0.85 versus RES and r = 0.84 versus BLESS) and the face validity (feedback from 10 patients). These preliminary results support the validity and reliability of the CARE
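
    The internal reliability figure quoted above (Cronbach's alpha) follows a standard formula; the sketch below is a generic illustration of that calculation on hypothetical item scores, not the authors' analysis.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: respondents x items matrix of scores.
        alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_variances / total_variance)

    # Hypothetical 5-point ratings: 6 respondents x 4 questionnaire items
    scores = np.array([[4, 5, 4, 5],
                       [3, 3, 4, 3],
                       [5, 5, 5, 4],
                       [2, 3, 2, 2],
                       [4, 4, 5, 4],
                       [3, 4, 3, 3]])
    print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
    ```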

  12. Development of a strain measurement method for non-plane specimens by means of computer picture processing

    International Nuclear Information System (INIS)

    Yoshioka, Akira; Soneda, Naoki; Yagawa, Genki; Miyoshi, Akio.

    1988-01-01

    Integrity tests of Fast Breeder Reactor components are often conducted at elevated temperature, say 550 °C. Since high-temperature strain measurement using special strain gauges is costly and inappropriate for large and repeated strains, the authors have developed an optical strain measurement method and system based on computer picture processing and the triangulation principle. The present method makes it possible to measure strain in specimens with curved surfaces. Its operation is also easy because marks are automatically distinguished from noise. Verification tests with a plate specimen and a cylindrical one were performed at elevated temperatures. The results show that the present method is very suitable for tests at elevated temperatures and that the strain measurement error is within 0.2% (2000 με), which is reasonable considering the limitations of the hardware. (author)
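
    A hedged sketch of the two calculations named in the abstract, under simplifying assumptions: triangulation of a mark's 3D position from two rectified (parallel) camera views with known focal length and baseline, followed by engineering strain from the change in distance between two marks. All camera parameters and pixel coordinates are hypothetical, and the authors' actual optical arrangement is not reproduced.

    ```python
    import numpy as np

    def triangulate(x_left: float, x_right: float, y: float,
                    focal_px: float, baseline_mm: float) -> np.ndarray:
        """Depth from disparity for rectified parallel cameras: Z = f*B/(xL - xR)."""
        disparity = x_left - x_right
        z = focal_px * baseline_mm / disparity
        return np.array([x_left * z / focal_px, y * z / focal_px, z])  # [mm]

    def engineering_strain(p1_ref, p2_ref, p1_def, p2_def) -> float:
        """Strain from the change in gauge length between two surface marks."""
        l0 = np.linalg.norm(np.subtract(p2_ref, p1_ref))
        l1 = np.linalg.norm(np.subtract(p2_def, p1_def))
        return (l1 - l0) / l0

    # Hypothetical pixel coordinates of two marks before and after deformation
    f_px, base_mm = 1200.0, 150.0                 # focal length [px], baseline [mm]
    a0 = triangulate(410.0, 380.0, 95.0, f_px, base_mm)
    b0 = triangulate(530.0, 500.5, 96.0, f_px, base_mm)
    a1 = triangulate(409.0, 379.2, 95.0, f_px, base_mm)
    b1 = triangulate(532.0, 502.2, 96.0, f_px, base_mm)
    print(f"strain: {engineering_strain(a0, b0, a1, b1) * 100:.2f} %")
    ```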

  13. The (mis)use of subjective process measures in software engineering

    Science.gov (United States)

    Valett, Jon D.; Condon, Steven E.

    1993-01-01

    A variety of measures are used in software engineering research to develop an understanding of the software process and product. These measures fall into three broad categories: quantitative, characteristics, and subjective. Quantitative measures are those to which a numerical value can be assigned, for example effort or lines of code (LOC). Characteristics describe the software process or product; they might include programming language or the type of application. While such factors do not provide a quantitative measurement of a process or product, they do help characterize them. Subjective measures (as defined in this study) are those that are based on the opinion or opinions of individuals; they are somewhat unique and difficult to quantify. Capturing of subjective measure data typically involves development of some type of scale. For example, 'team experience' is one of the subjective measures that were collected and studied by the Software Engineering Laboratory (SEL). Certainly, team experience could have an impact on the software process or product; actually measuring a team's experience, however, is not a strictly mathematical exercise. Simply adding up each team member's years of experience appears inadequate. In fact, most researchers would agree that 'years' do not directly translate into 'experience.' Team experience must be defined subjectively and then a scale must be developed e.g., high experience versus low experience; or high, medium, low experience; or a different or more granular scale. Using this type of scale, a particular team's overall experience can be compared with that of other teams in the development environment. Defining, collecting, and scaling subjective measures is difficult. First, precise definitions of the measures must be established. Next, choices must be made about whose opinions will be solicited to constitute the data. Finally, care must be given to defining the right scale and level of granularity for measurement.

  14. FY 1998 annual summary report on photon measuring/processing techniques. Development of the techniques for high-efficiency production processes; 1998 nendo foton keisoku kako gijutsu seika hokokusho. Kokoritsu seisan process gijutsu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    The objectives are set to develop the techniques for energy-efficient laser-aided processing; techniques for high-precision, real-time measurement to improve quality control for production processes and increase their efficiency; and the techniques for generating/controlling photon of high efficiency and quality as the laser beam sources therefor, in order to promote energy saving at and improve efficiency of production processes consuming large quantities of energy, e.g., welding, joining, surface treatment and production of fine particles. The R and D themes are microscopic processing technology: simulation technology for laser welding phenomena; microscopic processing technology: synthesis of technology for quantum dot functional structures; in-situ status measuring technology: fine particle elements and size measurement technology; high-power all-solid-state laser technology: efficient rod type LD-pumping laser modules and pumping chamber of a slab-type laser; tightly-focusing all-solid-state laser technology: improvement of E/O efficiency of laser diode, high-quality nonlinear crystal growth technology and fabrication technology for nonlinear crystal; and comprehensive investigation of photonics engineering: high-efficiency harmonic generation technology. (NEDO)

  15. Measurement and Management of the Level of Quality Control Process in SoC (System on Chip Embedded Software Development

    Directory of Open Access Journals (Sweden)

    Ki-Won Song

    2012-04-01

    This paper presents a process for measuring the level of the quality control process in order to ensure the quality of the delivered software package during the development cycle. The success of a project requires three prerequisites, and they constrain one another. Quality is the most important factor for successful project completion; in other words, quality should not be sacrificed for the sake of meeting the cost budget or delivering within schedule. Moreover, the cost caused by any quality issue, such as defect resolution, increases exponentially once the product is out of the door. Having said that, the schedule constraint must also be considered: there is limited time to do a quality job, and the product must be shipped to the market earlier than competitors'. The quality measurement and management concept is therefore introduced to suit the agile software development environment, in conjunction with performance strategies executed within the organization. There are many key performance indexes derivable from the actual data associated with quality control activities, and it is desirable to create a quality process measure that integrally represents the overall level of quality control activities performed while developing the software deliverables. With such a measure, it is possible to evaluate whether enough quality control activities have been performed for the project and to secure the quality of the software deliverables before they are delivered to the customers.

  16. Measuring health literacy in populations: illuminating the design and development process of the European Health Literacy Survey Questionnaire (HLS-EU-Q).

    Science.gov (United States)

    Sørensen, Kristine; Van den Broucke, Stephan; Pelikan, Jürgen M; Fullam, James; Doyle, Gerardine; Slonska, Zofia; Kondilis, Barbara; Stoffels, Vivian; Osborne, Richard H; Brand, Helmut

    2013-10-10

    Several measurement tools have been developed to measure health literacy. The tools vary in their approach and design, but few have focused on comprehensive health literacy in populations. This paper describes the design and development of the European Health Literacy Survey Questionnaire (HLS-EU-Q), an innovative, comprehensive tool to measure health literacy in populations. Based on a conceptual model and definition, the process involved item development, pre-testing, field-testing, external consultation, a plain language check, and translation from English into Bulgarian, Dutch, German, Greek, Polish, and Spanish. The development process resulted in the HLS-EU-Q, which comprises two sections: a core health literacy section and a section on determinants and outcomes associated with health literacy. The health literacy section includes 47 items addressing self-reported difficulties in accessing, understanding, appraising and applying information in tasks concerning decision making in healthcare, disease prevention, and health promotion. The second section includes items related to health behaviour, health status, health service use, community participation, and socio-demographic and socio-economic factors. By illuminating the detailed steps in the design and development process of the HLS-EU-Q, the aim is to provide a deeper understanding of its purpose, its capabilities and its limitations for others using the tool. The vision is that, by stimulating wide application, the HLS-EU-Q will be validated in more countries to enhance the understanding of health literacy in different populations.

  17. Technology development life cycle processes.

    Energy Technology Data Exchange (ETDEWEB)

    Beck, David Franklin

    2013-05-01

    This report and set of appendices are a collection of memoranda originally drafted in 2009 for the purpose of providing motivation and the necessary background material to support the definition and integration of engineering and management processes related to technology development. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. As presented herein, the material begins with a survey of open literature perspectives on technology development life cycles, including published data on “what went wrong.” The main thrust of the material presents a rational exposé of a structured technology development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of the systems engineering process. The material concludes with a discussion on the use of multiple measures to assess technology maturity, including consideration of the viewpoint of potential users.

  18. Image processing system for flow pattern measurements

    International Nuclear Information System (INIS)

    Ushijima, Satoru; Miyanaga, Yoichi; Takeda, Hirofumi

    1989-01-01

    This paper describes the development and application of an image processing system for measuring flow patterns occurring in natural circulation water flows. In this method, the motions of particles scattered in the flow are visualized by a laser light slit and recorded on ordinary video tape. These image data are converted to digital data with an image processor and then transferred to a large computer. The centre points and pathlines of the particle images are numerically analysed, and velocity vectors are obtained from these results. In this image processing system, velocity vectors in a vertical plane are measured simultaneously, so that the two-dimensional behaviour of the various eddies with low velocity and complicated flow patterns usually observed in natural circulation flows can be determined almost quantitatively. The measured flow patterns, which were obtained from natural circulation flow experiments, agreed with photographs of the particle movements, and the validity of this measuring system was confirmed in this study. (author)
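
    A hedged sketch of the final step described above: once particle centre points have been extracted from two consecutive video frames, velocity vectors follow from the matched displacements divided by the frame interval. The nearest-neighbour matching and all coordinates here are hypothetical simplifications of the actual system.

    ```python
    import numpy as np

    def velocity_vectors(frame1_pts: np.ndarray, frame2_pts: np.ndarray,
                         dt: float) -> np.ndarray:
        """Match each particle in frame 1 to its nearest neighbour in frame 2
        and return displacement / dt for each match (mm/s if points are in mm)."""
        vectors = []
        for p in frame1_pts:
            distances = np.linalg.norm(frame2_pts - p, axis=1)
            q = frame2_pts[np.argmin(distances)]
            vectors.append((q - p) / dt)
        return np.array(vectors)

    # Hypothetical particle centre points (mm) in two frames 1/30 s apart
    pts_t0 = np.array([[10.0, 20.0], [35.0, 42.0], [60.0, 15.0]])
    pts_t1 = np.array([[10.4, 20.9], [35.2, 43.1], [60.1, 16.2]])
    for v in velocity_vectors(pts_t0, pts_t1, dt=1 / 30):
        print(f"v = ({v[0]:6.1f}, {v[1]:6.1f}) mm/s")
    ```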

  19. Adult Personality Development: Dynamics and Processes

    OpenAIRE

    Diehl, Manfred; Hooker, Karen

    2013-01-01

    The focus of this special issue of Research in Human Development is on adult personality and how personality may contribute to and be involved in adult development. Specifically, the contributions in this issue focus on the links between personality structures (e.g., traits) and personality processes (e.g., goal pursuit, self-regulation) and emphasize the contributions that intensive repeated measurement approaches can make to the understanding of personality and development across the adult...

  20. MEASUREMENT OF QUALITY MANAGEMENT SYSTEM PERFORMANCE IN MEAT PROCESSING

    Directory of Open Access Journals (Sweden)

    Elena S. Voloshina

    2017-01-01

    Modern methods aimed at ensuring food quality require processing plants to implement and certify quality management systems. In this case, measuring the effectiveness of an existing QMS is often very difficult for the leadership because the measured metrics are fragmented or missing altogether; this points to the relevance of the conducted research. The article presents criteria for assessing the effectiveness of the production process of meat processing plants using scaling methods and Shewhart control charts. The authors developed and present the formulae for calculating the single indicators used in the subsequent comprehensive assessment. An algorithm for the statistical evaluation of process controllability is presented; in an accessible form, it allows the statistical control of production processes to be assessed and statistical quality control to be organized when developing quality management systems. The proposed procedure is based on a process approach, the essence of which is the application of the Deming cycle "Plan-Do-Check-Act", which makes it easy to integrate it into any existing quality management system.
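
    As a hedged illustration of the Shewhart-chart step mentioned above, the sketch below builds an individuals (X) chart from the average moving range and flags out-of-control points; the measurement values are hypothetical and this is the textbook construction, not the authors' specific procedure.

    ```python
    import numpy as np

    def individuals_chart(x: np.ndarray):
        """Shewhart individuals chart: centre line at the mean, control limits at
        mean +/- 2.66 * average moving range (the 3-sigma equivalent)."""
        mr_bar = np.abs(np.diff(x)).mean()
        centre = x.mean()
        ucl, lcl = centre + 2.66 * mr_bar, centre - 2.66 * mr_bar
        out_of_control = np.where((x > ucl) | (x < lcl))[0]
        return centre, lcl, ucl, out_of_control

    # Hypothetical daily measurements of a quality characteristic (e.g., moisture %)
    data = np.array([64.2, 63.8, 64.5, 64.1, 63.9, 69.5, 64.0, 64.3, 63.7, 64.4])
    cl, lcl, ucl, ooc = individuals_chart(data)
    print(f"CL={cl:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}  out-of-control index: {ooc}")
    ```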

  1. Development of a measure of hypodontia patients' expectations of the process and outcome of combined orthodontic and restorative treatment.

    Science.gov (United States)

    Gassem, Afnan Ben; Foxton, Richard; Bister, Dirk; Newton, Tim

    2016-12-01

    To devise and assess the psychometric properties of a measure that investigates hypodontia patients' expectations of the process and outcome of combined orthodontic/restorative treatment. Specialised secondary care facility for individuals with hypodontia. Mixed research design with three phases: (a) Thematic analysis of data from individual interviews with 25 hypodontia patients/16 parents to generate the questionnaire items. (b) Questionnaire design, assessment of readability and face/content validity with 10 patients. (c) Survey of 32 new hypodontia patients to determine the internal consistency of the measure. Three main themes related to the treatment process emerged from the qualitative data: 'hypodontia clinic', 'orthodontic treatment' and 'restorative treatment'. Three main themes were also revealed relating to treatment outcome: 'changes in appearance', 'psychosocial changes' and 'functional changes'. A 28 item questionnaire was constructed using a mix of visual analogue scale (VAS) and categorical response format. The Flesch reading ease score of the measure was 78, equivalent to a reading age of 9-10 years. Face and content validity were good. The overall Cronbach's alpha was 0.80 while for the treatment process and treatment outcome subscales it was 0.71 and 0.88 respectively. A patient-based measure of the process and outcome of combined orthodontic/restorative treatment for hypodontia patients has been developed which has good face and construct validity and satisfactory internal consistency. Patient expectations of treatment are important in determining not only their satisfaction with treatment outcomes but also their engagement with the clinical process. This questionnaire is a first step in operationalising the expectations of hypodontia patients through assessment tools that can then determine whether pre-treatment counselling is required and aid the consent and treatment planning process, thus improving the quality of treatment provided by

  2. Development of a Model for Measuring Scientific Processing Skills Based on Brain-Imaging Technology: Focused on the Experimental Design Process

    Science.gov (United States)

    Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju

    2014-01-01

    The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…

  3. Development of a universally applicable household food insecurity measurement tool: process, current status, and outstanding issues.

    Science.gov (United States)

    Swindale, Anne; Bilinsky, Paula

    2006-05-01

    The United States Public Law 480 Title II food aid program is the largest U.S. government program directed at reducing hunger, malnutrition, and food insecurity in the developing world. USAID and Title II implementing partners face challenges in measuring the success of Title II programs in reducing household food insecurity because of the technical difficulty and cost of collecting and analyzing data on traditional food security indicators, such as per capita income and caloric adequacy. The Household Food Insecurity Access Scale (HFIAS) holds promise as an easier and more user-friendly approach for measuring the access component of household food security. To support the consistent and comparable collection of the HFIAS, efforts are under way to develop a guide with a standardized questionnaire and data collection and analysis instructions. A set of domains have been identified that is deemed to capture the universal experience of the access component of household food insecurity across countries and cultures. Based on these domains, a set of questions has been developed with wording that is deemed to be universally appropriate, with minor adaptation to local contexts. These underlying suppositions, based on research in multiple countries, are being verified by potential users of the guide. The key remaining issue relates to the process for creating a categorical indicator of food insecurity status from the HFIAS.

  4. An introduction to branching measure-valued processes

    CERN Document Server

    Dynkin, Eugene B

    1994-01-01

    For about half a century, two classes of stochastic processes - Gaussian processes and processes with independent increments - have played an important role in the development of stochastic analysis and its applications. During the last decade, a third class - branching measure-valued (BMV) processes - has also been the subject of much research. A common feature of all three classes is that their finite-dimensional distributions are infinitely divisible, allowing the use of the powerful analytic tool of Laplace (or Fourier) transforms. All three classes, in an infinite-dimensional setting, provide means for the study of physical systems with infinitely many degrees of freedom. This is the first monograph devoted to the theory of BMV processes. Dynkin first constructs a large class of BMV processes, called superprocesses, by passing to the limit from branching particle systems. Then he proves that, under certain restrictions, a general BMV process is a superprocess. A special chapter is devoted to the connections between ...

  5. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    ), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.

  6. Remote online process measurements by a fiber optic diode array spectrometer

    International Nuclear Information System (INIS)

    Van Hare, D.R.; Prather, W.S.; O'Rourke, P.E.

    1986-01-01

    The development of remote online monitors for radioactive process streams is an active research area at the Savannah River Laboratory (SRL). A remote offline spectrophotometric measurement system has been developed and used at the Savannah River Plant (SRP) for the past year to determine the plutonium concentration of process solution samples. The system consists of a commercial diode array spectrophotometer modified with fiber optic cables that allow the instrument to be located remotely from the measurement cell. Recently, a fiber optic multiplexer has been developed for this instrument, which allows online monitoring of five locations sequentially. The multiplexer uses a motorized micrometer to drive one of five sets of optical fibers into the optical path of the instrument. A sixth optical fiber is used as an external reference and eliminates the need to flush out process lines to re-reference the spectrophotometer. The fiber optic multiplexer has been installed in a process prototype facility to monitor uranium loading and breakthrough of ion exchange columns. The design of the fiber optic multiplexer is discussed and data from the prototype facility are presented to demonstrate the capabilities of the measurement system

  7. Foundations for in vivo nano-scale measurement of memory processes.

    Energy Technology Data Exchange (ETDEWEB)

    Forsythe, James Chris

    2006-09-01

    An ongoing program of research and development is utilizing nanomaterials as a basis for observing and measuring neurophysiological processes. Work commencing in fiscal year 2007 will focus on expanding current capabilities to create nanoelectrode arrays that will allow nanoscale measurement of the activity of tens to hundreds of neurons. This development is a vital step in gaining scientific insights concerning network properties associated with neural representations and processes. Specifically, attention will be focused on the representation of memory in the hippocampus, for which extensive research has been conducted using laboratory rats. This report summarizes background research providing a foundation for work planned for fiscal year 2007 and beyond. In particular, the neuroanatomy and neurophysiology of the hippocampus are described. Additionally, several programs of research are described that have addressed the relationship between neurophysiological processes and behavioral measures of memory performance. These studies provide insight into methodological and analytic approaches for studying the representation of memory processes in the hippocampus. The objective of this report is to document relevant literature in a reference document that will support future research in this area.

  8. Development of laser materials processing and laser metrology techniques

    International Nuclear Information System (INIS)

    Kim, Cheol Jung; Chung, Chin Man; Kim, Jeong Mook; Kim, Min Suk; Kim, Kwang Suk; Baik, Sung Hoon; Kim, Seong Ouk; Park, Seung Kyu

    1997-09-01

    The applications of remote laser materials processing and metrology have been investigated in the nuclear industry since the invention of the laser because they can reduce the risks to workers in hostile environments through remote operation. The objective of this project is the development of laser materials processing and metrology techniques for repair and inspection to improve the safety of nuclear power plants. As to repair, we developed our own laser sleeve welding head and innovative optical laser weld monitoring techniques to control the sleeve welding process. Furthermore, we designed and fabricated an 800 W Nd:YAG and a 150 W excimer laser system for high power laser materials processing in the nuclear industry, such as cladding and decontamination. As to inspection, we developed an ESPI and a laser triangulation 3-D profile measurement system for defect detection, which can complement ECT and UT inspections. We also developed a scanning laser vibrometer for remote vibration measurement of large structures and tested its performance. (author). 58 refs., 16 tabs., 137 figs

  9. Chemical Processing effects on the radiation doses measured by Film Dosimeter System

    International Nuclear Information System (INIS)

    Mihai, F.

    2009-01-01

    Halide film dosimetry is a quantitative method for measuring radiation doses. The fog density and the chemical processing of the dosimeter film affect the accuracy of the radiation dose measurement. This work presents the effect of the developer solution concentration on the response of dosimetric films with different fog densities. Three batches of film dosimeters, with fog densities of 0.312 ± 1.31%, 0.71 ± 0.59% and 0.77 ± 0.81%, were irradiated with a ¹³⁷Cs standard source to a dose of 1 mSv. The halide films were chemically processed at different concentrations of the developer solution (20%, 14.29% and 11.11%), while all other physico-chemical conditions in the development baths were kept constant. The 20% concentration is considered the standard chemical processing condition for the films. For the films exposed to the 1 mSv dose, the optical density recorded on the low-fog films processed in 20% developer solution is rather close to the optical densities of the high-fog films processed in 11.11% developer solution. The effect of the chemical processing on the image contrast was also taken into consideration.

  10. Developing the Service Template: From measurement to agendas for improvement

    OpenAIRE

    Williams, CS; Saunders, M

    2007-01-01

    Traditional survey based measures of service quality are argued to be problematic when reflecting individual services and turning measurement into action. This paper reviews developments to an alternative measurement approach, the Service Template Process and offers an extension to it. The extended process appears able to measure service users’ and deliverers’ perceptions of service quality independently. It also enables participants to jointly agree an agenda for quality improvement. The e...

  11. Development of DUMAS data processing system

    International Nuclear Information System (INIS)

    Sakamoto, Hiroshi

    1982-01-01

    In the field of nuclear experiments, the speed-up of data processing has been required recently along with the increase in the amount of data per event and the rate of event occurrence per unit time. In the DUMAS project of RCNP, the development of a data processing system that can perform high-speed transfer and processing has been required. The system should transfer the data of 5 multiwire proportional counters and other counters from the laboratory to the counting room at a rate of 1000 events per second, and should also perform considerably complex processing such as histogramming, particle identification and calculation of various polarizations, as well as dumping to secondary memory in the counting room. Furthermore, easy start-up, adjustment, inspection and maintenance should be considered, along with the use of standard (non-special) hardware and software. A system presently being investigated to satisfy the above requirements is described. The main points are as follows: to employ a CAMAC system for the interface with the readout circuits, to transfer data between the laboratory and the counting room by converting byte-serial transfer to bit-serial optical fiber communication, and to unify the data processing computers into the PDP-11 family by connecting two minicomputers. Development of such a data processing system seems to be useful as a preparatory study for the development of NUMATRON measuring instruments. (Wakatsuki, Y.)

  12. DEVELOPMENT OF SIGNAL PROCESSING TOOLS AND HARDWARE FOR PIEZOELECTRIC SENSOR DIAGNOSTIC PROCESSES

    Energy Technology Data Exchange (ETDEWEB)

    OVERLY, TIMOTHY G. [Los Alamos National Laboratory; PARK, GYUHAE [Los Alamos National Laboratory; FARRAR, CHARLES R. [Los Alamos National Laboratory

    2007-02-09

    This paper presents a piezoelectric sensor diagnostic and validation procedure that performs in-situ monitoring of the operational status of piezoelectric (PZT) sensor/actuator arrays used in structural health monitoring (SHM) applications. The validation of the proper function of a sensor/actuator array during operation is a critical component of a complete and robust SHM system, especially with the large number of active sensors typically involved. The technique obtains the health of the PZT transducers by tracking their capacitive value, which manifests itself in the imaginary part of the measured electrical admittance. Degradation of the mechanical/electrical properties of a PZT sensor/actuator, as well as bonding defects between a PZT patch and a host structure, can be identified with the proposed procedure. However, it was found that temperature variations and changes in sensor boundary conditions manifest themselves in similar ways in the measured electrical admittances. Therefore, the authors examined the effects of temperature variation and sensor boundary conditions on the sensor diagnostic process. The objective of this study is to quantify and classify several key characteristics of temperature change and to develop efficient signal processing techniques to account for those variations in the sensor diagnostic process. In addition, the authors developed hardware capable of making the necessary measurements to perform the sensor diagnostics and to make impedance-based SHM measurements. The paper concludes with experimental results that demonstrate the effectiveness of the proposed technique.
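
    Since the diagnostic tracks the transducer's capacitive value through the imaginary part of the measured admittance, a minimal sketch of that extraction step is given below (assumed, simplified post-processing with invented sweep data; not the authors' hardware or signal-processing code):

```python
# Minimal sketch: estimating an effective PZT capacitance from the imaginary
# part of measured admittance, C_eff(omega) ~= Im{Y(omega)} / omega, then
# averaging over the frequency sweep. All values below are hypothetical.
import numpy as np

freq_hz = np.linspace(10e3, 90e3, 5)          # hypothetical frequency sweep
admittance = np.array([1.0e-5 + 1.0e-4j, 1.2e-5 + 1.8e-4j, 1.1e-5 + 2.6e-4j,
                       1.3e-5 + 3.4e-4j, 1.2e-5 + 4.2e-4j])  # siemens (made up)

omega = 2 * np.pi * freq_hz
c_eff = admittance.imag / omega               # farads at each frequency
print(f"mean effective capacitance: {c_eff.mean() * 1e9:.2f} nF")
```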

  13. Development of a Novel Contamination Resistant Ion Chamber for Process Tritium Measurement and Use in the JET First Trace Tritium Experiment

    International Nuclear Information System (INIS)

    Worth, L.B.C.; Pearce, R.J.H.; Bruce, J.; Banks, J.; Scales, S.

    2005-01-01

    The accuracy of process measurements of tritium with conventional ion chambers is often affected by surface tritium contamination. The measurement of tritium in the exhaust of the JET torus is particularly difficult due to surface contamination with highly tritiated hydrocarbons. JET's first, unsuccessful attempt to overcome the contamination problem was to use an ion chamber with a heating element as the chamber wall so that it could be periodically decontaminated by baking. The newly developed ion chamber works on the principle of minimising the surface area within the boundary of the anode and cathode. This paper details the design of the ion chamber, which utilises a grid of 50-micron tungsten wire to define the ion chamber wall and the collector electrode. The effective surface area which, through contamination, is able to affect the measurement of tritium within the process gas has been reduced by a factor of ∼200 compared with a conventional ion chamber. It is concluded that the new process ion chamber enables sensitive, accurate tritium measurements free from contamination issues. It will be a powerful new tool for future tritium experiments, both to improve tritium tracking and to help in the understanding of tritium retention issues.

  14. Consistent and efficient processing of ADCP streamflow measurements

    Science.gov (United States)

    Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

    The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, automated filtering, and quality assessment of ADCP streamflow measurements, all independent of the ADCP manufacturer, are being developed in a software program that can process ADCP moving-boat discharge measurements regardless of the ADCP used to collect the data.
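
    For context, a highly simplified sketch of how a moving-boat discharge is accumulated from per-ensemble data is shown below. It is not the USGS or any manufacturer's algorithm (the differences among those algorithms are precisely the paper's concern), and the numbers are invented:

```python
# Simplified illustration: accumulating a moving-boat discharge from
# per-ensemble depth-averaged water velocity normal to the boat track,
# ensemble depth, and track width. All data are hypothetical.
import numpy as np

v_normal = np.array([0.8, 0.9, 1.1, 1.0, 0.7])   # m/s, component normal to track
depth    = np.array([2.0, 2.4, 2.6, 2.3, 1.9])   # m
width    = np.array([0.5, 0.5, 0.5, 0.5, 0.5])   # m, boat travel per ensemble

middle_q = np.sum(v_normal * depth * width)       # measured (middle) portion, m^3/s
print(f"measured-portion discharge: {middle_q:.2f} m^3/s")
# A full measurement would add estimated top, bottom, and edge discharges,
# which is exactly where the algorithms differ among manufacturers.
```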

  15. Development of a Self-Rating instrument to Measure Team Situation Awareness

    NARCIS (Netherlands)

    Schraagen, J.M.C.; Koning, L. de; Hof, T.; Dongen, K. van

    2010-01-01

    The goal of this paper is to describe the development of an instrument to measure team situation awareness (TSA). Individual team member SA may or may not be shared through communication processes with other team members. Most existing instruments do not measure these processes but measure TSA as a

  16. Standard-Setting Methods as Measurement Processes

    Science.gov (United States)

    Nichols, Paul; Twing, Jon; Mueller, Canda D.; O'Malley, Kimberly

    2010-01-01

    Some writers in the measurement literature have been skeptical of the meaningfulness of achievement standards and described the standard-setting process as blatantly arbitrary. We argue that standard setting is more appropriately conceived of as a measurement process similar to student assessment. The construct being measured is the panelists'…

  17. Adopting software quality measures for healthcare processes.

    Science.gov (United States)

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

    In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. A case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access auditability, fault removal, completeness of documentation, and machine utilization are the weak aspects, and these are the candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are the strong aspects of the process.

  18. Measurements of the gap/displacement and development of the ultrasonic temperature measuring system applied to severe accidents research

    International Nuclear Information System (INIS)

    Koo, Kil Mo; Kang, Kyung Ho; Cho, Young Ro; Park, Rae Jun; Kim, Sang Baik; Sim, Chul Moo

    2001-02-01

    In order to quantify the LAVA experimental results, this report focuses on measuring the gap formed on the lower head vessel using an ultrasonic pulse-echo method and neutron radiography, measuring the displacement of the lower head vessel using a capacitance method, building a measuring system, and developing a high-temperature measurement system based on an ultrasonic method. The scope of the gap measurement and system development using the ultrasonic method covers two-dimensional image processing using a tomographic B-scan method and two- and three-dimensional image processing using C-scan methods, both based on the one-dimensional time-domain A-scan signal. For some test specimens, the gap size is represented quantitatively by applying the C-scan methods. An important part of the ultrasonic image processing technique is the development of an accurate position control system. The requirements for the position control system are a contact technique on the test specimen and a fine moving technique. Since the specimen is hemispherical, the contact technique is very difficult. Therefore, the gap measurement using the ultrasonic pulse-echo method was carried out by developing a position-controlling scanner system. Along with the ultrasonic method, a neutron radiography method using KAERI's neutron source was attempted four times and the results were compared. The fine displacement of the hemispherical specimen was measured using a capacitive displacement sensor. The requirements for this measuring technique are the fixing of the capacitance sensor to the experimental facility and a remotely controlled position-varying system, which was manufactured with an electric motor. The development of a high-temperature measuring system using an ultrasonic method, the second-year plan, involves developing a sensor that can measure up to 2300 deg C

  19. Control measurement system in purex process

    International Nuclear Information System (INIS)

    Mani, V.V.S.

    1985-01-01

    The dependence of a bulk facility handling the Purex process on the control measurement system for evaluating process performance hardly needs to be emphasized. Process control, plant control, inventory control and quality control are the four components of the control measurement system. The scope and requirements of each component are different and the measurement methods are selected accordingly. However, each measurement system has six important elements. These are described in detail. The quality assurance programme carried out by the laboratory, as a mechanism through which the quality of measurements is regularly tested and stated in quantitative terms, is also explained in terms of internal and external quality assurance, with examples. Suggestions for making the control measurement system more responsive to operational needs in the future are also briefly discussed. (author)

  20. Quantitative high throughput analytics to support polysaccharide production process development.

    Science.gov (United States)

    Noyes, Aaron; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Mukhopadhyay, Tarit

    2014-05-19

    The rapid development of purification processes for polysaccharide vaccines is constrained by a lack of analytical tools: current technologies for the measurement of polysaccharide recovery and process-related impurity clearance are complex, time-consuming, and generally not amenable to high throughput process development (HTPD). HTPD is envisioned to be central to the improvement of existing polysaccharide manufacturing processes through the identification of critical process parameters that potentially impact the quality attributes of the vaccine and to the development of de novo processes for clinical candidates, across the spectrum of downstream processing. The availability of a fast and automated analytics platform will expand the scope, robustness, and evolution of Design of Experiment (DOE) studies. This paper details recent advances in improving the speed, throughput, and success of in-process analytics at the micro-scale. Two methods, based on modifications of existing procedures, are described for the rapid measurement of polysaccharide titre in microplates without the need for heating steps. A simplification of a commercial endotoxin assay is also described that features a single measurement at room temperature. These assays, along with existing assays for protein and nucleic acids, are qualified for deployment in the high throughput screening of polysaccharide feedstreams. Assay accuracy, precision, robustness, interference, and ease of use are assessed and described. In combination, these assays are capable of measuring the product concentration and impurity profile of a microplate of 96 samples in less than one day. This body of work relies on the evaluation of a combination of commercially available and clinically relevant polysaccharides to ensure maximum versatility and reactivity of the final assay suite. Together, these advancements reduce overall process time by up to 30-fold and significantly reduce sample volume over current practices. The

  1. Development of X-ray radiography examination technology by image processing method

    Energy Technology Data Exchange (ETDEWEB)

    Min, Duck Kee; Koo, Dae Seo; Kim, Eun Ka [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    Because the dimensions of nuclear fuel rods can be measured rapidly and accurately by X-ray radiography examination, an image processing system composed of a 979 CCD-L camera, an image processing card and fluorescent lighting was set up and made capable of image processing. X-ray radiography examination technology enabling dimension measurement of nuclear fuel rods was developed using an image processing method. Dimension measurement of a standard fuel rod by the image processing method showed a 2% reduction in relative measuring error compared with X-ray radiography film, and was better by 100-200 μm in measuring accuracy. (author). 9 refs., 22 figs., 3 tabs.
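
    A minimal sketch of the kind of measurement such a system performs is given below: estimating a rod dimension from one row of a digitized radiograph by thresholding the intensity profile. The pixel calibration, threshold rule, and profile values are assumptions for illustration, not the system described in the report:

```python
# Minimal sketch (assumptions: 8-bit intensity profile, known pixel size):
# estimating a rod dimension from one row of a digitized radiograph by
# thresholding the intensity profile and counting pixels between the edges.
import numpy as np

pixel_size_um = 50.0                               # assumed calibration
row = np.array([200, 198, 195, 90, 60, 55, 58, 62, 92, 197, 201, 199])  # made-up profile

threshold = 0.5 * (row.max() + row.min())          # midpoint threshold
inside = np.where(row < threshold)[0]              # rod attenuates -> darker pixels
width_um = (inside[-1] - inside[0] + 1) * pixel_size_um
print(f"estimated rod width: {width_um:.0f} um")
```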

  2. Measurement of company effectiveness using analytic network process method

    Science.gov (United States)

    Goran, Janjić; Zorana, Tanasić; Borut, Kosec

    2017-07-01

    The sustainable development of an organisation is monitored through the organisation's performance, which beforehand incorporates all stakeholders' requirements in its strategy. The strategic management concept enables organisations to monitor and evaluate their effectiveness along with efficiency by monitoring of the implementation of set strategic goals. In the process of monitoring and measuring effectiveness, an organisation can use multiple-criteria decision-making methods as help. This study uses the method of analytic network process (ANP) to define the weight factors of the mutual influences of all the important elements of an organisation's strategy. The calculation of an organisation's effectiveness is based on the weight factors and the degree of fulfilment of the goal values of the strategic map measures. New business conditions influence the changes in the importance of certain elements of an organisation's business in relation to competitive advantage on the market, and on the market, increasing emphasis is given to non-material resources in the process of selection of the organisation's most important measures.
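
    The final scoring step described here is a weighted aggregation: ANP-derived weight factors are combined with the degree of fulfilment of each strategic-map measure. A minimal sketch with hypothetical numbers is given below; deriving the weights themselves (pairwise comparisons and the limit supermatrix) is the part contributed by the ANP method and is not shown:

```python
# Minimal sketch (hypothetical numbers): combining ANP-derived weight factors
# with the degree of fulfilment of each strategic-map measure to score overall
# effectiveness.
import numpy as np

weights    = np.array([0.35, 0.25, 0.20, 0.20])   # ANP weight factors (sum to 1)
fulfilment = np.array([0.90, 0.60, 0.75, 0.80])   # goal fulfilment of each measure

effectiveness = float(np.dot(weights, fulfilment))
print(f"organisational effectiveness score: {effectiveness:.2f}")
```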

  3. Cultural Differences in the Development of Processing Speed

    Science.gov (United States)

    Kail, Robert V.; McBride-Chang, Catherine; Ferrer, Emilio; Cho, Jeung-Ryeul; Shu, Hua

    2013-01-01

    The aim of the present work was to examine cultural differences in the development of speed of information processing. Four samples of US children ("N" = 509) and four samples of East Asian children ("N" = 661) completed psychometric measures of processing speed on two occasions. Analyses of the longitudinal data indicated…

  4. Materials measurement and accounting in an operating plutonium conversion and purification process. Phase I. Process modeling and simulation

    International Nuclear Information System (INIS)

    Thomas, C.C. Jr.; Ostenak, C.A.; Gutmacher, R.G.; Dayem, H.A.; Kern, E.A.

    1981-04-01

    A model of an operating conversion and purification process for the production of reactor-grade plutonium dioxide was developed as the first component in the design and evaluation of a nuclear materials measurement and accountability system. The model accurately simulates process operation and can be used to identify process problems and to predict the effect of process modifications

  5. Measuring oxidation processes: Atomic oxygen flux monitor

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    Of the existing 95 high-energy accelerators in the world, the Stanford Linear Collider (SLC) at the Stanford Linear Accelerator Center (SLAC) is the only one of the linear-collider type, where electrons and positrons are smashed together at energies of 50 GeV using linear beams instead of beam rings for achieving interactions. Use of a linear collider eliminates energy losses in the form of x-rays due to the curved trajectory of the rings, a phenomenon known as bremsstrahlung. Because these losses are eliminated, higher interaction energies are reached. Consequently, the SLC produced the first Z particle in quantities large enough to allow measurement of its physical properties with some accuracy. SLAC intends to probe still deeper into the structure of matter by next polarizing the electrons in the beam. The surface of the source for these polarized particles, typically gallium arsenide, must be kept clean of contaminants. One method for accomplishing this task requires the oxidation of the surface, from which the oxidized contaminants are later boiled off. The technique requires careful measurement of the oxidation process. SLAC researchers have developed a technique for measuring the atomic oxygen flux in this process. The method uses a silver film on a quartz-crystal deposition-rate monitor. Measuring the initial oxidation rate of the silver, which is proportional to the atomic oxygen flux, determines a lower limit on that flux in the range of 10¹³ to 10¹⁷ atoms per square centimeter per second. Furthermore, the deposition is reversible by exposing the sensor to atomic hydrogen. This technique has wider applications to processes in solid-state and surface physics as well as surface chemistry. In semiconductor manufacturing, where a precise thickness of oxide must be deposited, this technique could be used to monitor the critical flux of atomic oxygen in the process.
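
    The flux lower bound follows from converting the initial mass-uptake (oxidation) rate of the silver film into an oxygen atom arrival rate. A minimal sketch with an invented uptake rate, assuming every incorporated oxygen atom arrived from the gas phase, is:

```python
# Illustrative sketch (assumed numbers): converting an initial mass-uptake
# rate on a silver-coated quartz-crystal monitor into a lower bound on the
# atomic-oxygen flux, assuming each incorporated oxygen atom arrived from
# the gas phase (sticking coefficient <= 1).
M_O = 16.0          # g/mol, atomic oxygen
N_A = 6.022e23      # 1/mol

mass_rate = 2.0e-9  # g / (cm^2 s), hypothetical initial oxidation rate
flux_lower_bound = mass_rate / M_O * N_A   # atoms / (cm^2 s)
print(f"atomic-oxygen flux >= {flux_lower_bound:.2e} atoms cm^-2 s^-1")
```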

  6. Development of Advanced Spent Fuel Management Process

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Chung Seok; Choi, I. K.; Kwon, S. G. (and others)

    2007-06-15

    As a part of research efforts to develop an advanced spent fuel management process, this project focused on the electrochemical reduction technology, which can replace the original Li reduction technology of ANL, and we have successfully built a 20 kgHM/batch scale demonstration system. The performance tests of the system in the ACPF hot cell showed more than a 99% reduction yield of SIMFUEL, a current density of 100 mA/cm² and a current efficiency of 80%. For optimization of the process, the prevention of a voltage drop in an integrated cathode, a minimization of the anodic effect and an improvement of the hot cell operability by modularization and simplification of the unit apparatuses were achieved. Basic research using a bench-scale system was also carried out, focusing on measurement of the electrochemical reduction rate of the surrogates, elucidation of the reaction mechanism, collection of data on the partition coefficients of the major nuclides, and quantitative measurement of mass transfer rates and diffusion coefficients of oxygen and metal ions in molten salts. When compared to the PYROX process of INL, the electrochemical reduction system developed in this project has comparative advantages in its application of a flexible reaction mechanism, relatively short reaction times and increased process yields.

  7. Development of Advanced Spent Fuel Management Process

    International Nuclear Information System (INIS)

    Seo, Chung Seok; Choi, I. K.; Kwon, S. G.

    2007-06-01

    As a part of research efforts to develop an advanced spent fuel management process, this project focused on the electrochemical reduction technology, which can replace the original Li reduction technology of ANL, and we have successfully built a 20 kgHM/batch scale demonstration system. The performance tests of the system in the ACPF hot cell showed more than a 99% reduction yield of SIMFUEL, a current density of 100 mA/cm² and a current efficiency of 80%. For optimization of the process, the prevention of a voltage drop in an integrated cathode, a minimization of the anodic effect and an improvement of the hot cell operability by modularization and simplification of the unit apparatuses were achieved. Basic research using a bench-scale system was also carried out, focusing on measurement of the electrochemical reduction rate of the surrogates, elucidation of the reaction mechanism, collection of data on the partition coefficients of the major nuclides, and quantitative measurement of mass transfer rates and diffusion coefficients of oxygen and metal ions in molten salts. When compared to the PYROX process of INL, the electrochemical reduction system developed in this project has comparative advantages in its application of a flexible reaction mechanism, relatively short reaction times and increased process yields.

  8. Process development

    International Nuclear Information System (INIS)

    Zapata G, G.

    1989-01-01

    Process development: The paper describes the organization and laboratory facilities of the group working on radioactive ore processing studies. It contains a review of the research carried out and the plans for the near future. A list of the published reports is also presented.

  9. In-process and post-process measurements of drill wear for control of the drilling process

    Science.gov (United States)

    Liu, Tien-I.; Liu, George; Gao, Zhiyu

    2011-12-01

    Optical inspection was used in this research for the post-process measurements of drill wear. A precision toolmakers' microscope was used. An indirect index, the cutting force, is used for in-process drill wear measurements. Using in-process measurements to estimate the drill wear for control purposes can decrease the operation cost and enhance product quality and safety. The challenge is to correlate the in-process cutting force measurements with the post-process optical inspection of drill wear. To find the most important feature, the energy principle was used in this research. It is necessary to select only the cutting force feature which shows the highest sensitivity to drill wear. The best feature selected is the peak of torque in the drilling process. Neuro-fuzzy systems were used for correlation purposes. The Adaptive-Network-Based Fuzzy Inference System (ANFIS) can construct fuzzy rules with membership functions to generate an input-output pair. A 1x6 ANFIS architecture with products of sigmoid membership functions can measure the drill wear in-process with an error as low as 0.15%. This is extremely important for control of the drilling process. Furthermore, the measurement of drill wear was performed under different drilling conditions. This shows that ANFIS has the capability of generalization.
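
    To illustrate the general idea of mapping the peak-torque feature to a wear estimate with sigmoid membership functions, a much-simplified Sugeno-style sketch is shown below. It is not the authors' trained 1x6 ANFIS; the membership parameters and linear consequents are hypothetical:

```python
# Simplified sketch of a Sugeno-style fuzzy estimate of drill wear from the
# peak-torque feature (membership parameters and consequents are hypothetical).
import numpy as np

def sigmoid(x, a, c):
    """Sigmoid membership function with slope a and centre c."""
    return 1.0 / (1.0 + np.exp(-a * (x - c)))

def wear_estimate(peak_torque_nm: float) -> float:
    # Two fuzzy sets on peak torque: "low" and "high" (hypothetical parameters)
    mu_low  = sigmoid(peak_torque_nm, a=-4.0, c=2.0)
    mu_high = sigmoid(peak_torque_nm, a=4.0, c=2.0)
    # Linear (first-order) consequents, also hypothetical, in mm of flank wear
    wear_low  = 0.02 * peak_torque_nm + 0.01
    wear_high = 0.10 * peak_torque_nm - 0.05
    # Weighted-average defuzzification
    return (mu_low * wear_low + mu_high * wear_high) / (mu_low + mu_high)

print(f"estimated wear at 2.5 N*m peak torque: {wear_estimate(2.5):.3f} mm")
```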

  10. Process measures or patient reported experience measures (PREMs) for comparing performance across providers? A study of measures related to access and continuity in Swedish primary care.

    Science.gov (United States)

    Glenngård, Anna H; Anell, Anders

    2018-01-01

    Aim To study (a) the covariation between patient reported experience measures (PREMs) and registered process measures of access and continuity when ranking providers in a primary care setting, and (b) whether registered process measures or PREMs provided more or less information about potential linkages between levels of access and continuity and explaining variables. Access and continuity are important objectives in primary care. They can be measured through registered process measures or PREMs. These measures do not necessarily converge in terms of outcomes. Patient views are affected by factors not necessarily reflecting quality of services. Results from surveys are often uncertain due to low response rates, particularly in vulnerable groups. The quality of process measures, on the other hand, may be influenced by registration practices and is often easier to manipulate. With increased transparency and use of quality measures for management and governance purposes, knowledge about the pros and cons of using different measures to assess performance across providers is important. Four regression models were developed with registered process measures and PREMs of access and continuity as dependent variables. Independent variables were characteristics of providers as well as geographical location and degree of competition facing providers. Data were taken from two large Swedish county councils. Findings Although ranking of providers is sensitive to the measure used, the results suggest that providers performing well with respect to one measure also tended to perform well with respect to the other. As process measures are easier and quicker to collect they may be looked upon as the preferred option. PREMs were better than process measures when exploring factors that contributed to variation in performance across providers in our study; however, if the purpose of comparison is continuous learning and development of services, a combination of PREMs and

  11. Free energy surfaces from nonequilibrium processes without work measurement

    Science.gov (United States)

    Adib, Artur B.

    2006-04-01

    Recent developments in statistical mechanics have allowed the estimation of equilibrium free energies from the statistics of work measurements during processes that drive the system out of equilibrium. Here a different class of processes is considered, wherein the system is prepared and released from a nonequilibrium state, and no external work is involved during its observation. For such "clamp-and-release" processes, a simple strategy for the estimation of equilibrium free energies is offered. The method is illustrated with numerical simulations and analyzed in the context of tethered single-molecule experiments.
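
    The work-based estimators alluded to in the first sentence are the nonequilibrium work relations; the best-known example is the Jarzynski equality, shown here for context (the paper's clamp-and-release approach avoids work measurement altogether):

```latex
% Jarzynski equality: equilibrium free-energy difference from the statistics
% of nonequilibrium work measurements (shown for context only).
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F},
\qquad
\Delta F = -\frac{1}{\beta}\,\ln \left\langle e^{-\beta W} \right\rangle,
\qquad
\beta = \frac{1}{k_{B} T}.
```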

  12. KPIs for measuring the sustainability performance of ecodesign implementation into product development and related processes: a systematic literature review

    DEFF Research Database (Denmark)

    Rodrigues, Vinicius Picanco; Pigosso, Daniela Cristina Antelmi; McAloone, Tim C.

    , many difficulties still surround the implementation and management of ecodesign. The main challenges in embedding ecodesign into PDRP are: (i) the lack of support to select key performance indicators (KPI) to measure how well a company is being successful in ecodesign integration from a product...'s impact on the overall corporate behaviour. This research aims at presenting a comprehensive set of sustainability KPI to measure the ecodesign implementation into the PDRP by systematically reviewing the relevant literature regarding sustainability KPIs (social, economic and environmental dimensions...). The underlying research question is “which are the KPIs for measuring sustainability of ecodesign integration into the product development and related processes?” This research excludes the indicators dealing directly and exclusively with product’s attributes and properties, such as energy and material...

  13. Development of the Concise Data Processing Assessment

    Directory of Open Access Journals (Sweden)

    James Day

    2011-06-01

    The Concise Data Processing Assessment (CDPA) was developed to probe student abilities related to the nature of measurement and uncertainty and to handling data. The diagnostic is a ten-question, multiple-choice test that can be used as both a pre-test and post-test. A key component of the development process was interviews with students, which were used to both uncover common modes of student thinking and validate item wording. To evaluate the reliability and discriminatory power of this diagnostic, we performed statistical tests focusing on both item analysis (item difficulty index, item discrimination index, and point-biserial coefficient) and on the entire test (test reliability and Ferguson’s delta). Scores on the CDPA range from chance (for novices) to about 80% (for experts), indicating that it possesses good dynamic range. Overall, the results indicate that the CDPA is a reliable assessment tool for measuring targeted abilities in undergraduate physics students.
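
    The item statistics named above are standard; as a generic illustration (not the CDPA authors' analysis code, and with an invented 0/1 response matrix), item difficulty and point-biserial coefficients can be computed as follows:

```python
# Generic item-analysis sketch: item difficulty and point-biserial correlation
# from a students x items matrix of 0/1 scores (all data hypothetical).
import numpy as np

scores = np.array([[1, 0, 1, 1],      # rows = students, columns = items
                   [1, 1, 1, 0],
                   [0, 0, 1, 0],
                   [1, 1, 1, 1],
                   [0, 0, 0, 0]])

difficulty = scores.mean(axis=0)      # fraction answering each item correctly
total = scores.sum(axis=1)

def point_biserial(item: np.ndarray, total: np.ndarray) -> float:
    """Correlation between one dichotomous item and the total score."""
    return np.corrcoef(item, total)[0, 1]

pb = np.array([point_biserial(scores[:, j], total) for j in range(scores.shape[1])])
print("difficulty:", np.round(difficulty, 2))
print("point-biserial:", np.round(pb, 2))
```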

  14. Validating a Measure of Stages of Change in Career Development

    Science.gov (United States)

    Hammond, Marie S.; Michael, Tony; Luke, Charles

    2017-01-01

    Research on the processes of change in career development has focused on developmental stages rather than processes. This manuscript reports on the development and validation of the stages of change-career development scale, adapted from McConnaughy, Prochaska, & Velicer (1983) measure of stages of change in psychotherapy. Data from 875…

  15. An exploratory survey of methods used to develop measures of performance

    Science.gov (United States)

    Hamner, Kenneth L.; Lafleur, Charles A.

    1993-09-01

    Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine that the OMB Generic Method was most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommended improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metrics development method. Those components were incorporated into a proposed metric development method that was based on the OMB Generic Method, and should be more likely to produce high-quality metrics that will result in continuous process improvement.

  16. Assessing the Process of Retirement: a Cross-Cultural Review of Available Measures.

    Science.gov (United States)

    Rafalski, Julia C; Noone, Jack H; O'Loughlin, Kate; de Andrade, Alexsandro L

    2017-06-01

    Retirement research is now expanding beyond the post-World War II baby boomers' retirement attitudes and plans to include the nature of their workforce exit and how successfully they adjust to their new life. These elements are collectively known as the process of retirement. However, there is insufficient research in developing countries to inform the management of their ageing populations regarding this process. This review aims to facilitate national and cross-cultural research in developing and non-English speaking countries by reviewing the existing measures of the retirement process published in English and Portuguese. The review identified 28 existing measures assessing retirement attitudes, planning, decision making, adjustment and satisfaction with retirement. Information on each scale's item structure, internal reliability, grammatical structure and evidence of translations to other languages is presented. Of the 28 measures, 20 assessed retirement attitudes, plans and decision-making, 5 assessed adjustment to retirement and only two assessed retirement satisfaction. Only eight of the 28 scales had been translated into languages other than English. There is scope to translate measures of retirement attitudes and planning into other languages. However there is a paucity of translated measures of retirement decision-making and adjustment, and measures of retirement satisfaction in general. Within the limitations of this review, researchers are provided with the background to decide between translating existing measures or developing of more culturally appropriate assessment tools for addressing their research questions.

  17. On the Current Measurement Practices in Agile Software Development

    OpenAIRE

    Javdani, Taghi; Zulzalil, Hazura; Ghani, Abdul Azim Abd; Sultan, Abu Bakar Md; Parizi, Reza Meimandi

    2013-01-01

    Agile software development (ASD) methods were introduced as a reaction to traditional software development methods. The principles of these methods differ from those of traditional methods, and so there are some different processes and activities in agile methods compared to traditional methods. Thus, ASD methods require different measurement practices compared to traditional methods. Agile teams often do their projects in the simplest and most effective way, so measurement practices in agile metho...

  18. Measurement of company effectiveness using analytic network process method

    Directory of Open Access Journals (Sweden)

    Goran Janjić

    2017-07-01

    The sustainable development of an organisation is monitored through the organisation’s performance, which beforehand incorporates all stakeholders’ requirements in its strategy. The strategic management concept enables organisations to monitor and evaluate their effectiveness along with efficiency by monitoring of the implementation of set strategic goals. In the process of monitoring and measuring effectiveness, an organisation can use multiple-criteria decision-making methods as help. This study uses the method of analytic network process (ANP) to define the weight factors of the mutual influences of all the important elements of an organisation’s strategy. The calculation of an organisation’s effectiveness is based on the weight factors and the degree of fulfilment of the goal values of the strategic map measures. New business conditions influence the changes in the importance of certain elements of an organisation’s business in relation to competitive advantage on the market, and on the market, increasing emphasis is given to non-material resources in the process of selection of the organisation’s most important measures.

  19. In aid of in-core measurement processing

    International Nuclear Information System (INIS)

    Makai, M.

    1986-04-01

    The core of a WWER-440 reactor is furnished with 36 self-powered neutron detector (SPND) sets consisting of 7 detectors located at 7 axial levels each. The axial power is constructed from these SPND readings. A further 210 assemblies are equipped with outlet temperature measurements. The measurement data processing aims not only to assign a 'reading' to the non-measured assemblies but also to assign an error to the measurements. Experimental measurement-processing programs were elaborated which rely on a number of trial functions derived in the framework of the traditional WWER-440 calculation model. Alternatives suitable for on-line measurement processing are outlined along with the test results. (author)
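
    As a generic illustration of the kind of processing involved (the basis functions and readings below are assumptions, not the WWER-440 trial functions), an axial power shape can be fitted to the 7 SPND readings of one string by least squares:

```python
# Generic sketch: fitting a low-order axial power expansion to the 7 SPND
# readings of one detector string by least squares, then evaluating the fit
# on a finer axial grid. Basis choice and readings are hypothetical.
import numpy as np

z = np.linspace(0.1, 0.9, 7)                     # relative axial positions of SPNDs
readings = np.array([0.55, 0.82, 0.98, 1.05, 0.97, 0.78, 0.50])  # hypothetical

# Basis: first three sine modes over the core height (a common simple choice)
basis = np.column_stack([np.sin((n + 1) * np.pi * z) for n in range(3)])
coeff, *_ = np.linalg.lstsq(basis, readings, rcond=None)

z_fine = np.linspace(0.0, 1.0, 21)
axial_power = np.column_stack([np.sin((n + 1) * np.pi * z_fine) for n in range(3)]) @ coeff
print(np.round(axial_power, 3))
```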

  20. Piagetian Cognitive Development and Primary Process Thinking in Children

    Science.gov (United States)

    Wulach, James S.

    1977-01-01

    Thirty-seven middle-class white children, ages 5-8, were tested on eight Piagetian tasks and the Rorschach test, and divided into preoperational, transitional, and concrete operational groups. Measures of primary process vs. secondary process thinking were found to be related to the Piagetian stages of development. (GDC)

  1. Analysis of Uncertainty in Dynamic Processes Development of Banks Functioning

    Directory of Open Access Journals (Sweden)

    Aleksei V. Korovyakovskii

    2013-01-01

    The paper offers an approach to estimating the measure of uncertainty in the dynamic processes of bank functioning, using statistical data on indicators of different banking operations. To calculate the measure of uncertainty, the phase images of the relevant sets of statistical data are considered. It is shown that the form of the phase image of the studied data sets can serve as a basis for estimating the measure of uncertainty in the dynamic processes of bank functioning. A set of analytical characteristics is proposed to formalize the definition of the phase image form of the studied data sets. It is shown that these analytical characteristics account for the unevenness of changes in the values of the studied data sets, which is one of the ways uncertainty manifests itself in the development of dynamic processes. Invariant estimates of the measure of uncertainty were obtained that take into account significant differences in the absolute values of the same indicators across banks. Examples of calculating the measure of uncertainty for the dynamic processes of specific banks are given.
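
    One common reading of a 'phase image' of an indicator series is a phase portrait of the indicator against its increment. Under that assumption (which is our interpretation, with invented numbers), the point set of such a portrait can be constructed as follows:

```python
# Minimal sketch (our reading of "phase image": the indicator plotted against
# its first difference; numbers are hypothetical): building the point set of a
# phase portrait for one banking indicator series.
import numpy as np

x = np.array([100.0, 104.0, 103.0, 110.0, 118.0, 115.0, 121.0])  # indicator series
dx = np.diff(x)                                # increments between reporting periods
phase_points = np.column_stack([x[:-1], dx])   # (value, change) pairs
print(phase_points)
```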

  2. Development of Spectrophotometric Process Monitors for Aqueous Reprocessing Facilities

    International Nuclear Information System (INIS)

    Smith, N.; Krebs, J.; Hebden, A.

    2015-01-01

    The safeguards envelope of an aqueous reprocessing plant can be extended beyond traditional measures to include surveillance of the process chemistry itself. By observing the concentration of accountable species in solution directly, a measure of real time accountancy can be applied. Of equal importance, select information on the process chemistry can be determined that will allow the operator and inspectors to verify that the process is operating as intended. One of the process monitors that can be incorporated is molecular spectroscopy, such as UV-Visible absorption spectroscopy. Argonne National Laboratory has developed a process monitoring system that can be tailored to meet the specific chemistry requirements of a variety of processes. The Argonne Spectroscopic Process monitoring system (ASP) is composed of commercial-off-the-shelf (COTS) spectroscopic hardware, custom manufactured sample handling components (to meet end user requirements) and the custom Plutonium and Uranium Measurement and Acquisition System (PUMAS) software. Two versions of the system have been deployed at the Savannah River Site's H-Canyon facility, tailored for high and low concentration streams. (author)
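
    For a single absorption band, the link between UV-Visible absorbance and concentration is the Beer-Lambert law. The deployed monitors work on full spectra with multivariate calibration, so the sketch below is only a minimal single-wavelength illustration with invented values:

```python
# Minimal single-wavelength sketch (epsilon, path length, and absorbance are
# made up; real monitors use full spectra and multivariate calibration):
# Beer-Lambert estimate of a dissolved-species concentration from absorbance.
epsilon = 48.0      # L mol^-1 cm^-1, hypothetical molar absorptivity at one band
path_cm = 1.0       # optical path length of the flow cell, cm
absorbance = 0.62   # measured absorbance at that band (hypothetical)

conc_mol_per_l = absorbance / (epsilon * path_cm)
print(f"estimated concentration: {conc_mol_per_l:.4f} mol/L")
```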

  3. Development and Performance of a Highly Sensitive Model Formulation Based on Torasemide to Enhance Hot-Melt Extrusion Process Understanding and Process Development.

    Science.gov (United States)

    Evans, Rachel C; Kyeremateng, Samuel O; Asmus, Lutz; Degenhardt, Matthias; Rosenberg, Joerg; Wagner, Karl G

    2018-02-27

    The aim of this work was to investigate the use of torasemide as a highly sensitive indicator substance and to develop a formulation thereof for establishing quantitative relationships between hot-melt extrusion process conditions and critical quality attributes (CQAs). Using solid-state characterization techniques and a 10 mm lab-scale co-rotating twin-screw extruder, we studied torasemide in a Soluplus® (SOL)-polyethylene glycol 1500 (PEG 1500) matrix, and developed and characterized a formulation which was used as a process indicator to study thermal- and hydrolysis-induced degradation, as well as residual crystallinity. We found that torasemide first dissolved into the matrix and then degraded. Based on this mechanism, extrudates with measurable levels of degradation and residual crystallinity were produced, depending strongly on the main barrel and die temperature and residence time applied. In addition, we found that 10% w/w PEG 1500 as plasticizer resulted in the widest operating space with the widest range of measurable residual crystallinity and degradant levels. Torasemide as an indicator substance behaves like a challenging-to-process API, only with higher sensitivity and more pronounced effects, e.g., degradation and residual crystallinity. Application of a model formulation containing torasemide will enhance the understanding of the dynamic environment inside an extruder and elucidate the cumulative thermal and hydrolysis effects of the extrusion process. The use of such a formulation will also facilitate rational process development and scaling by establishing clear links between process conditions and CQAs.

  4. Hybrid scatterometry measurement for BEOL process control

    Science.gov (United States)

    Timoney, Padraig; Vaid, Alok; Kang, Byeong Cheol; Liu, Haibo; Isbester, Paul; Cheng, Marjorie; Ng-Emans, Susan; Yellai, Naren; Sendelbach, Matt; Koret, Roy; Gedalia, Oram

    2017-03-01

    Scaling of interconnect design rules in advanced nodes has been accompanied by a shrinking metrology budget for BEOL process control. Traditional inline optical metrology measurements of BEOL processes rely on 1-dimensional (1D) film pads to characterize film thickness. Such pads are designed on the assumption that solid copper blocks from previous metallization layers prevent any light from penetrating through the copper, thus simplifying the effective film stack for the 1D optical model. However, the reduction of the copper thickness in each metallization layer and CMP dishing effects within the pad have introduced undesired noise in the measurement. To resolve this challenge and to measure structures that are more representative of product, scatterometry has been proposed as an alternative measurement. Scatterometry is a diffraction-based optical measurement technique using Rigorous Coupled Wave Analysis (RCWA), where light diffracted from a periodic structure is used to characterize the profile. Scatterometry measurements on 3D structures have been shown to demonstrate strong correlation to electrical resistance parameters for BEOL Etch and CMP processes. However, there is significant modeling complexity in such 3D scatterometry models, in particular due to the complexity of front-end-of-line (FEOL) and middle-of-line (MOL) structures. The accompanying measurement noise associated with such structures can contribute significant measurement error. To address the measurement noise of the 3D structures and the impact of incoming process variation, a hybrid scatterometry technique is proposed that utilizes key information from the structure to significantly reduce the measurement uncertainty of the scatterometry measurement. Hybrid metrology combines measurements from two or more metrology techniques to enable or improve the measurement of a critical parameter. In this work, the hybrid scatterometry technique is evaluated for 7nm and 14nm node BEOL measurements of

  5. In-reactor fuel cladding external corrosion measurement process and results

    International Nuclear Information System (INIS)

    Thomazet, J.; Musante, Y.; Pigelet, J.

    1999-01-01

    Analysis of the zirconium alloy cladding behaviour calls for an on-site corrosion measurement device. In the 1980s, a FISCHER probe was used, allowing oxide layer measurements to be taken along the outer generating lines of the peripheral fuel rods. In order to allow measurements on inner rods, a thin eddy current probe called SABRE was developed by FRAMATOME. The SABRE is a blade equipped with two eddy current coils that is moved through the assembly rows. A spring allows the measurement coil to be clamped on each of the generating lines of the scanned rods. By inserting this blade on all four assembly faces, measurements can also be performed along several generating lines of the same rod. Standard rings are fitted on the device and allow on-line calibration for each measured row. Signal acquisition and processing are performed by LAGOS, a dedicated software program developed by FRAMATOME. The measurements are generally taken at the cycle outage, in the spent fuel pool. On average, data acquisition calls for one shift per assembly (eight hours): this corresponds to more than 2500 measurement points. These measurements are processed statistically by the utility program SAN REMO. All the results are collected in a database for subsequent behaviour analysis: examples of investigated parameters are the thermal/hydraulic conditions of the reactors, the irradiation history, the cladding material, and the water chemistry. This analysis can be made easier by comparing measured behaviour with predictions from the COROS-2 corrosion code. (author)

  6. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid phase sample preparation media, allows monitoring of the process, and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols, and greatly minimizes use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process. First, when the system is neutralized by washing with excess de-ionized water (low conductivity); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement for monitoring sample preparation will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.
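
    The trigger logic described above can be pictured as a simple threshold-driven state machine. The sketch below is only an illustration of that idea; the thresholds, readings, and step names are hypothetical, not the instrument's actual values:

```python
# Minimal sketch of the threshold logic described above (thresholds, readings,
# and step names are hypothetical): advance the sample-processing sequence when
# outlet conductivity signals the expected state (low after a water flush,
# high after acidification, and so on).
LOW_THRESHOLD = 5.0      # uS/cm, "system flushed / neutral"
HIGH_THRESHOLD = 500.0   # uS/cm, "acidified or basic"

def next_step(current_step: str, outlet_conductivity: float) -> str:
    if current_step == "water_flush" and outlet_conductivity < LOW_THRESHOLD:
        return "acidify"
    if current_step == "acidify" and outlet_conductivity > HIGH_THRESHOLD:
        return "water_flush_2"
    return current_step      # stay in the current step until the trigger fires

print(next_step("water_flush", 3.2))   # -> "acidify"
```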

  7. Patient-Reported Outcome (PRO) Consortium translation process: consensus development of updated best practices.

    Science.gov (United States)

    Eremenco, Sonya; Pease, Sheryl; Mann, Sarah; Berry, Pamela

    2017-01-01

    This paper describes the rationale and goals of the Patient-Reported Outcome (PRO) Consortium's instrument translation process. The PRO Consortium has developed a number of novel PRO measures which are in the process of qualification by the U.S. Food and Drug Administration (FDA) for use in clinical trials where endpoints based on these measures would support product labeling claims. Given the importance of FDA qualification of these measures, the PRO Consortium's Process Subcommittee determined that a detailed linguistic validation (LV) process was necessary to ensure that all translations of Consortium-developed PRO measures are performed using a standardized approach with the rigor required to meet regulatory and pharmaceutical industry expectations, as well as having a clearly defined instrument translation process that the translation industry can support. The consensus process involved gathering information about current best practices from 13 translation companies with expertise in LV, consolidating the findings to generate a proposed process, and obtaining iterative feedback from the translation companies and PRO Consortium member firms on the proposed process in two rounds of review in order to update existing principles of good practice in LV and to provide sufficient detail for the translation process to ensure consistency across PRO Consortium measures, sponsors, and translation companies. The consensus development resulted in a 12-step process that outlines universal and country-specific new translation approaches, as well as country-specific adaptations of existing translations. The PRO Consortium translation process will play an important role in maintaining the validity of the data generated through these measures by ensuring that they are translated by qualified linguists following a standardized and rigorous process that reflects best practice.

  8. Technology development for DUPIC process safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Hong, J S; Kim, H D; Lee, Y G; Kang, H Y; Cha, H R; Byeon, K H; Park, Y S; Choi, H N [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-07-01

    As the strategy for DUPIC (Direct Use of spent PWR fuel In CANDU reactor) process safeguards, a neutron detection method was introduced to account for nuclear materials throughout the DUPIC process by selectively measuring spontaneous fission neutron signals from {sup 244}Cm. The DSNC was designed and manufactured to measure the amount of curium in the fuel bundle and in associated process samples in the DUPIC fuel cycle. The MCNP code was used to calculate the detector response profile along the length of the CANDU-type fuel bundle. It was found experimentally that the output signal variation due to the overall azimuthal asymmetry was less than 0.2%. The longitudinal detection efficiency distribution at every position, including both ends, was kept within 2% of the average value. Spent fuel standards closely similar to DUPIC process material were fabricated from a single spent PWR fuel rod, and performance verification of the DSNC is in progress in a very high radiation environment. The results of this test will eventually be benchmarked against other sources such as code simulation, chemical analysis and gamma analysis. COREMAS-DUPIC has been developed for the accountability management of nuclear materials treated in the DUPIC facility. This system is able to track the controlled nuclear materials, maintaining the material inventory in near-real time, and to generate the required material accountability records and reports. Concerning containment and surveillance technology, a focused R and D effort is given to the development of an unattended continuous monitoring system. Currently, the component technologies of radiation monitoring and surveillance have been established, and continued R and D efforts are given to the integration of the components into automatic safeguards diagnostics. (author).
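    The accountancy idea above, inferring the plutonium inventory from the spontaneous-fission neutron signal of {sup 244}Cm, is often called a curium-ratio or curium-balance approach. The arithmetic can be sketched as below; the detector efficiency, the Pu-to-Cm mass ratio (normally taken from a burnup/depletion calculation), and the nominal {sup 244}Cm neutron yield are illustrative assumptions, not DSNC calibration data.

```python
# Minimal sketch of curium-ratio accountancy under stated assumptions.
CM244_NEUTRON_YIELD = 1.1e7   # n/s per gram of 244Cm (nominal literature-scale value)
DETECTOR_EFFICIENCY = 0.20    # assumed counting efficiency of the neutron counter
PU_TO_CM_RATIO = 5.0e2        # assumed Pu/244Cm mass ratio from a burnup code

def pu_mass_from_count_rate(net_count_rate_cps: float) -> float:
    """Infer plutonium mass (g) from a net neutron count rate (counts/s)."""
    emitted = net_count_rate_cps / DETECTOR_EFFICIENCY      # neutrons emitted per second
    cm_mass = emitted / CM244_NEUTRON_YIELD                 # grams of 244Cm in the item
    return cm_mass * PU_TO_CM_RATIO                         # grams of Pu via declared ratio

print(f"Pu estimate: {pu_mass_from_count_rate(5.0e3):.1f} g")
```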

  9. An official American thoracic society workshop report: developing performance measures from clinical practice guidelines.

    Science.gov (United States)

    Kahn, Jeremy M; Gould, Michael K; Krishnan, Jerry A; Wilson, Kevin C; Au, David H; Cooke, Colin R; Douglas, Ivor S; Feemster, Laura C; Mularski, Richard A; Slatore, Christopher G; Wiener, Renda Soylemez

    2014-05-01

    Many health care performance measures are either not based on high-quality clinical evidence or not tightly linked to patient-centered outcomes, limiting their usefulness in quality improvement. In this report we summarize the proceedings of an American Thoracic Society workshop convened to address this problem by reviewing current approaches to performance measure development and creating a framework for developing high-quality performance measures by basing them directly on recommendations from well-constructed clinical practice guidelines. Workshop participants concluded that ideally performance measures addressing care processes should be linked to clinical practice guidelines that explicitly rate the quality of evidence and the strength of recommendations, such as the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) process. Under this framework, process-based performance measures would only be developed from strong recommendations based on high- or moderate-quality evidence. This approach would help ensure that clinical processes specified in performance measures are both of clear benefit to patients and supported by strong evidence. Although this approach may result in fewer performance measures, it would substantially increase the likelihood that quality-improvement programs based on these measures actually improve patient care.

  10. On-line measurement and control in sustainable mineral processing and energy production

    International Nuclear Information System (INIS)

    Sowerby, B.D.

    2002-01-01

    Sustainable development can be defined as development that 'meets the needs of the present without compromising the ability of future generations to meet their own needs' (WCED, 1987). A sustainable minerals and energy industry will need to achieve a number of related objectives, including greater energy efficiency, improved utilisation of ore deposits, improved utilisation of existing plant, improved product quality, reduction of waste material, reduction of pollution levels and improved safety margins. These objectives all relate in varying degrees to the triple bottom line of economic, social and environmental benefits. One critical component in achieving these objectives is to develop and apply improved control systems across the full range of industry applications, from mining to processing and utilisation. However, process control relies heavily on the availability of suitable on-line process instrumentation to provide the data and feedback necessary for its implementation. There is a lot of truth in the saying 'if you can't measure it, you can't control it'. In the past, measurement was achieved by manual sampling followed by sample preparation (such as drying, mixing, crushing and dividing) and off-line laboratory analysis. However, this procedure is often subject to significant sampling errors and, most importantly, the measurements are too slow for control purposes. By contrast, on-line analysis can provide rapid and accurate measurement in real time, thus opening up new possibilities for improved process control. As a result, there has been a rapid increase in the industrial application of on-line analysis instrumentation over the past few decades. The main purpose of this paper is to briefly review some past Australian developments of on-line analysis systems in the mineral and coal industries and to discuss present developments and future trends.

  11. Meanings, mechanisms, and measures of holistic processing.

    Science.gov (United States)

    Richler, Jennifer J; Palmeri, Thomas J; Gauthier, Isabel

    2012-01-01

    Few concepts are more central to the study of face recognition than holistic processing. Progress toward understanding holistic processing is challenging because the term "holistic" has many meanings, with different researchers addressing different mechanisms and favoring different measures. While in principle the use of different measures should provide converging evidence for a common theoretical construct, convergence has been slow to emerge. We explore why this is the case. One challenge is that "holistic processing" is often used to describe both a theoretical construct and a measured effect, which may not have a one-to-one mapping. Progress requires more than greater precision in terminology regarding different measures of holistic processing or different hypothesized mechanisms of holistic processing. Researchers also need to be explicit about what meaning of holistic processing they are investigating so that it is clear whether different researchers are describing the same phenomenon or not. Face recognition differs from object recognition, and not all meanings of holistic processing are equally suited to help us understand that important difference.

  12. Ultrasonic velocity measurements- a potential sensor for intelligent processing of austenitic stainless steels

    International Nuclear Information System (INIS)

    Venkadesan, S.; Palanichamy, P.; Vasudevan, M.; Baldev Raj

    1996-01-01

    Development of sensors based on Non-Destructive Evaluation (NDE) techniques for on-line sensing of microstructure and properties requires a thorough knowledge of the relation between the sensing mechanism/measurement of an NDE technique and the microstructure. As a first step towards developing an on-line sensor for studying the dynamic microstructural changes during processing of austenitic stainless steels, ultrasonic velocity measurements have been carried out to study the microstructural changes after processing. Velocity measurements could follow the progress of annealing, from recovery through the onset and completion of recrystallization, sense the differences in the microstructure obtained after hot deformation, and estimate the grain size. This paper brings out the relation between the sensing method based on ultrasonic velocity measurements and the microstructure in austenitic stainless steel. (author)

  13. Designing a clinical audit tool to measure processes of pregnancy care

    Directory of Open Access Journals (Sweden)

    Wallace EM

    2011-12-01

    Full Text Available Suzanne V Sinni,1 Wendy M Cross,2 Euan M Wallace1,3 (1Department of Obstetrics and Gynaecology, Monash University and Southern Health, Monash Medical Centre, Clayton, Victoria; 2School of Nursing and Midwifery, Monash University, Clayton, Victoria; 3The Ritchie Centre, Monash Institute of Medical Research, Monash University, Clayton, Victoria, Australia). Abstract: This paper reports the development of a clinical audit tool as part of a larger project to evaluate a new maternity service, underpinned by a patient safety framework. Aim: The aim of this work is to describe the development of a clinical audit tool that measures the process of pregnancy care, and its application. Background: There are many reports about outcomes of healthcare provision, however there are limited studies examining the process of care. There is also limited evidence linking clinical audit with improvements in care delivery. Pregnancy care was chosen because there are well defined and agreed clinical standards against which to measure the delivery of pregnancy care. A clinical audit using these standards addresses both gaps in the literature. Methods: Standard methodological processes were used to develop the audit tool. Literature informed the processes. Data were collected in 2009–2010 using the tool described in the paper. Reliability testing was completed in September 2011. Results: An audit tool to measure pregnancy care was developed and applied to 354 health records to enable analysis of adherence to organizational expectations of care. Reliability testing of the tool achieved an overall kappa of 0.896. Conclusion: Developing an audit tool based on processes described in the literature is labor intensive and resource dependent, however it results in a robust, reliable, valid tool that can be used in diverse maternity services. Stakeholder participation from the outset ensures ongoing engagement for the duration of a clinically based project spanning several years.

  14. Machine vision for high-precision volume measurement applied to levitated containerless material processing

    International Nuclear Information System (INIS)

    Bradshaw, R.C.; Schmidt, D.P.; Rogers, J.R.; Kelton, K.F.; Hyers, R.W.

    2005-01-01

    By combining the best practices in optical dilatometry with numerical methods, a high-speed and high-precision technique has been developed to measure the volume of levitated, containerlessly processed samples with subpixel resolution. Containerless processing provides the ability to study highly reactive materials without the possibility of contamination affecting thermophysical properties. Levitation is a common technique used to isolate a sample as it is being processed. Noncontact optical measurement of thermophysical properties is very important as traditional measuring methods cannot be used. Modern, digitally recorded images require advanced numerical routines to recover the subpixel locations of sample edges and, in turn, produce high-precision measurements.
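    The abstract does not specify the numerical routines used, but a common way to recover an edge position to subpixel precision is to fit a parabola to the intensity-gradient magnitude around its peak. The sketch below is a generic illustration of that idea, not the authors' algorithm; the synthetic edge profile and all parameters are assumptions.

```python
import numpy as np

def subpixel_edge(profile: np.ndarray) -> float:
    """Locate an intensity edge along a 1-D profile with subpixel precision.

    The gradient magnitude is computed, and a parabola is fitted through the
    peak sample and its two neighbours; the parabola vertex gives the offset.
    This is a generic textbook approach, not the method used in the paper.
    """
    grad = np.abs(np.gradient(profile.astype(float)))
    k = int(np.argmax(grad))
    if k == 0 or k == len(grad) - 1:
        return float(k)                      # peak at the border: no refinement
    y0, y1, y2 = grad[k - 1], grad[k], grad[k + 1]
    denom = y0 - 2.0 * y1 + y2
    offset = 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom
    return k + offset                        # edge position in pixel units

# Synthetic blurred step edge centred near pixel 20.3 (illustrative only).
x = np.arange(40)
profile = 1.0 / (1.0 + np.exp(-(x - 20.3) / 1.5))
print(f"estimated edge position: {subpixel_edge(profile):.2f} px")
```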

  15. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

    Full Text Available Business process modelling and analysing is undoubtedly one of the most important parts of Applied (Business) Informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst's work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as an input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly to correctly written algorithms) contain characteristics that can be mathematically described. In addition, it will be possible to create a tool that helps process analysts design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse business process model design and quality measures. It was found that this area had already been the subject of research in the past. Thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed scientific publications and existing quality measures do not reflect all important attributes of business process model clarity, simplicity and completeness. Therefore, it would be appropriate to add new measures of quality.

  16. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  17. The evolution and development of an instrument to measure essential professional nursing practices.

    Science.gov (United States)

    Kramer, Marlene; Brewer, Barbara B; Halfer, Diana; Hnatiuk, Cynthia Nowicki; MacPhee, Maura; Schmalenberg, Claudia

    2014-11-01

    Nursing continues to evolve from a task-oriented occupation to a holistic professional practice. Increased professionalism requires accurate measurement of care processes and practice. Nursing studies often omit measurement of the relationship between structures in the work environment and processes of care or between processes of care and patient outcomes. Process measurement is integral to understanding and improving nursing practice. This article describes the development of an updated Essentials of Magnetism process measurement instrument for clinical nurses (CNs) practicing on inpatient units in hospitals. It has been renamed Essential Professional Nursing Practices: CN.

  18. Metrology and process control: dealing with measurement uncertainty

    Science.gov (United States)

    Potzick, James

    2010-03-01

    Metrology is often used in designing and controlling manufacturing processes. A product sample is processed, some relevant property is measured, and the process is adjusted to bring the next processed sample closer to its specification. This feedback loop can be remarkably effective for the complex processes used in semiconductor manufacturing, but there is some risk involved because measurements have uncertainty and product specifications have tolerances. There is a finite risk that good product will fail testing or that faulty product will pass. Standard methods for quantifying measurement uncertainty have been presented, but the question arises: how much measurement uncertainty is tolerable in a specific case? Or, how does measurement uncertainty relate to manufacturing risk? This paper looks at some of the components inside this process control feedback loop and describes methods to answer these questions.
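    One way to make the risk question concrete is to model both the product parameter and the measurement error as Gaussian and estimate the false-accept and false-reject rates for a given ratio of measurement uncertainty to tolerance. The sketch below is a generic guard-band style calculation under those assumptions; all numbers are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: the true parameter is Gaussian around the target,
# the tolerance is +/- 3 process sigmas, and the measurement adds its own
# Gaussian error.  None of these numbers come from the paper.
N = 1_000_000
process_sigma = 1.0
tolerance = 3.0 * process_sigma
meas_sigma = 0.5 * process_sigma          # measurement uncertainty (1 sigma)

true_value = rng.normal(0.0, process_sigma, N)
measured = true_value + rng.normal(0.0, meas_sigma, N)

good = np.abs(true_value) <= tolerance    # product actually within spec
passed = np.abs(measured) <= tolerance    # product accepted by the test

false_reject = np.mean(good & ~passed)    # good product that fails the test
false_accept = np.mean(~good & passed)    # bad product that passes the test
print(f"false reject rate: {false_reject:.4%}")
print(f"false accept rate: {false_accept:.4%}")
```

    Re-running the simulation with different values of meas_sigma shows how quickly both risks grow as the measurement uncertainty approaches the tolerance.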

  19. An in-process form error measurement system for precision machining

    International Nuclear Information System (INIS)

    Gao, Y; Huang, X; Zhang, Y

    2010-01-01

    In-process form error measurement for precision machining is studied. Due to two key problems, the opaque barrier and vibration, in-process optical measurement of form error for precision machining has been a difficult topic, and so far very few published studies can be found. In this project, an in-process form error measurement device is proposed to deal with the two key problems. Based on our existing studies, a prototype system has been developed. It is the first of its kind to overcome the two key problems. The prototype is based on a single laser sensor design of 50 nm resolution together with two techniques, a damping technique and a moving average technique, proposed for use with the device. The proposed damping technique is able to improve vibration attenuation by up to 21 times compared to the case of natural attenuation. The proposed moving average technique is able to reduce errors by seven to ten times without distortion of the form profile results. The two proposed techniques are simple, but they are especially useful for the proposed device. For a workpiece sample, the measurement result under the coolant condition is only 2.5% larger than the one under the no-coolant condition. For a certified Wyko test sample, the overall system measurement error can be as low as 0.3 µm. The measurement repeatability error can be as low as 2.2%. The experimental results give confidence in using the proposed in-process form error measurement device. For better results, further improvements in design and testing are necessary.
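    The paper reports that a moving-average filter reduced errors substantially without distorting the form profile, but the window and implementation are not given. A plain centred moving average, sketched below with assumed parameters and a synthetic profile, illustrates the idea.

```python
import numpy as np

def moving_average(signal: np.ndarray, window: int = 9) -> np.ndarray:
    """Centred moving average; the window length is an assumed, odd value."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

# Synthetic form profile: a gentle sinusoidal form error plus sensor noise.
x = np.linspace(0.0, 1.0, 500)
profile = 0.5 * np.sin(2 * np.pi * 3 * x)          # micrometres, illustrative
noisy = profile + np.random.default_rng(1).normal(0.0, 0.05, x.size)
smoothed = moving_average(noisy)

print(f"rms error before: {np.sqrt(np.mean((noisy - profile) ** 2)):.4f} um")
print(f"rms error after:  {np.sqrt(np.mean((smoothed - profile) ** 2)):.4f} um")
```

    The window length trades noise suppression against attenuation of genuine form features, which is presumably why the authors verified that the form profile was not distorted.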

  20. FY 1999 Report on research and development results of photon-applied instrumentation/processing technologies. Research and development of advanced measuring/processing technologies for oil production systems; 1999 nendo foton keisoku kako gijutsu seika hokokusho. Sekiyu seisan system kodo keisoku kako gijutsu kenkyu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Described herein are the FY 1999 results of the research and development of photon (laser) beam utilization, as part of the R and D project on advanced measuring/processing technologies for oil production systems. For the high-reliability laser welding technology, tests are conducted for welding 15 mm thick steel plates and 5 mm thick aluminum alloy plates by synthesized iodine/YAG laser beams, producing high-quality welding results. For the microscopic processing technology, attempts have been made to develop quantum functional optoelectronic devices which have nanometer-sized ultrafine dots. For the non-destructive composition measuring technology, the internal transmission measurement program achieves the target light quantity by increasing the brightness of the short-wavelength light source. Three-dimensional digital tomography (DT) images with a spatial resolution of several micrometers are obtained. For the tightly-focusing all-solid-state laser technology, a fiber-structured fiber laser is developed on a trial basis to attain a power of 15 W. A high-power, high-brightness laser diode, required for exciting the fiber laser, is developed, and a power of 30 W or more is obtained with an InGa(As)P device. The comprehensive investigation results are also presented. (NEDO)

  1. Developing Human Performance Measures (PSAM8)

    International Nuclear Information System (INIS)

    Jeffrey C. Joe

    2006-01-01

    Through the reactor oversight process (ROP), the U.S. Nuclear Regulatory Commission (NRC) monitors the performance of utilities licensed to operate nuclear power plants. The process is designed to assure public health and safety by providing reasonable assurance that licensees are meeting the cornerstones of safety and designated cross-cutting elements. The reactor inspection program, together with performance indicators (PIs) and enforcement activities, forms the basis for the NRC's risk-informed, performance-based regulatory framework. While human performance is a key component in the safe operation of nuclear power plants and is a designated cross-cutting element of the ROP, there is currently no direct inspection or performance indicator for assessing human performance. Rather, when human performance is identified as a substantive cross-cutting element in any 1 of 3 categories (resources, organizational or personnel), it is then evaluated for common themes to determine if follow-up actions are warranted. However, variability in human performance occurs from day to day, across activities that vary in complexity, and across workgroups, contributing to the uncertainty in the outcomes of performance. While some variability in human performance may be random, much of the variability may be attributed to factors that are not currently assessed. There is a need to identify and assess aspects of human performance that relate to plant safety and to develop measures that can be used to successfully assure licensee performance and indicate when additional investigation may be required. This paper presents research that establishes a technical basis for developing human performance measures. In particular, we discuss: (1) how historical data already gives some indication of a connection between human performance and overall plant performance, (2) how industry-led efforts to measure and model human performance and organizational factors could serve as a data source and basis for a

  2. From mission to measures: performance measure development for a Teen Pregnancy Prevention Program.

    Science.gov (United States)

    Farb, Amy Feldman; Burrus, Barri; Wallace, Ina F; Wilson, Ellen K; Peele, John E

    2014-03-01

    The Office of Adolescent Health (OAH) sought to create a comprehensive set of performance measures to capture the performance of the Teen Pregnancy Prevention (TPP) program. This performance measurement system needed to provide measures that could be used internally (by both OAH and the TPP grantees) for management and program improvement as well as externally to communicate the program's progress to other interested stakeholders and Congress. This article describes the selected measures and outlines the considerations behind the TPP measurement development process. Issues faced, challenges encountered, and lessons learned have broad applicability for other federal agencies and, specifically, for TPP programs interested in assessing their own performance and progress. Published by Elsevier Inc.

  3. The Development of NOAA Education Common Outcome Performance Measures (Invited)

    Science.gov (United States)

    Baek, J.

    2013-12-01

    The National Oceanic and Atmospheric Administration (NOAA) Education Council has embarked on an ambitious Monitoring and Evaluation (M&E) project that will allow it to assess education program outcomes and impacts across the agency, line offices, and programs. The purpose of this internal effort is to link outcome measures to program efforts and to evaluate the success of the agency's education programs in meeting the strategic goals. Using an outcome-based evaluation approach, the NOAA Education Council is developing two sets of common outcome performance measures, environmental stewardship and professional development. This presentation will examine the benefits and tradeoffs of common outcome performance measures that collect program results across a portfolio of education programs focused on common outcomes. Common outcome performance measures have a few benefits to our agency and to the climate education field at large. The primary benefit is shared understanding, which comes from our process for writing common outcome performance measures. Without a shared and agreed upon set of definitions for the measure of an outcome, the reported results may not be measuring the same things and would incorrectly indicate levels of performance. Therefore, our writing process relies on a commitment to developing a shared set of definitions based on consensus. We hope that by taking the time to debate and coming to agreement across a diverse set of programs, the strength of our common measures can indicate real progress towards outcomes we care about. An additional benefit is that these common measures can be adopted and adapted by other agencies and organizations that share similar theories of change. The measures are not without their drawbacks, and we do make tradeoffs as part of our process in order to continue making progress. We know that any measure is necessarily a narrow slice of performance. A slice that may not best represent the unique and remarkable contribution

  4. MULTI-DIMENSIONAL MEASURE OF STRATEGY DEVELOPMENT PROCESS FROM A DIFFERENT CONTEXT: AN EMPIRICAL RESEARCH ON TURKISH MANAGERS

    OpenAIRE

    Ozleblebici, Zafer

    2015-01-01

    The purpose of this paper is to provide another explanation of the strategy development process from a different context. Even though several studies exploring different approaches to the strategy development process have been conducted, most of them have examined organizations in similar samplings, more specifically Anglo-Saxon cultures/countries. Therefore, in order to explore strategy development approaches from a different context, the paper aims to expose and describe the strategy development...

  5. A whole process quality control system for energy measuring instruments inspection based on IOT technology

    Science.gov (United States)

    Yin, Bo; Liu, Li; Wang, Jiahan; Li, Xiran; Liu, Zhenbo; Li, Dewei; Wang, Jun; Liu, Lu; Wu, Jun; Xu, Tingting; Cui, He

    2017-10-01

    Electric energy measurement is fundamental work: accurate measurements play a vital role in protecting the economic interests of both parties to a power supply contract, and the standardized management of measurement laboratories at all levels is a factor that directly affects the fairness of measurement. Currently, the management of metering laboratories generally uses one-dimensional bar codes as the recognition object, advances the testing process through manual management, and requires human input of most test data to generate reports. There are many problems and potential risks in this approach: data cannot be saved completely, the status of an inspection cannot be traced, the inspection process is not fully controllable, and so on. To meet the provincial metrology center's requirements for whole-process management of performance testing of power measuring appliances, we used large-capacity RF tags as the process management information medium and developed a general measurement experiment management system. We formulated a standardized full performance test process, improved the raw data recording mode of the experimental process, developed an automatic warehouse inventory device, and established a strict test sample transfer and storage system. This ensures that all raw inspection data can be traced back, achieves full life-cycle control of the sample, and significantly improves the quality control level and the effectiveness of inspection work.

  6. Chemical process measurements in PWR-type nuclear power plants

    International Nuclear Information System (INIS)

    Glaeser, E.

    1978-01-01

    In order to achieve high levels of availability of nuclear power plants equipped with pressurized water reactors, strict standards have to be applied to the purity of the coolant and of other media. Chemical process measurements can meet these requirements only if programmes are established that give maximum information with minimum expenditure, and if these programmes are implemented with effective analytical methods. Analysis programmes known from the literature are examined for their usefulness, and hints are given for establishing rational programmes. Analytical techniques are compared with each other, taking into consideration both methods which have already been introduced into nuclear power plant practice and methods not yet generally used in practice, such as atomic absorption spectrophotometry, gas chromatography, etc. Finally, based on the state of the art of chemical process measurements in nuclear power plants, the trends of future development are pointed out. (author)

  7. SPECTRAN - a highly sensitive process photometer for selective measurements of gases and liquids in environment and process technology

    International Nuclear Information System (INIS)

    Breton, H.; Krieg, G.

    1984-01-01

    The SPECTRAN process photometer uses the wavelength-dependent attenuation of optical radiation for the selective measurement of molecular compounds in gases and liquids. The system, which was originally designed for UF6 measurements, has been developed to serve various applications, e.g. in chemical and thermal engineering, for monitoring measurements of emissions and MAC values, explosion protection, purity measurements, in environmental and bioengineering, nuclear and energy technology, pharmaceutical and medical engineering, as well as in the food industries. (DG)
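    Wavelength-dependent attenuation by a single absorbing species is usually described by the Beer-Lambert law, which is the basic relationship a photometer of this kind exploits. The sketch below shows that relationship in code; the molar absorptivity, path length, and intensities are illustrative assumptions, not SPECTRAN calibration data.

```python
import math

def concentration_from_transmission(i_transmitted: float,
                                     i_incident: float,
                                     molar_absorptivity: float,
                                     path_length_cm: float) -> float:
    """Beer-Lambert law: A = -log10(I/I0) = epsilon * c * l, so c = A / (epsilon * l)."""
    absorbance = -math.log10(i_transmitted / i_incident)
    return absorbance / (molar_absorptivity * path_length_cm)

# Illustrative numbers only: 80% transmission, epsilon = 120 L/(mol*cm), 5 cm cell.
c = concentration_from_transmission(0.80, 1.00, 120.0, 5.0)
print(f"estimated concentration: {c:.5f} mol/L")
```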

  8. Locally specific measures for employment aimed at regional development

    Directory of Open Access Journals (Sweden)

    Vladimir Cini

    2013-12-01

    Full Text Available The oldest and largest sub-region in the world functioning on the principle of economic union is the European Union. The creation of a single market has initiated the process of conditional adjustment of markets in the EU member states, which has a significant impact on the social welfare of its citizens. It is necessary to tackle this issue by joint efforts within the European Union. As globalization processes push for economic integration and the development of competitive advantage, the regions will have to make some challenging adjustments. Development tends to concentrate in highly competitive regions, while regions in the periphery lag behind. However, this pertains not only to the economic lag, but also to a potentially negative political situation. Locally specific active employment policy measures are a continuation of the effort to make these measures more flexible. They refer to the Joint Assessment of Employment Policy Priorities and the IPA Human Resources Development Operational Programme - a regional policy instrument of the European Union. Both documents highlight the issue of disproportional development of regions, which requires special local measures and active labour market policy programmes. To reduce regional differences in development, it is necessary to invest more resources in the regions that lag behind. In this particular case, this means the counties in Croatia with high unemployment rates, a large number of registered unemployed persons and a low employment rate. Consequently, this paper explains the importance of the adoption of locally specific measures for employment, which unfortunately did not take hold in the Republic of Croatia, and highlights the need for further decentralization of public services, with the aim of balancing regional development.

  9. Measuring process performance within healthcare logistics - a decision tool for selecting track and trace technologies

    DEFF Research Database (Denmark)

    Feibert, Diana Cordes; Jacobsen, Peter

    2015-01-01

    Monitoring tasks and ascertaining quality of work is difficult in a logistical healthcare process due to cleaning personnel being dispersed throughout the hospital. Performance measurement can support the organization in improving the efficiency and effectiveness of processes and in ensuring quality of work. Data validity is essential for enabling performance measurement, and selecting the right technologies is important to achieve this. A case study of the hospital cleaning process was conducted at a public Danish hospital to develop a framework for assessing technologies in healthcare logistics. A set of decision indicators was identified in the case study to assess technologies based on expected process performance. Two aspects of performance measurement were investigated for the hospital cleaning process: what to measure and how to measure it.

  10. Two-stage process analysis using the process-based performance measurement framework and business process simulation

    NARCIS (Netherlands)

    Han, K.H.; Kang, J.G.; Song, M.S.

    2009-01-01

    Many enterprises have recently been pursuing process innovation or improvement to attain their performance goals. To align a business process with enterprise performances, this study proposes a two-stage process analysis for process (re)design that combines the process-based performance measurement

  11. Integrated durability process in product development

    International Nuclear Information System (INIS)

    Pompetzki, M.; Saadetian, H.

    2002-01-01

    This presentation describes the integrated durability process in product development. Each of the major components of the integrated process is described, along with a number of examples of how integrated durability assessment has been used in the ground vehicle industry. The durability process starts with the acquisition of loading information, either physically through loads measurement or virtually through multibody dynamics. The loading information is then processed and characterized for further analysis. Durability assessment was historically test based and completed through field or laboratory evaluation. Today, it is common that both the test and CAE environments are used together in durability assessment. Test-based durability assessment is used for final design sign-off, but is also critically important for correlating CAE models in order to investigate design alternatives. There is also a major initiative today to integrate the individual components into a process, by linking applications and providing a framework to communicate information as well as manage all the data involved in the entire process. Although a single process is presented, the details of the process can vary significantly for different products and applications. Recent applications that highlight different parts of the durability process are given, along with an example of how integration of software tools between different disciplines (MBD, FE and fatigue) not only simplifies the process, but also significantly improves it. (author)

  12. Linguistic measures of the referential process in psychodynamic treatment: the English and Italian versions.

    Science.gov (United States)

    Mariani, Rachele; Maskit, Bernard; Bucci, Wilma; De Coro, Alessandra

    2013-01-01

    The referential process is defined in the context of Bucci's multiple code theory as the process by which nonverbal experience is connected to language. The English computerized measures of the referential process, which have been applied in psychotherapy research, include the Weighted Referential Activity Dictionary (WRAD), and measures of Reflection, Affect and Disfluency. This paper presents the development of the Italian version of the IWRAD by modeling Italian texts scored by judges, and shows the application of the IWRAD and other Italian measures in three psychodynamic treatments evaluated for personality change using the Shedler-Westen Assessment Procedure (SWAP-200). Clinical predictions based on applications of the English measures were supported.

  13. Digital signal processing for velocity measurements in dynamical material's behaviour studies

    International Nuclear Information System (INIS)

    Devlaminck, Julien; Luc, Jerome; Chanal, Pierre-Yves

    2014-01-01

    In this work, we describe different configurations of optical fiber interferometers (Michelson and Mach-Zehnder types) used to measure velocities in studies of the dynamic behaviour of materials. We detail the processing algorithms developed and optimized to improve the performance of these interferometers, especially in terms of time and frequency resolution. Three methods for analysing the interferometric signals were studied. For Michelson interferometers, time-frequency analysis of the signals by Short-Time Fourier Transform (STFT) is compared with time-frequency analysis by Continuous Wavelet Transform (CWT). The results show that the CWT is more suitable than the STFT for signals with a low signal-to-noise ratio and for regions of low velocity and high acceleration. For Mach-Zehnder interferometers, the measurement is carried out by analysing the phase shift between three interferometric signals (triature processing). These three digital signal processing methods were evaluated, their measurement uncertainties estimated, and their restrictions or operational limitations specified from experiments performed on a pulsed power machine. (authors)
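    In a Michelson-type velocimetry record the beat frequency is proportional to the surface velocity, so the STFT analysis mentioned above amounts to tracking the dominant frequency of a spectrogram and rescaling it. The sketch below illustrates that idea on a synthetic signal; the laser wavelength, sampling rate, window sizes, and the signal itself are assumptions, not the authors' data or code.

```python
import numpy as np
from scipy.signal import spectrogram

# Illustrative parameters (not from the paper): 1550 nm laser, 10 GS/s record.
WAVELENGTH = 1.55e-6       # m
FS = 10e9                  # samples per second

# Synthetic beat signal whose frequency ramps as the surface accelerates.
t = np.arange(0, 2e-6, 1.0 / FS)
velocity_true = 500.0 * (t / t[-1])               # m/s, linear ramp to 500 m/s
beat_freq = 2.0 * velocity_true / WAVELENGTH      # Hz
phase = 2.0 * np.pi * np.cumsum(beat_freq) / FS   # discrete integration of frequency
signal = np.cos(phase) + 0.1 * np.random.default_rng(2).normal(size=t.size)

# Short-Time Fourier Transform: track the dominant frequency in each time slice.
f, _, Sxx = spectrogram(signal, fs=FS, nperseg=1024, noverlap=768)
peak_freq = f[np.argmax(Sxx, axis=0)]
velocity_est = 0.5 * WAVELENGTH * peak_freq       # invert f = 2 v / lambda

print(f"final estimated velocity: {velocity_est[-1]:.0f} m/s")
```

    The window length sets the trade-off between time and frequency resolution, which is exactly the limitation that motivates the CWT comparison in the paper.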

  14. Instrument maintenance of ultrasonic influences parameters measurement in technological processes

    Directory of Open Access Journals (Sweden)

    Tomal V. S.

    2008-04-01

    Full Text Available Contact and non-contact vibration meters for intermittent and continuous control of the vibration amplitude in ultrasonic technological equipment have been developed. In order to estimate the cavitation intensity in liquids, the authors have also developed cavitation activity indicators and cavitation sensitivity meters, allowing measurement of the signal level in the range of maximum spectral density of the cavitation noise. The developed instruments make it possible to improve the quality of products, and to reduce the defect rate and the power consumption of equipment by maintaining optimum conditions of the process.

  15. The processes of strategy development

    OpenAIRE

    Bailey, Andy; Johnson, Gerry

    1995-01-01

    This paper is concerned with the processes by which strategy is developed within organisations. It builds on research into the nature of strategy development being undertaken within the Centre for Strategic Management and Organisational Change at Cranfield School of Management. Initially the process of strategy development is discussed, a number of explanations of the process are presented and an integrated framework is developed. This framework is subsequently used to illustra...

  16. Developing Elementary Math and Science Process Skills Through Engineering Design Instruction

    Science.gov (United States)

    Strong, Matthew G.

    This paper examines how elementary students can develop math and science process skills through an engineering design approach to instruction. The performance and development of individual process skills overall and by gender were also examined. The study, preceded by a pilot, took place in a grade four extracurricular engineering design program in a public, suburban school district. Students worked in pairs and small groups to design and construct airplane models from styrofoam, paper clips, and toothpicks. The development and performance of process skills were assessed through a student survey of learning gains, an engineering design packet rubric (student work), observation field notes, and focus group notes. The results indicate that students can significantly develop process skills, that female students may develop process skills through engineering design better than male students, and that engineering design is most helpful for developing the measuring, suggesting improvements, and observing process skills. The study suggests that a more regular engineering design program or curriculum could be beneficial for students' math and science abilities both in this school and for the elementary field as a whole.

  17. A content review of cognitive process measures used in pain research within adult populations.

    Science.gov (United States)

    Day, M A; Lang, C P; Newton-John, T R O; Ehde, D M; Jensen, M P

    2017-01-01

    Previous research suggests that measures of cognitive process may be confounded by the inclusion of items that also assess cognitive content. The primary aims of this content review were to: (1) identify the domains of cognitive processes assessed by measures used in pain research; and (2) determine if pain-specific cognitive process measures with adequate psychometric properties exist. PsychInfo, CINAHL, PsycArticles, MEDLINE, and Academic Search Complete databases were searched to identify the measures of cognitive process used in pain research. Identified measures were double coded and the measure's items were rated as: (1) cognitive content; (2) cognitive process; (3) behavioural/social; and/or (4) emotional coping/responses to pain. A total of 319 scales were identified; of these, 29 were coded as providing an un-confounded assessment of cognitive process, and 12 were pain-specific. The cognitive process domains assessed in these measures are Absorption, Dissociation, Reappraisal, Distraction/Suppression, Acceptance, Rumination, Non-Judgment, and Enhancement. Pain-specific, un-confounded measures were identified for: Dissociation, Reappraisal, Distraction/Suppression, and Acceptance. Psychometric properties of all 319 scales are reported in supplementary material. To understand the importance of cognitive processes in influencing pain outcomes as well as explaining the efficacy of pain treatments, valid and pain-specific cognitive process measures that are not confounded with non-process domains (e.g., cognitive content) are needed. The findings of this content review suggest that future research focused on developing cognitive process measures is critical in order to advance our understanding of the mechanisms that underlie effective pain treatment. Many cognitive process measures used in pain research contain a 'mix' of items that assess cognitive process, cognitive content, and behavioural/emotional responses. Databases searched: PsychInfo, CINAHL, Psyc

  18. Material accountancy measurement techniques in dry-powdered processing of nuclear spent fuels

    International Nuclear Information System (INIS)

    Wolf, S. F.

    1999-01-01

    The paper addresses the development of inductively coupled plasma-mass spectrometry (ICPMS), thermal ionization mass spectrometry (TIMS), alpha-spectrometry, and gamma-spectrometry techniques for in-line analysis of highly irradiated (18 to 64 GWD/T) PWR spent fuels in a dry-powdered processing cycle. The dry-powdered technique for direct elemental and isotopic accountancy assay measurements was implemented without the need for separation of the plutonium, uranium and fission product elements in the bulk powdered process. The analyses allow the determination of fuel burn-up based on the isotopic composition of neodymium and/or cesium. An objective of the program is to develop the ICPMS method for direct fissile nuclear materials accountancy in the dry-powdered processing of spent fuel. The ICPMS measurement system may be applied to the KAERI DUPIC (direct use of spent PWR fuel in CANDU reactors) experiment, and in a near-real-time mode for international safeguards verification and non-proliferation policy concerns.

  19. Material Control and Accountability Measurements for FB-Line Processes

    International Nuclear Information System (INIS)

    Casella, V.R.

    2002-01-01

    This report provides an overview of FB-Line processes and nuclear material accountability measurements. Flow diagrams for the product, waste, and packaging and stabilization processes are given along with the accountability measurements done before and after each of these processes. Brief descriptions of these measurements are provided. This information provides a better understanding of the general FB-Line processes and how MC and A measurements are used to keep track of the accountable material inventory

  20. Measuring the three process segments of a customer's service experience for an out-patient surgery center.

    Science.gov (United States)

    Wicks, Angela M; Chin, Wynne W

    2008-01-01

    The purpose of this research is to develop an alternative method of measuring out-patient satisfaction where satisfaction is the central construct. The Gap Model operationalized by SERVQUAL is widely used to measure service quality. However, the SERVQUAL instrument only measures expectations (resulting from the pre-process segment of the service experience) and perceptions (resulting from the post-process segment). All three segments should be measured. The lack of proper segmentation and methodological criticisms in the literature motivated this study. A partial least squares (PLS) approach, a form of structural equation modeling, is used to develop a framework to evaluate patient satisfaction in three service process segments: pre-process, process, and post-process service experiences. Results indicate that each process stage mediates subsequent stages, that the process segment is the most important to the patient and that the antecedents have differing impacts on patient satisfaction depending where in the process the antecedent is evaluated. Only one out-patient surgery center was evaluated. Patient satisfaction criteria specific to hospital selection are not included in this study. Results indicate what is important to patients in each service process segment that focus where ambulatory surgery centers should allocate resources. This study is the first to evaluate patient satisfaction with all three process segments.

  1. Ground robotic measurement of aeolian processes

    Science.gov (United States)

    Models of aeolian processes rely on accurate measurements of the rates of sediment transport by wind, and careful evaluation of the environmental controls of these processes. Existing field approaches typically require intensive, event-based experiments involving dense arrays of instruments. These d...

  2. Dry process fuel performance technology development

    International Nuclear Information System (INIS)

    Kang, Kweon Ho; Kim, K. W.; Kim, B. K.

    2006-06-01

    The objective of the project is to establish the performance evaluation system of DUPIC fuel during the Phase III R and D. In order to fulfil this objectives, property model development of DUPIC fuel and irradiation test was carried out in Hanaro using the instrumented rig. Also, the analysis on the in-reactor behavior analysis of DUPIC fuel, out-pile test using simulated DUPIC fuel as well as performance and integrity assessment in a commercial reactor were performed during this Phase. The R and D results of the Phase III are summarized as follows: Fabrication process establishment of simulated DUPIC fuel for property measurement, Property model development for the DUPIC fuel, Performance evaluation of DUPIC fuel via irradiation test in Hanaro, Post irradiation examination of irradiated fuel and performance analysis, Development of DUPIC fuel performance code (KAOS)

  4. Principles of development of the industry of technogenic waste processing

    Directory of Open Access Journals (Sweden)

    Maria A. Bayeva

    2014-01-01

    Full Text Available Objective: to identify and substantiate the principles of development of the industry of technogenic waste processing. Methods: systemic analysis and synthesis, and the method of analogy. Results: based on the analysis of Russian and foreign experience in the field of waste management and environmental protection, the basic principles of development of activities on technogenic waste processing are formulated: the principle of legal regulation, the principle of efficient technologies, the principle of ecological safety, and the principle of economic support. The importance of each principle is substantiated by describing the situation in this area, identifying the main problems and the ways of solving them. Scientific novelty: the fundamental principles of development of the industry of industrial waste processing are revealed, and measures of state support are proposed. Practical value: the presented theoretical conclusions and proposals are aimed primarily at the theoretical and methodological substantiation and practical solution of modern problems in the sphere of development of the industry of technogenic waste processing.

  5. Measuring Perceptions of Engagement in Teamwork in Youth Development Programs

    Science.gov (United States)

    Cater, Melissa; Jones, Kimberly Y.

    2014-01-01

    The literature regarding teamwork has supported the idea that the key to improving team performance is to understand team processes. Early work within the realm of teamwork focused on quantifiable measures of team performance, like number of products developed. The measure of a successful team hinged on whether or not the team accomplished the end…

  6. Development of Voloxidation Process for Treatment of LWR Spent Fuel

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. J.; Jung, I. H.; Shin, J. M. (and others)

    2007-08-15

    The objective of the project is to develop a process which provides a means to recover fuel from the cladding and to simplify downstream processes by recovering volatile fission products. This work focuses on process development in three areas: the measurement and assessment of the release behavior of the volatile and semi-volatile fission products from the voloxidation process, the assessment of techniques to trap and recover gaseous fission products, and the development of process cycles to optimize fuel-cladding separation and fuel particle size. KAERI's high-temperature adsorption method was adopted in the co-design of the OTS for the hot experiment at INL. KAERI supplied 6 sets of filters for the hot experiment. Three hot experiments were carried out in the INL hot cell over two weeks starting on the 25th of November, with 4 KAERI staff participating. The results were promising; for example, the trapping efficiency of Cs was 95% and that of I was 99%, etc.

  7. PSE in Pharmaceutical Process Development

    DEFF Research Database (Denmark)

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John

    2011-01-01

    The pharmaceutical industry is under growing pressure to increase efficiency, both in production and in process development. This paper will discuss the use of Process Systems Engineering (PSE) methods in pharmaceutical process development, and searches for answers to questions such as: Which PSE...

  8. Improved process control through real-time measurement of mineral content

    Energy Technology Data Exchange (ETDEWEB)

    Turler, Daniel; Karaca, Murat; Davis, William B.; Giauque, Robert D.; Hopkins, Deborah

    2001-11-02

    In a highly collaborative research and development project with mining and university partners, sensors and data-analysis tools are being developed for rock-mass characterization and real-time measurement of mineral content. Determining mineralogy prior to mucking in an open-pit mine is important for routing the material to the appropriate processing stream. A possible alternative to laboratory assay of dust and cuttings obtained from drill holes is continuous on-line sampling and real-time x-ray fluorescence (XRF) spectroscopy. The results presented demonstrate that statistical analyses combined with XRF data can be employed to identify minerals and, possibly, different rock types. The objective is to create a detailed three-dimensional mineralogical map in real time that would improve downstream process efficiency.
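    The abstract does not specify the statistical method, but a minimal illustration of matching XRF readings against known mineral signatures is a nearest-centroid comparison of normalized elemental intensities. Everything in the sketch below, the element set, the reference signatures, and the test reading, is invented for illustration and is not assay data from the project.

```python
import numpy as np

# Hypothetical reference signatures: relative XRF intensities for Fe, Cu, S, Si.
# These numbers are invented for illustration, not real assay data.
REFERENCE = {
    "chalcopyrite-rich ore": np.array([0.30, 0.25, 0.35, 0.10]),
    "pyrite-rich ore":       np.array([0.45, 0.02, 0.48, 0.05]),
    "waste rock":            np.array([0.10, 0.01, 0.04, 0.85]),
}

def classify(reading: np.ndarray) -> str:
    """Assign a reading to the closest reference signature (Euclidean distance)."""
    reading = reading / reading.sum()                 # normalize to relative intensities
    distances = {name: float(np.linalg.norm(reading - sig))
                 for name, sig in REFERENCE.items()}
    return min(distances, key=distances.get)

sample = np.array([320.0, 210.0, 370.0, 100.0])       # raw counts from the sensor
print(classify(sample))                                # closest match is printed
```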

  9. Development of process diagnostic techniques for piping and equipment

    International Nuclear Information System (INIS)

    Yotsutsuji, Mitoshi

    1987-01-01

    What is required to use the facilities composing a plant for a long period without concern is to quantitatively grasp the present condition of the facilities and to take the necessary measures beforehand. For this purpose, diagnostic techniques that quickly and accurately detect the condition of facilities are necessary, and the development of process diagnostic techniques has been desired. The process diagnostic techniques mentioned here are those for diagnosing the contamination, clogging and performance of towers, tanks, heat exchangers and other equipment. In 1982, Idemitsu Engineering Co. developed a simplified diagnostic instrument for detecting the state of fouling in piping: a gamma-ray transmission diagnostic device named Scale Checker. By further improving it, process diagnostic techniques for piping and equipment were developed. In this report, the course of development and examination, the principle of detection, the constitution and the examination of remodeling of the Scale Checker are reported. As cases of process diagnosis in plant facilities, the diagnosis of clogging in process piping and the diagnosis of the performance of a distillation tower were carried out. The contents of the diagnosis and the results of those cases are explained. (Kako, I.)

  10. Performance Measure as Feedback Variable in Image Processing

    Directory of Open Access Journals (Sweden)

    Ristić Danijela

    2006-01-01

    Full Text Available This paper extends the view of image processing performance measures by presenting the use of such a measure as the actual value in a feedback structure. The idea is that the control loop built in this way drives the actual feedback value to a given set point. Since the performance measure depends explicitly on the application, the inclusion of feedback structures and the choice of appropriate feedback variables are presented using the example of optical character recognition in an industrial application. Metrics for quantifying performance at different image processing levels are discussed. The issues that those metrics should address, from both the image processing and the control point of view, are considered. The performance measures of the individual processing algorithms that form a character recognition system are determined with respect to the overall system performance.
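    The loop described above, a performance measure fed back to drive a processing parameter toward a set point, can be sketched generically. Below, a hypothetical recognition-confidence score adjusts a binarization threshold with a simple proportional update; the scoring function, gain, and set point are all assumptions and not the authors' system.

```python
# Generic sketch of a performance-measure feedback loop: a processing
# parameter (here a binarization threshold) is adjusted until a measured
# performance value reaches a set point.  The scoring function is a stand-in
# for a real OCR confidence measure.

def recognition_score(threshold: float) -> float:
    """Hypothetical performance measure peaking at threshold ~0.55."""
    return max(0.0, 1.0 - 4.0 * (threshold - 0.55) ** 2)

def tune_threshold(set_point: float = 0.9, gain: float = 0.1,
                   start: float = 0.2, max_iter: int = 200) -> float:
    threshold = start
    for _ in range(max_iter):
        score = recognition_score(threshold)     # actual value of the loop
        error = set_point - score                # deviation from the set point
        if abs(error) < 1e-3:
            break
        threshold += gain * error                # proportional correction
    return threshold

print(f"tuned threshold: {tune_threshold():.3f}")
```

    In a real system the score would come from running the recognition stage on the current image, so each loop iteration costs one processing pass.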

  11. The development and application of a coincidence measurement apparatus with micro-computer system

    International Nuclear Information System (INIS)

    Du Hongshan; Zhou Youpu; Gao Junlin; Qin Deming; Cao Yunzheng; Zhao Shiping

    1987-01-01

    A coincidence measurement apparatus with a micro-computer system has been developed. Automatic data acquisition and processing are achieved. Results of its application to radioactivity measurement are satisfactory.

  12. Modeling, Measurements, and Fundamental Database Development for Nonequilibrium Hypersonic Aerothermodynamics

    Science.gov (United States)

    Bose, Deepak

    2012-01-01

    The design of entry vehicles requires predictions of the aerothermal environment during the hypersonic phase of their flight trajectories. These predictions are made using computational fluid dynamics (CFD) codes that often rely on physics and chemistry models of nonequilibrium processes. The primary processes of interest are gas-phase chemistry, internal energy relaxation, electronic excitation, nonequilibrium emission and absorption of radiation, and gas-surface interaction leading to surface recession and catalytic recombination. NASA's Hypersonics Project is advancing the state of the art in modeling of nonequilibrium phenomena by making detailed spectroscopic measurements in shock tubes and arcjets, using ab-initio quantum mechanical techniques to develop fundamental chemistry and spectroscopic databases, making fundamental measurements of finite-rate gas-surface interactions, and implementing detailed mechanisms in state-of-the-art CFD codes. The development of new models is based on validation with relevant experiments. We will present the latest developments and a roadmap for the technical areas mentioned above.

  13. High-Throughput Process Development for Biopharmaceuticals.

    Science.gov (United States)

    Shukla, Abhinav A; Rameez, Shahid; Wolfe, Leslie S; Oien, Nathan

    2017-11-14

    The ability to conduct multiple experiments in parallel significantly reduces the time that it takes to develop a manufacturing process for a biopharmaceutical. This is particularly significant before clinical entry, because process development and manufacturing are on the "critical path" for a drug candidate to enter clinical development. High-throughput process development (HTPD) methodologies can be similarly impactful during late-stage development, both for developing the final commercial process as well as for process characterization and scale-down validation activities that form a key component of the licensure filing package. This review examines the current state of the art for HTPD methodologies as they apply to cell culture, downstream purification, and analytical techniques. In addition, we provide a vision of how HTPD activities across all of these spaces can integrate to create a rapid process development engine that can accelerate biopharmaceutical drug development.

  14. Parametric description of the quantum measurement process

    Science.gov (United States)

    Liuzzo-Scorpo, P.; Cuccoli, A.; Verrucchi, P.

    2015-08-01

    We present a description of the measurement process based on the parametric representation with environmental coherent states. This representation is specifically tailored for studying quantum systems whose environment needs to be considered through the quantum-to-classical crossover. Focusing upon projective measures, and exploiting the connection between large-N quantum theories and the classical limit of related ones, we manage to push our description beyond the pre-measurement step. This allows us to show that the outcome production follows from a global-symmetry breaking, entailing the observed system's state reduction, and that the statistical nature of the process is brought about, together with Born's rule, by the macroscopic character of the measuring apparatus.

  15. Scattering analysis of point processes and random measures

    International Nuclear Information System (INIS)

    Hanisch, K.H.

    1984-01-01

    In the present paper scattering analysis of point processes and random measures is studied. Known formulae which connect the scattering intensity with the pair distribution function of the studied structures are proved in a rigorous manner with tools of the theory of point processes and random measures. For some special fibre processes the scattering intensity is computed. For a class of random measures, namely for 'grain-germ-models', a new formula is proved which yields the pair distribution function of the 'grain-germ-model' in terms of the pair distribution function of the underlying point process (the 'germs') and of the mean structure factor and the mean squared structure factor of the particles (the 'grains'). (author)
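
    The formulae themselves are not reproduced in the record. For orientation only: for a stationary, isotropic point process of intensity \rho, the standard relation of this kind connects the structure factor S(q) with the pair distribution function g(r) as

      \[
        S(q) \;=\; 1 + \rho \int_{\mathbb{R}^3} \bigl(g(r)-1\bigr)\, e^{-i\,\mathbf{q}\cdot\mathbf{r}}\,\mathrm{d}\mathbf{r}
             \;=\; 1 + 4\pi\rho \int_0^{\infty} \bigl(g(r)-1\bigr)\,\frac{\sin(qr)}{qr}\,r^{2}\,\mathrm{d}r ;
      \]

    the fibre-process and grain-germ formulae proved in the paper generalize this relation and should be taken from the paper itself.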

  16. Adequate Measuring Technology and System of Fission Gas release Behavior from Voloxidation Process

    International Nuclear Information System (INIS)

    Park, Geun Il; Park, J. J.; Jung, I. H.; Shin, J. M.; Yang, M. S.; Song, K. C.

    2006-09-01

    Based on the published literature and an understanding of available hot cell technologies, more accurate measuring methods for each volatile fission product released from the voloxidation process were reviewed and selected. The conceptual design of an apparatus for measuring volatile and/or semi-volatile fission products released from spent fuel was prepared. It was identified that on-line measurement techniques can be applied for gamma-emitting fission products, and that off-line measurements such as chemical and/or neutron activation analysis can be applied for analyzing beta-emitting fission gases. Collection methods using appropriate materials or solutions were selected to measure the release fraction of beta-emitting gaseous fission products at the IMEF M6 hot cell. In particular, an on-line gamma-ray counting system for monitoring 85Kr and an off-line measuring system for 14C were established. An on-line measuring system for obtaining removal ratios of the semi-volatile fission products, mainly gamma-emitting fission products such as Cs, Ru etc., was also developed at the IMEF M6 hot cell, based on measuring the fuel inventory before and after the voloxidation test by gamma-ray measurement. The development of this measurement system may enable basic information to be obtained to support the design of the off-gas treatment system for the voloxidation process at INL, USA

  17. The "Test of Financial Literacy": Development and Measurement Characteristics

    Science.gov (United States)

    Walstad, William B.; Rebeck, Ken

    2017-01-01

    The "Test of Financial Literacy" (TFL) was created to measure the financial knowledge of high school students. Its content is based on the standards and benchmarks stated in the "National Standards for Financial Literacy" (Council for Economic Education 2013). The test development process involved extensive item writing and…

  18. A safeguards verification technique for solution homogeneity and volume measurements in process tanks

    International Nuclear Information System (INIS)

    Suda, S.; Franssen, F.

    1987-01-01

    A safeguards verification technique is being developed for determining whether process-liquid homogeneity has been achieved in process tanks and for authenticating volume-measurement algorithms involving temperature corrections. It is proposed that, in new designs for bulk-handling plants employing automated process lines, bubbler probes and thermocouples be installed at several heights in key accountability tanks. High-accuracy measurements of density using an electromanometer can now be made which match or even exceed analytical-laboratory accuracies. Together with regional determination of tank temperatures, these measurements provide density, liquid-column weight and temperature gradients over the fill range of the tank that can be used to ascertain when the tank solution has reached equilibrium. Temperature-correction algorithms can be authenticated by comparing the volumes obtained from the several bubbler-probe liquid-height measurements, each based on different amounts of liquid above and below the probe. The verification technique is based on the automated electromanometer system developed by Brookhaven National Laboratory (BNL). The IAEA has recently approved the purchase of a stainless-steel tank equipped with multiple bubbler and thermocouple probes for installation in its Bulk Calibration Laboratory at IAEA Headquarters, Vienna. The verification technique is scheduled for preliminary trials in late 1987
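
    As a purely illustrative sketch (not the BNL electromanometer software), the basic bubbler-probe arithmetic implied above can be written in a few lines: the differential pressure between two probes at a known vertical separation gives the solution density, and each probe pressure then gives the liquid height above that probe. The pressure readings below are hypothetical example values.

      G = 9.80665  # standard gravity, m/s^2

      def density_from_bubblers(p_lower_pa, p_upper_pa, probe_separation_m):
          """Density (kg/m^3) from the differential pressure across two bubbler probes."""
          return (p_lower_pa - p_upper_pa) / (G * probe_separation_m)

      def height_above_probe(p_probe_pa, density_kg_m3):
          """Liquid column height (m) above a probe, from its bubbler pressure."""
          return p_probe_pa / (density_kg_m3 * G)

      rho = density_from_bubblers(p_lower_pa=14120.0, p_upper_pa=8240.0,
                                  probe_separation_m=0.50)        # about 1199 kg/m^3
      print(rho, height_above_probe(14120.0, rho))                # density and height (m)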

  19. Porosity Measurements and Analysis for Metal Additive Manufacturing Process Control.

    Science.gov (United States)

    Slotwinski, John A; Garboczi, Edward J; Hebenstreit, Keith M

    2014-01-01

    Additive manufacturing techniques can produce complex, high-value metal parts, with potential applications as critical metal components such as those found in aerospace engines and as customized biomedical implants. Material porosity in these parts is undesirable for aerospace parts - since porosity could lead to premature failure - and desirable for some biomedical implants - since surface-breaking pores allow for better integration with biological tissue. Changes in a part's porosity during an additive manufacturing build may also be an indication of an undesired change in the build process. Here, we present efforts to develop an ultrasonic sensor for monitoring changes in the porosity in metal parts during fabrication on a metal powder bed fusion system. The development of well-characterized reference samples, measurements of the porosity of these samples with multiple techniques, and correlation of ultrasonic measurements with the degree of porosity are presented. A proposed sensor design, measurement strategy, and future experimental plans on a metal powder bed fusion system are also presented.

  20. Can we decide which outcomes should be measured in every clinical trial? A scoping review of the existing conceptual frameworks and processes to develop core outcome sets.

    Science.gov (United States)

    Idzerda, Leanne; Rader, Tamara; Tugwell, Peter; Boers, Maarten

    2014-05-01

    The usefulness of randomized controlled trials to advance clinical care depends upon the outcomes reported, but disagreement on the choice of outcome measures has resulted in inconsistency and the potential for reporting bias. One solution to this problem is the development of a core outcome set: a minimum set of outcome measures deemed critical for clinical decision making. Within rheumatology the Outcome Measures in Rheumatology (OMERACT) initiative has pioneered the development of core outcome sets since 1992. As the number of diseases addressed by OMERACT has increased and its experience in formulating core sets has grown, clarification and update of the conceptual framework and formulation of a more explicit process of area/domain core set development has become necessary. As part of the update process of the OMERACT Filter criteria to version 2, a literature review was undertaken to compare and contrast the OMERACT conceptual framework with others within and outside rheumatology. A scoping search was undertaken to examine the extent, range, and nature of conceptual frameworks for core set outcome selection in health. We searched the following resources: Cochrane Library Methods Group Register; Medline; Embase; PsycInfo; Environmental Studies and Policy Collection; and ABI/INFORM Global. We also conducted a targeted Google search. Five conceptual frameworks were identified: the WHO tripartite definition of health; the 5 Ds (discomfort, disability, drug toxicity, dollar cost, and death); the International Classification of Functioning (ICF); PROMIS (Patient-Reported Outcomes Measurement Information System); and the Outcomes Hierarchy. Of these, only the 5 Ds and ICF frameworks have been systematically applied in core set development. Outside the area of rheumatology, several core sets were identified; these had been developed through a limited range of consensus-based methods with varying degrees of methodological rigor. None applied a framework to ensure content validity of

  1. Implementation of Energy Strategies in Communities (Annex 63) Volume 2: Development of strategic measures

    DEFF Research Database (Denmark)

    Kellenberger, Daniel; Schmid, Christian; Quitzau, Maj-Britt

    This report describes the further development of the analysed measures from Volume 1 into strategic measures. As with the term measure, a strategic measure refers to an essential measure in concept that can be used to develop individual implementation strategies on a local level for part...... or the whole life cycle of a project (from the first vision to monitoring of the implemented solution). The developed strategic measures deal with the following topics: Setting Vision and Targets Developing Renewable Energy Strategies Making Full use of Legal Frameworks Designing an Urban Competition Processes...... a summary of each strategic measure supported by nine appendices, each a detailed description of each strategic measure....

  2. Performance measurement in healthcare: part II--state of the science findings by stage of the performance measurement process.

    Science.gov (United States)

    Adair, Carol E; Simpson, Elizabeth; Casebeer, Ann L; Birdsell, Judith M; Hayden, Katharine A; Lewis, Steven

    2006-07-01

    This paper summarizes findings of a comprehensive, systematic review of the peer-reviewed and grey literature on performance measurement according to each stage of the performance measurement process--conceptualization, selection and development, data collection, and reporting and use. It also outlines implications for practice. Six hundred sixty-four articles about organizational performance measurement from the health and business literature were reviewed after systematic searches of the literature, multi-rater relevancy ratings, citation checks and expert author nominations. Key themes were extracted and summarized from the most highly rated papers for each performance measurement stage. Despite a virtually universal consensus on the potential benefits of performance measurement, little evidence currently exists to guide practice in healthcare. Issues in conceptualizing systems include strategic alignment and scope. There are debates on the criteria for selecting measures and on the types and quality of measures. Implementation of data collection and analysis systems is complex and costly, and challenges persist in reporting results, preventing unintended effects and putting findings for improvement into action. There is a need for further development and refinement of performance measures and measurement systems, with a particular focus on strategies to ensure that performance measurement leads to healthcare improvement.

  3. Statistical metrology - measurement and modeling of variation for advanced process development and design rule generation

    International Nuclear Information System (INIS)

    Boning, Duane S.; Chung, James E.

    1998-01-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments involving design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal
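
    A minimal sketch of the decomposition step described above, under simplifying assumptions and with synthetic data: within-die ILD thickness measurements are split into a smooth systematic (position-dependent) component, fitted here as a second-order polynomial in die coordinates, and a random residual component.

      import numpy as np

      rng = np.random.default_rng(0)
      x, y = np.meshgrid(np.linspace(-1, 1, 20), np.linspace(-1, 1, 20))
      x, y = x.ravel(), y.ravel()
      thickness = 800 + 40 * x**2 - 25 * y + rng.normal(0, 5, x.size)   # nm, synthetic data

      # Least-squares fit of a quadratic spatial trend (the "systematic" component).
      A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
      coef, *_ = np.linalg.lstsq(A, thickness, rcond=None)
      systematic = A @ coef
      residual = thickness - systematic                                 # "random" component

      print("systematic (spatial) std: %.1f nm" % systematic.std())
      print("random (residual) std:    %.1f nm" % residual.std())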

  4. A method for automated processing of measurement information during mechanical drilling

    Energy Technology Data Exchange (ETDEWEB)

    Samonenko, V.I.; Belinkov, V.G.; Romanova, L.A.

    1984-01-01

    An algorithm is presented for a developed method for automated processing of measurement information during mechanical drilling. Its use under the operating conditions of an automated drilling control system (ASU) will make it possible to precisely identify changes in the lithology, in the physical, mechanical and abrasive properties, and in the stratum (pore) pressure of the rock being drilled during mechanical drilling, which, along with other methods for monitoring the drilling process, will increase the reliability of the decisions made.

  5. Invariant measures of mass migration processes

    Czech Academy of Sciences Publication Activity Database

    Fajfrová, Lucie; Gobron, T.; Saada, E.

    2016-01-01

    Vol. 21, No. 1 (2016), p. 1-52, article No. 60. ISSN 1083-6489 R&D Projects: GA ČR GAP201/12/2613; GA ČR(CZ) GA16-15238S Institutional support: RVO:67985556 Keywords: interacting particle systems * product invariant measures * zero range process * target process * mass migration process * condensation Subject RIV: BA - General Mathematics Impact factor: 0.904, year: 2016 http://library.utia.cas.cz/separaty/2016/SI/fajfrova-0464455.pdf

  6. The research of new type stratified water injection process intelligent measurement technology

    Science.gov (United States)

    Zhao, Xin

    2017-10-01

    To meet the needs of injection and development of the Daqing Oilfield, water injection has progressed from general water injection in the early stage to subdivided (stratified) injection, with the purpose of improving the utilization degree and the qualified rate of water injection and improving the performance of the water injection string and the matching process, providing a set of effective water injection supporting technologies suitable for the high water-cut stage. The new stratified water injection intelligent measurement technology combines multi-parameter testing and flow control into a unified whole and provides long-term automatic monitoring of the work of the various intervals. The process has the characteristics of "multi-layer synchronous measurement, continuous monitoring of process parameters, centralized data recording", which can meet the requirements of subdivided water injection while also realizing automatic synchronous measurement of each interval, greatly improving the measurement efficiency of tiered injection wells and providing a new means for tapping the remaining oil potential.

  7. Fidelity of test development process within a national science grant

    Science.gov (United States)

    Brumfield, Teresa E.

    In 2002, a math-science partnership (MSP) program was initiated by a national science grant. The purpose of the MSP program was to promote the development, implementation, and sustainability of promising partnerships among institutions of higher education, K-12 schools and school systems, as well as other important stakeholders. One of the funded projects included a teacher-scientist collaborative that instituted a professional development system to prepare teachers to use inquiry-based instructional modules. The MSP program mandated evaluations of its funded projects. One of the teacher-scientist collaborative project's outcomes specifically focused on teacher and student science content and process skills. In order to provide annual evidence of progress and to measure the impact of the project's efforts, and because no appropriate science tests were available to measure improvements in content knowledge of participating teachers and their students, the project contracted for the development of science tests. This dissertation focused on the process of test development within an evaluation and examined planned (i.e., expected) and actual (i.e., observed) test development, specifically concentrating on the factors that affected the actual test development process. Planned test development was defined as the process of creating tests according to the well-established test development procedures recommended by the AERA/APA/NCME 1999 Standards for Educational and Psychological Testing. Actual test development was defined as the process of creating tests as it actually took place. Because case study provides an in-depth, longitudinal examination of an event (i.e., case) in a naturalistic setting, it was selected as the appropriate methodology to examine the difference between planned and actual test development. The case (or unit of analysis) was the test development task, a task that was bounded by the context in which it occurred---and over which this researcher had

  8. Optical fiber sensors for process refractometry and temperature measuring based on curved fibers

    International Nuclear Information System (INIS)

    Willsch, R.; Schwotzer, G.; Haubenreisser, W.; Jahn, J.U.

    1986-01-01

    Based on U-shape curved multimode fibers with defined bending radii intensity-modulated optical sensors for the determination of refractive index changes in liquids and related measurands (solution concentration, mixing ratio and others) in process-refractometry and for temperature measuring under special environmental conditions have been developed. The optoelectronic transmitting and receiving units are performed in modular technique and can be used in multi-purpose applications. The principles, performance and characteristical properties of these sensors are described and their possibilities of application in process measuring and automation are discussed by some selected examples. (orig.) [de

  9. Optical fiber sensors for process refractometry and temperature measuring based on curved fibers

    Energy Technology Data Exchange (ETDEWEB)

    Willsch, R; Schwotzer, G; Haubenreisser, W; Jahn, J U

    1986-01-01

    Based on U-shape curved multimode fibers with defined bending radii intensity-modulated optical sensors for the determination of refractive index changes in liquids and related measurands (solution concentration, mixing ratio and others) in process-refractometry and for temperature measuring under special environmental conditions have been developed. The optoelectronic transmitting and receiving units are performed in modular technique and can be used in multi-purpose applications. The principles, performance and characteristical properties of these sensors are described and their possibilities of application in process measuring and automation are discussed by some selected examples.

  10. Reversing a Negative Measurement in Process with Negative Events: A Haunted Negative Measurement and the Bifurcation of Time

    CERN Document Server

    Snyder, D M

    2003-01-01

    Reversing an ordinary measurement in process (a haunted measurement) is noted and the steps involved in reversing a negative measurement in process (a haunted negative measurement) are described. In order to discuss in a thorough manner reversing an ordinary measurement in process, one has to account for how reversing a negative measurement in process would work for the same experimental setup. The reason it is necessary to know how a negative measurement in process is reversed is because for a given experimental setup there is no physical distinction between reversing a negative measurement in process and reversing an ordinary measurement in process. In the absence of the reversal of a negative measurement in process in the same experimental setup that supports the reversal of an ordinary measurement in process, the possibility exists of which-way information concerning the negative measurement that would render theoretically implausible reversing an ordinary measurement in process. The steps in reversing a n...

  11. Measuring and Calculative Complex for Registration of Quasi-Static and Dynamic Processes of Electromagnetic Irradiation

    Directory of Open Access Journals (Sweden)

    V. I. Ovchinnikov

    2007-01-01

    Full Text Available The paper is devoted to the development of a measuring device to register dynamic processes of electromagnetic irradiation during the treatment of materials with the energy of explosion. Standard units for registering the main parameters of the explosion do not allow the results of the process to be predicted and controlled. So, to overcome the disadvantages of former control units, a new one has been developed applying Hall sensors. The device developed allows effective registration of the inductive component of the electromagnetic irradiation over a wide temperature range for many short-time processes.

  12. Development and Design of a Flexible Measurement System for Offshore Wind Farm

    DEFF Research Database (Denmark)

    Kocewiak, Lukasz Hubert; Arana Aristi, Iván; Hjerrild, Jesper

    The development process of a flexible measurement system for multi-point, high-speed and long-term offshore data logging is described in this paper. This covers the complete design taking into account precise synchronisation, electromagnetic compatibility, software development and sensor...... calibration. The presented measurement set-up was tested in a rough offshore environment. Results from measurement campaigns at Avedøre and Gunfleet Sands offshore wind farms including synchronisation precision and accuracy, electromagnetic interference of power electronic devices are briefly presented....

  13. Development and Design of a Flexible Measurement System for Offshore Wind Farms

    DEFF Research Database (Denmark)

    Kocewiak, Lukasz Hubert; Arana Aristi, Ivan; Hjerrild, Jesper

    2011-01-01

    The development process of a flexible measurement system for multi-point, high-speed and long-term offshore data logging is described in this paper. This covers the complete design taking into account precise synchronisation, electromagnetic compatibility, software development and sensor...... calibration. The presented measurement set-up was tested in a rough offshore environment. Results from measurement campaigns at Avedøre and Gunfleet Sands offshore wind farms including synchronisation precision and accuracy, electromagnetic interference of power electronic devices are briefly presented....

  14. The development process and tendency of nuclear instruments applied in industry

    International Nuclear Information System (INIS)

    Ji Changsong

    2005-01-01

    The development process of nuclear technique application in industry may be divided into three stages: early stage--density, thickness and level measurement; middle stage--neutron moisture, ash content and X-ray fluorescence analysis; recent stage--container inspection and industrial CT, nuclear magnetic resonance, neutron capture and non-elastic collision analysis techniques. The development tendency of nuclear instruments applied in industry is toward spectrum measurement, detector arrays and imaging techniques, and nuclide analysis, with new kinds of nuclear detectors being widely adopted. (authors)

  15. Developing a performance measurement system for public research centres

    Directory of Open Access Journals (Sweden)

    Masella, C.

    2012-01-01

    Full Text Available This study aims at developing a performance measurement system (PMS) for research and development (R&D) activities carried out by public research centres. Public research institutions are characterized by multiple stakeholders with different needs, and the management of R&D activities requires balancing the multiple goals of different stakeholders. This characteristic is a key issue in the process of construction of the PMS. Empirical evidence is provided by an Italian public research centre, where the researchers carried out a project aimed at developing a PMS following action research principles. This project gave the researchers the possibility to interact with different stakeholders and integrate their different information needs into a comprehensive set of key performance indicators (KPIs). As a result, a multidimensional framework for measuring R&D performance in a public research centre is proposed and a set of key performance indicators is developed, suggesting implications for academics and practitioners.

  16. Achievement report for fiscal 1998. Photon-aided measuring and processing technologies (Research and development of advanced measuring and processing technologies for oil production system); 1998 nendo foton keisoku kako gijutsu seika hokokusho. Sekiyu seisan system kodo keisoku kako gijutsu kenkyu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    Research and development was conducted of photon-aided measuring and processing technologies and of the technology of generating photons, all of these usable in oil digging. In the research and development of high-reliability laser welding technology, it was found that reflection from the keyhole served as an indicator of any instability in laser welding, and this enabled the establishment of a method for observing defects in welded sections. In the research and development of technologies for measuring internal transmission of short-wavelength radiation and for photoelectron spectroscopy, high-efficiency continuous generation of hard X-rays was observed in the former while, in the latter, the soft X-ray source in a light source was evaluated. In addition, a photoelectron flight tube was fabricated and evaluated. In the research and development of fiber laser, it was decided to use a fiber structure type, and fiber laser was successfully oscillated for the first time in the world. In the research and development of high-performance laser diodes, a laser diode device was fabricated as the first step in this effort. A discussion was made about a high-efficiency combination of heat sink materials, lamination and assembly system, and fibers, which was aimed at achieving higher performance. (NEDO)

  17. Habituation of the orienting reflex and the development of Preliminary Process Theory.

    Science.gov (United States)

    Barry, Robert J

    2009-09-01

    The orienting reflex (OR), elicited by an innocuous stimulus, can be regarded as a model of the organism's interaction with its environment, and has been described as the unit of attentional processing. A major determinant of the OR is the novelty of the eliciting stimulus, generally operationalized in terms of its reduction with stimulus repetition, the effects of which are commonly described in habituation terms. This paper provides an overview of a research programme, spanning more than 30 years, investigating psychophysiological aspects of the OR in humans. The major complication in this research is that the numerous physiological measures used as dependent variables in the OR context fail to jointly covary with stimulus parameters. This has led to the development of the Preliminary Process Theory (PPT) of the OR to accommodate the complexity of the observed stimulus-response patterns. PPT is largely grounded in autonomic measures, and current work is attempting to integrate electroencephalographic measures, particularly components in the event-related brain potentials reflecting aspects of stimulus processing. The emphasis in the current presentation is on the use of the defining criteria of the habituation phenomenon, and Groves and Thompson's Dual-process Theory, in the development of PPT.

  18. Environmentally sustainable development measurement

    International Nuclear Information System (INIS)

    Correa Pinzon, Hector Jaime

    1996-01-01

    One of the most topical subjects in the national and international arena has to do with the environment and all the circumstances that surround it. Public accountants are involved directly or indirectly in environmental management, and the profession has a great incidence in many aspects of this topic. Environmental development has to do with several aspects such as inequality and poverty, the incalculable human resource, the environment itself, social, political and cultural aspects, and some indicators related to development itself. The existing proposals concerning environmental development remain no more than normalized indices intended to bring non-monetary elements of well-being into the guidance of development policy. Events such as environmental costs, environmental control, industrial processes, human resources and others of great importance have a continuous and permanent relationship with public accounting. For this reason environmental aspects have been analyzed, with the purpose of investigating what documentation and advances exist in other countries, in order to shed some light for those interested and thus to develop some hypotheses that can in turn become elements of technical-accounting integration. The measurements of national income and total national product give an extremely imperfect indication of well-being. Besides the well-known gaps in their coverage, such as unremunerated domestic work, at least another set of information is needed in order to reach a conclusive judgment about the tendencies of human well-being

  19. Developing a demand side management strategic framework through a collaborative process

    International Nuclear Information System (INIS)

    Kostler, J.

    1992-01-01

    Alberta Power Ltd. is developing a demand side management (DSM) strategic framework through a collaborative process that began in September 1991. The process is seen to have the advantages of involving customers in DSM issues, giving them the opportunity to determine the outcome, being less confrontational, and having the capability of arriving at solutions unattainable through other processes. Issues being considered in the collaborative process include cost effectiveness, externalities, screening and analyzing of DSM measures, cost allocation and recovery, DSM lost revenues, the utility role in DSM, measurement criteria, and incentives. The process includes day-long meetings of a 12-member collaborative group comprising representatives from Alberta Power, government agencies, industry and municipal associations, and environmental and consumer organizations. A professional facilitator and an expert consultant from outside Alberta Power were employed to support the collaborative process. The process is working well and is on track to present the utility with a strategic framework to deal with DSM

  20. A natural language processing pipeline for pairing measurements uniquely across free-text CT reports.

    Science.gov (United States)

    Sevenster, Merlijn; Bozeman, Jeffrey; Cowhy, Andrea; Trost, William

    2015-02-01

    To standardize and objectivize treatment response assessment in oncology, guidelines have been proposed that are driven by radiological measurements, which are typically communicated in free-text reports defying automated processing. We study, through inter-annotator agreement and natural language processing (NLP) algorithm development, the task of pairing measurements that quantify the same finding across consecutive radiology reports, such that each measurement is paired with at most one other ("partial uniqueness"). Ground truth is created based on 283 abdomen and 311 chest CT reports of 50 patients each. A pre-processing engine segments reports and extracts measurements. Thirteen features are developed based on volumetric similarity between measurements, semantic similarity between their respective narrative contexts, and structural properties of their report positions. A Random Forest classifier (RF) integrates all features. A "mutual best match" (MBM) post-processor ensures partial uniqueness. In an end-to-end evaluation, RF has precision 0.841, recall 0.807, F-measure 0.824 and AUC 0.971; with MBM, which performs above chance level, results improve further. High inter-annotator agreement (0.960) indicates that the task is well defined. Domain properties and inter-section differences are discussed to explain superior performance in abdomen. Enforcing partial uniqueness has mixed but minor effects on performance. A combined machine learning-filtering approach is proposed for pairing measurements, which can support prospective (supporting treatment response assessment) and retrospective purposes (data mining). Copyright © 2014 Elsevier Inc. All rights reserved.
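
    A minimal sketch of a "mutual best match" style post-processing step, assuming a matrix of classifier scores for all candidate pairings (the scores below are hypothetical): measurement i in the earlier report is paired with measurement j in the later report only if each is the other's highest-scoring candidate, which enforces partial uniqueness.

      import numpy as np

      def mutual_best_match(scores, min_score=0.5):
          """Return (i, j) pairs that are each other's best match and exceed min_score."""
          pairs = []
          best_j_for_i = scores.argmax(axis=1)   # best candidate in report 2 for each i
          best_i_for_j = scores.argmax(axis=0)   # best candidate in report 1 for each j
          for i, j in enumerate(best_j_for_i):
              if best_i_for_j[j] == i and scores[i, j] >= min_score:
                  pairs.append((i, j))
          return pairs

      scores = np.array([[0.9, 0.2, 0.1],        # hypothetical classifier probabilities
                         [0.3, 0.4, 0.8],
                         [0.2, 0.7, 0.6]])
      print(mutual_best_match(scores))           # [(0, 0), (1, 2), (2, 1)]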

  1. Process for the interference measurement of an object

    International Nuclear Information System (INIS)

    Bryngdahl, O.

    1977-01-01

    The invention concerns a process for the interference measurement of an object and has the task of indicating its phase-related properties with as high a resolution as possible. For this purpose a hologram is recorded of the object to be measured, with which an interference grating is produced. The holograph splits the coherent light coming from a laser into two part rays in a well-known way; one of these rays passes through the object and then, together with the other one, falls on the indicating element. There both parts of the ray are reunited, where the unification angle s of this arrangement is about 45°. After the hologram has been developed, it is placed in the interferometer. A parallel ray of coherent light passes through the hologram and produces two pictures, one orthoscopic and one pseudoscopic. These two pictures are combined, so that an interference pattern is created which reproduces the phase variations of the object with twice the resolution, as the angle between the two rays is twice as large as the unification angle s. Further processes are given which can multiply the angle, and therefore the resolution, by four and by six times. (ORU) [de

  2. Application of gamma-ray densitometry in developing primary upgrading processes

    International Nuclear Information System (INIS)

    Liu, D.D.S.; Patmore, D.J.

    1990-01-01

    Gamma-ray densitometry has been applied in developing processes for upgrading heavy oils, refinery residua, tar sand bitumen and coal into synthetic crudes. These processes normally operate at high temperatures and pressures, thus non-invasive monitors are highly desirable. Examples of applications at CANMET are given for the following three areas: gas-liquid and gas-liquid-solid multiphase flow hydrodynamic studies, monitoring of ash concentration and measurement of thermal expansion coefficient of liquids. (author)

  3. Measuring Down: Evaluating Digital Storytelling as a Process for Narrative Health Promotion.

    Science.gov (United States)

    Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah; DiFulvio, Gloria; Del Toro-Mejías, Lizbeth

    2016-05-15

    Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. We present key evaluation findings of a 2-year, mixed-methods study that focused on the effects of participating in the DST process on young Puerto Rican Latinas' self-esteem, social support, empowerment, and sexual attitudes and behaviors. Quantitative results did not show significant changes in the expected outcomes. However, in our qualitative findings we identified several ways in which the DST process had positive, health-bearing effects. We argue for the importance of "measuring down" to reflect the locally grounded, felt experiences of participants who engage in the process, as current quantitative scales do not "measure up" to accurately capture these effects. We end by suggesting the need to develop mixed-methods, culturally relevant, and sensitive evaluation tools that prioritize process effects as they inform intervention and health promotion. © The Author(s) 2016.

  4. Component-based development process and component lifecycle

    NARCIS (Netherlands)

    Crnkovic, I.; Chaudron, M.R.V.; Larsson, S.

    2006-01-01

    The process of component- and component-based system development differs in many significant ways from the "classical" development process of software systems. The main difference is in the separation of the development process of components from the development process of systems. This fact has a

  5. Process-oriented performance indicators for measuring ecodesign management practices

    DEFF Research Database (Denmark)

    Rodrigues, Vinicius Picanco; Pigosso, Daniela Cristina Antelmi; McAloone, Tim C.

    2016-01-01

    In order to support ecodesign performance measurement from a business perspective, this paper performs an exploration of available process-oriented indicators to be applied to ecodesign management practices. With the Ecodesign Maturity Model as a background framework, a systematic literature review...... coupled with a cross-content analysis was carried out to assign proper indicators to the practices. Results show that the currently available indicators do not fully reflect the characteristics of ecodesign and there is significant room for improving the development of tailor-made indicators....

  6. Development of simplified nuclear dry plate measuring system

    Energy Technology Data Exchange (ETDEWEB)

    Sato, Y; Ohta, I [Utsunomiya Univ. (Japan). Faculty of Education; Tezuka, I; Tezuka, T; Makino, K

    1981-08-01

    A simplified nuclear dry plate measuring system was developed. The system consists of a microscope, an ITV camera, a monitor TV, an XY tracker and a micro-computer. The signals of the images of tracks in a nuclear dry plate are sent to the XY tracker and shown on the monitor TV. The XY tracker displays a pointer on the monitor TV and produces an output signal of its XY coordinate. This output signal is analyzed by the micro-computer. The software for the measuring process is composed of a program system written in BASIC and machine language. The data intake, the expansion of the measurement range and the output of analyzed data are controlled by the program. The accuracy of the coordinate measurement was studied and was about 0.39 micrometer over a 10 micrometer distance.

  7. Real-time interferometric monitoring and measuring of photopolymerization based stereolithographic additive manufacturing process: sensor model and algorithm

    International Nuclear Information System (INIS)

    Zhao, X; Rosen, D W

    2017-01-01

    As additive manufacturing is poised for growth and innovations, it faces barriers of lack of in-process metrology and control to advance into wider industry applications. The exposure controlled projection lithography (ECPL) is a layerless mask-projection stereolithographic additive manufacturing process, in which parts are fabricated from photopolymers on a stationary transparent substrate. To improve the process accuracy with closed-loop control for ECPL, this paper develops an interferometric curing monitoring and measuring (ICM&M) method which addresses the sensor modeling and algorithms issues. A physical sensor model for ICM&M is derived based on interference optics utilizing the concept of instantaneous frequency. The associated calibration procedure is outlined for ICM&M measurement accuracy. To solve the sensor model, particularly in real time, an online evolutionary parameter estimation algorithm is developed adopting moving horizon exponentially weighted Fourier curve fitting and numerical integration. As a preliminary validation, simulated real-time measurement by offline analysis of a video of interferograms acquired in the ECPL process is presented. The agreement between the cured height estimated by ICM&M and that measured by microscope indicates that the measurement principle is promising as real-time metrology for global measurement and control of the ECPL process. (paper)
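
    As an illustrative sketch only (not the authors' algorithm, and omitting the optical constants needed to convert phase to cured height), the moving-horizon, exponentially weighted sinusoidal fit idea can be outlined as follows: over a window of interferogram intensity samples, a model C + A*cos(omega*t + p) is fitted by weighted least squares on a grid of candidate frequencies, and the best-fitting omega is taken as the instantaneous frequency. The signal below is synthetic.

      import numpy as np

      def instantaneous_frequency(t, intensity, freq_grid, decay=0.2):
          """Dominant angular frequency over the window, by exponentially weighted least squares."""
          w = np.exp(-decay * (t[-1] - t))                 # exponential forgetting weights
          sw = np.sqrt(w)
          best_freq, best_err = None, np.inf
          for omega in freq_grid:
              # For fixed omega, C + a*cos(omega*t) + b*sin(omega*t) is linear in (C, a, b).
              A = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
              coef, *_ = np.linalg.lstsq(A * sw[:, None], intensity * sw, rcond=None)
              err = np.sum(w * (intensity - A @ coef) ** 2)
              if err < best_err:
                  best_freq, best_err = omega, err
          return best_freq

      t = np.linspace(0.0, 2.0, 200)
      noise = np.random.default_rng(1).normal(0.0, 0.02, t.size)
      signal = 1.0 + 0.5 * np.cos(12.0 * t + 0.3) + noise          # synthetic intensity trace
      print(instantaneous_frequency(t, signal, np.linspace(5.0, 20.0, 151)))   # near 12 rad/s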

  8. Application of dielectric constant measurement in microwave sludge disintegration and wastewater purification processes.

    Science.gov (United States)

    Kovács, Petra Veszelovszki; Lemmer, Balázs; Keszthelyi-Szabó, Gábor; Hodúr, Cecilia; Beszédes, Sándor

    2018-05-01

    It has been repeatedly verified that microwave radiation can be advantageous as a pre-treatment for enhanced disintegration of sludge. Very few data related to the dielectric parameters of wastewater of different origins are available; therefore, the objective of our work was to measure the dielectric constant of municipal and meat industrial wastewater during a continuous-flow microwave process. Determination of the dielectric constant and its change during wastewater and sludge processing makes it possible to decide on the applicability of dielectric measurements for detecting the organic matter removal efficiency of a wastewater purification process or the disintegration degree of sludge. By measuring the dielectric constant as a function of temperature, total solids (TS) content and microwave-specific process parameters, regression models were developed. Our results verified that in the case of municipal wastewater sludge, the TS content has a significant effect on the dielectric constant and disintegration degree (DD), as does the temperature. The dielectric constant has a decreasing tendency with increasing temperature for wastewater sludge of low TS content, but an adverse effect was found for samples with high TS and organic matter contents. The DD of meat processing wastewater sludge was influenced significantly by the volumetric flow rate and power level, as process parameters of continuous-flow microwave pre-treatments. It can be concluded that the disintegration process of food industry sludge can be detected by dielectric constant measurements. For technical purposes, the applicability of dielectric measurements was also tested in the purification process of municipal wastewater. Determination of dielectric behaviour was a sensitive method to detect the purification degree of municipal wastewater.
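
    The regression models themselves are not given in the record. As an illustrative sketch with synthetic data (not the measured values or fitted coefficients), a model of this general kind, dielectric constant as a linear function of temperature and TS content with an interaction term, can be fitted by ordinary least squares:

      import numpy as np

      rng = np.random.default_rng(3)
      temp = rng.uniform(20, 90, 60)                 # temperature, degC
      ts = rng.uniform(2, 20, 60)                    # total solids content, %
      eps = 78 - 0.25 * temp + 0.4 * ts + 0.01 * temp * ts + rng.normal(0, 0.5, 60)

      X = np.column_stack([np.ones_like(temp), temp, ts, temp * ts])
      coef, *_ = np.linalg.lstsq(X, eps, rcond=None)
      print("intercept, b_T, b_TS, b_TxTS =", np.round(coef, 3))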

  9. Fiscal 1999 research report. Research on photonic measurement and processing technology (Development of high-efficiency production process technology); 1999 nendo foton keisoku kako gijutsu seika hokokusho. Kokoritsu seisan process gijutsu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    This report summarizes the fiscal 1999 research result on R and D of laser processing technology, in-situ measurement technology, and generation and control technology of photon as laser beam source, for energy saving and efficiency improvement of energy-consumptive production processes such as welding, jointing, surface treatment and fine particle fabrication. The research was carried out by a technical center, 9 companies and a university as contract research. The research themes are as follows: (1) Processing technology: simulation technology for laser welding phenomena, synthesis technology for quantum dot functional structures, and fabrication technology for functional composite materials, (2) In-situ measurement technology: fine particle element and size measurement technology, (3) All-solid-state laser technology: efficient rod type LD-pumping laser module, pumping chamber of slab type laser, improvement of E/O efficiency of laser diode, high-quality nonlinear crystal growth technology, fabrication technology for nonlinear crystals, and high-efficiency harmonic generation technology. Comprehensive survey was also made on high-efficiency photon generation technologies. (NEDO)

  10. Development of fast measurements of concentration of NORM U-238 by HPGe

    Science.gov (United States)

    Cha, Seokki; Kim, Siu; Kim, Geehyun

    2017-02-01

    Naturally Occurring Radioactive Material (NORM), generated at the origin of the earth, can be found all around us, and even people who are not engaged in radiation-related work are exposed to unnecessary radiation. NORM poses a potential risk if it is concentrated or transformed by artificial activities. Accordingly, the development of fast measurement methods for NORM is needed to prevent radiation exposure of the general public and of workers in the related types of business who use materials in which NORM is concentrated or transformed. Against this background, many countries have tried to manage NORM and have enacted regulatory legislation. To efficiently manage NORM, there is a need to develop new measurement methods that quickly and accurately analyze the nuclides and their concentrations. In this study, the development of a fast and reliable measurement method was carried out. In addition to confirming the reliability of the fast measurement, we obtained results that suggest the possibility of developing further fast analytical measurement methods as follow-up work. The results of this study will be very useful for the regulatory system to manage NORM. In this study, two indirect measurement methods for NORM U-238 using HPGe were reviewed, based on the equilibrium relationship between parent and daughter nuclides in the U-238 decay chain. For comparison (in order to assess reliability), a direct measurement using an alpha spectrometer, which requires a complicated pre-processing procedure, was implemented.

  11. Development of balanced key performance indicators for emergency departments strategic dashboards following analytic hierarchical process.

    Science.gov (United States)

    Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh

    2014-01-01

    Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective balanced set of performance measures and key performance indicators (KPIs) is a main challenge in accomplishing this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following an analytic hierarchical process. The study was carried out in 2 phases: constructing ED performance measures based on balanced scorecard perspectives and incorporating them into an analytic hierarchical process framework to select the final KPIs. The respondents placed most importance on the ED internal processes perspective, especially on measures related to timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected as other top KPIs. Measures of care effectiveness and care safety were placed as the next priorities. The respondents placed least importance on disease-/condition-specific "time to" measures. The methodology can be presented as a reference model for the development of KPIs in various performance-related areas based on a consistent and fair approach. Dashboards that are designed based on such a balanced set of KPIs will help to establish comprehensive performance measurements and fair benchmarks and comparisons.
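
    A generic sketch of the analytic hierarchy process arithmetic implied above (the comparison values are hypothetical, not the study's survey data): priority weights for a set of perspectives are obtained as the principal eigenvector of a pairwise-comparison matrix, and the consistency ratio checks whether the judgements are acceptably consistent (CR < 0.10 is the usual rule of thumb).

      import numpy as np

      # Hypothetical pairwise comparisons of 4 balanced-scorecard perspectives (Saaty 1-9 scale).
      A = np.array([[1.0, 3.0, 5.0, 3.0],
                    [1/3, 1.0, 3.0, 1.0],
                    [1/5, 1/3, 1.0, 1/3],
                    [1/3, 1.0, 3.0, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                       # priority weights, sum to 1

      n = A.shape[0]
      CI = (eigvals.real[k] - n) / (n - 1)           # consistency index
      RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # random index (Saaty)
      print("weights:", np.round(weights, 3), "CR = %.3f" % (CI / RI))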

  12. 3D physical modeling for patterning process development

    Science.gov (United States)

    Sarma, Chandra; Abdo, Amr; Bailey, Todd; Conley, Will; Dunn, Derren; Marokkey, Sajan; Talbi, Mohamed

    2010-03-01

    In this paper we will demonstrate how a 3D physical patterning model can act as a forensic tool for OPC and ground-rule development. We discuss examples where 2D modeling shows no issues in printing gate lines but 3D modeling shows severe resist loss in the middle. In the absence of corrective measures, there is a high likelihood of line discontinuity post etch. Such early insight into the process limitations of prospective ground rules can be invaluable for early technology development. We will also demonstrate how the root cause of a broken poly line after etch could be traced to resist necking in the region of the STI step with the help of 3D models. We discuss different cases of metal and contact layouts where 3D modeling gives an early insight into technology limitations. In addition, such a 3D physical model could be used for early resist evaluation and selection for required ground-rule challenges, which can substantially reduce the cycle time for process development.

  13. Development and Validation of an Index to Measure the Quality of Facility-Based Labor and Delivery Care Processes in Sub-Saharan Africa.

    Directory of Open Access Journals (Sweden)

    Vandana Tripathi

    Full Text Available High quality care is crucial in ensuring that women and newborns receive interventions that may prevent and treat birth-related complications. As facility deliveries increase in developing countries, there are concerns about service quality. Observation is the gold standard for clinical quality assessment, but existing observation-based measures of obstetric quality of care are lengthy and difficult to administer. There is a lack of consensus on quality indicators for routine intrapartum and immediate postpartum care, including essential newborn care. This study identified key dimensions of the quality of the process of intrapartum and immediate postpartum care (QoPIIPC) in facility deliveries and developed a quality assessment measure representing these dimensions. Global maternal and neonatal care experts identified key dimensions of QoPIIPC through a modified Delphi process. Experts also rated indicators of these dimensions from a comprehensive delivery observation checklist used in quality surveys in sub-Saharan African countries. Potential QoPIIPC indices were developed from combinations of highly-rated indicators. Face, content, and criterion validation of these indices was conducted using data from observations of 1,145 deliveries in Kenya, Madagascar, and Tanzania (including Zanzibar). A best-performing index was selected, composed of 20 indicators of intrapartum/immediate postpartum care, including essential newborn care. This index represented most dimensions of QoPIIPC and effectively discriminated between poorly and well-performed deliveries. As facility deliveries increase and the global community pays greater attention to the role of care quality in achieving further maternal and newborn mortality reduction, the QoPIIPC index may be a valuable measure. This index complements and addresses gaps in currently used quality assessment tools. Further evaluation of index usability and reliability is needed. The availability of a streamlined

  14. Measurement of Learning Process by Semantic Annotation Technique on Bloom's Taxonomy Vocabulary

    Science.gov (United States)

    Yanchinda, Jirawit; Yodmongkol, Pitipong; Chakpitak, Nopasit

    2016-01-01

    Most rural people, whose highest level of education is elementary schooling, lack an understanding of science and technology knowledge, so appropriate technology knowledge is not successfully transferred for rural sustainable development. This study provides a measurement of the learning process by a semantic annotation technique on Bloom's Taxonomy…

  15. Mechanistic Models for Process Development and Optimization of Fed-batch Fermentation Systems

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads O.

    2016-01-01

    This work discusses the application of mechanistic models to pilot scale filamentous fungal fermentation systems operated at Novozymes A/S. For on-line applications, a state estimator model is developed based on a stoichiometric balance in order to predict the biomass and product concentration....... This is based on on-line gas measurements and ammonia addition flow rate measurements. Additionally, a mechanistic model is applied offline as a tool for batch planning, based on definition of the process back pressure, aeration rate and stirrer speed. This allows the batch starting fill to be planned, taking...... into account the oxygen transfer conditions, as well as the evaporation rates of the system. Mechanistic models are valuable tools which are applicable for both process development and optimization. The state estimator described will be a valuable tool for future work as part of control strategy development...
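
    A minimal sketch of a stoichiometric state estimator of the general kind described above (not the Novozymes model; all coefficients and signals are hypothetical illustration values). The simplifying assumptions are that nitrogen from the ammonia feed ends up in biomass of known nitrogen content, and that carbon from the consumed substrate is split between biomass, product and the CO2 measured in the off-gas.

      def estimate_states(nh3_mol, co2_mol, substrate_cmol,
                          n_per_x=0.2, c_per_x=1.0, c_per_p=1.0):
          """Cumulative biomass and product (C-mol) from cumulative NH3, CO2 and substrate."""
          biomass = nh3_mol / n_per_x               # N balance: fed nitrogen goes to biomass
          product = (substrate_cmol                 # C balance: substrate carbon ...
                     - c_per_x * biomass            # ... minus carbon fixed in biomass
                     - co2_mol) / c_per_p           # ... minus carbon lost as CO2
          return biomass, product

      print(estimate_states(nh3_mol=12.0, co2_mol=30.0, substrate_cmol=100.0))   # (60.0, 10.0)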

  16. Temporal response methods for dynamic measurement of in-process inventory of dissolved nuclear materials

    International Nuclear Information System (INIS)

    Zivi, S.M.; Seefeldt, W.B.

    1976-01-01

    This analysis demonstrated that a plant's temporal response to perturbations of feed isotope composition can be used to measure the in-process inventory, without suspending plant operations. The main advantages of the temporal response technique over the step-displacement method are (1) it obviates the need for large special feed batches and (2) it obviates the requirement that all the in-process material have a uniform isotopic composition at the beginning of the measurement. The temporal response method holds promise for essentially continuous real-time determination of in-process SNM. The main disadvantage or problem with the temporal response method is that it requires the measurement of the isotopic composition of a great many samples to moderately high accuracy. This requirement appears amenable to solution by a modest effort in instrument development

  17. Behavioral similarity measurement based on image processing for robots that use imitative learning

    Science.gov (United States)

    Sterpin B., Dante G.; Martinez S., Fernando; Jacinto G., Edwar

    2017-02-01

    In the field of artificial societies, particularly those based on memetics, imitative behavior is essential for the development of cultural evolution. Applying this concept to robotics, through imitative learning a robot can acquire behavioral patterns from another robot. Assuming that the learning process must have an instructor and at least one apprentice, obtaining a quantitative measurement of their behavioral similarity would be potentially useful, especially in artificial social systems focused on cultural evolution. In this paper the motor behavior of both kinds of robots, for two simple tasks, is represented by 2D binary images, which are processed in order to measure their behavioral similarity. The results shown here were obtained by comparing several similarity measurement methods for binary images.
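
    One simple way to quantify the behavioral similarity described above, assuming the instructor's and apprentice's motor behavior have already been rendered as 2D binary images of equal size, is the Jaccard index (intersection over union) of the "on" pixels; the toy trajectories below are illustrative only and not necessarily one of the measures compared in the paper.

      import numpy as np

      def jaccard_similarity(img_a, img_b):
          """Intersection-over-union of two boolean images of the same shape, in [0, 1]."""
          a, b = img_a.astype(bool), img_b.astype(bool)
          union = np.logical_or(a, b).sum()
          return 1.0 if union == 0 else np.logical_and(a, b).sum() / union

      instructor = np.zeros((50, 50), dtype=bool)
      apprentice = np.zeros((50, 50), dtype=bool)
      instructor[25, 5:45] = True            # instructor's path: horizontal line, 40 pixels
      apprentice[25, 10:40] = True           # apprentice's path: shorter overlapping line
      print(jaccard_similarity(instructor, apprentice))   # 30 shared / 40 total = 0.75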

  18. Biochemical Process Development and Integration | Bioenergy | NREL

    Science.gov (United States)

    Our conversion and separation processes extend to pilot-scale integrated process development and scale-up. Publications: Accounting for all sugar produced during integrated production of ethanol from lignocellulosic…

  19. Process control program development

    International Nuclear Information System (INIS)

    Dameron, H.J.

    1985-01-01

    This paper details the development and implementation of a ''Process Control Program'' at Duke Power's three nuclear stations - Oconee, McGuire, and Catawba. Each station is required by Technical Specification to have a ''Process Control Program'' (PCP) to control all dewatering and/or solidification activities for radioactive wastes

  20. Business process performance measurement: a structured literature review of indicators, measures and metrics.

    Science.gov (United States)

    Van Looy, Amy; Shafagatova, Aygun

    2016-01-01

    Measuring the performance of business processes has become a central issue in both academia and business, since organizations are challenged to achieve effective and efficient results. Applying performance measurement models to this purpose ensures alignment with a business strategy, which implies that the choice of performance indicators is organization-dependent. Nonetheless, such measurement models generally suffer from a lack of guidance regarding the performance indicators that exist and how they can be concretized in practice. To fill this gap, we conducted a structured literature review to find patterns or trends in the research on business process performance measurement. The study also documents an extended list of 140 process-related performance indicators in a systematic manner by further categorizing them into 11 performance perspectives in order to gain a holistic view. Managers and scholars can consult the provided list to choose the indicators that are of interest to them, considering each perspective. The structured literature review concludes with avenues for further research.

  1. The effects of mild and severe traumatic brain injury on speed of information processing as measured by the computerized tests of information processing (CTIP).

    Science.gov (United States)

    Tombaugh, Tom N; Rees, Laura; Stormer, Peter; Harrison, Allyson G; Smith, Andra

    2007-01-01

    In spite of the fact that reaction time (RT) measures are sensitive to the effects of traumatic brain injury (TBI), few RT procedures have been developed for use in standard clinical evaluations. The computerized test of information processing (CTIP) [Tombaugh, T. N., & Rees, L. (2000). Manual for the computerized tests of information processing (CTIP). Ottawa, Ont.: Carleton University] was designed to measure the degree to which TBI decreases the speed at which information is processed. The CTIP consists of three computerized programs that progressively increase the amount of information that is processed. Results of the current study demonstrated that RT increased as the difficulty of the CTIP tests increased (known as the complexity effect), and as severity of injury increased (from mild to severe TBI). The current study also demonstrated the importance of selecting a non-biased measure of variability. Overall, findings suggest that the CTIP is an easy to administer and sensitive measure of information processing speed.

  2. Ground robotic measurement of aeolian processes

    Science.gov (United States)

    Qian, Feifei; Jerolmack, Douglas; Lancaster, Nicholas; Nikolich, George; Reverdy, Paul; Roberts, Sonia; Shipley, Thomas; Van Pelt, R. Scott; Zobeck, Ted M.; Koditschek, Daniel E.

    2017-08-01

    Models of aeolian processes rely on accurate measurements of the rates of sediment transport by wind, and careful evaluation of the environmental controls of these processes. Existing field approaches typically require intensive, event-based experiments involving dense arrays of instruments. These devices are often cumbersome and logistically difficult to set up and maintain, especially near steep or vegetated dune surfaces. Significant advances in instrumentation are needed to provide the datasets that are required to validate and improve mechanistic models of aeolian sediment transport. Recent advances in robotics show great promise for assisting and amplifying scientists' efforts to increase the spatial and temporal resolution of many environmental measurements governing sediment transport. The emergence of cheap, agile, human-scale robotic platforms endowed with increasingly sophisticated sensor and motor suites opens up the prospect of deploying programmable, reactive sensor payloads across complex terrain in the service of aeolian science. This paper surveys the need and assesses the opportunities and challenges for amassing novel, highly resolved spatiotemporal datasets for aeolian research using partially-automated ground mobility. We review the limitations of existing measurement approaches for aeolian processes, and discuss how they may be transformed by ground-based robotic platforms, using examples from our initial field experiments. We then review how the need to traverse challenging aeolian terrains and simultaneously make high-resolution measurements of critical variables requires enhanced robotic capability. Finally, we conclude with a look to the future, in which robotic platforms may operate with increasing autonomy in harsh conditions. Besides expanding the completeness of terrestrial datasets, bringing ground-based robots to the aeolian research community may lead to unexpected discoveries that generate new hypotheses to expand the science

  3. Development of reconfigurable analog and digital circuits for plasma diagnostics measurement systems

    International Nuclear Information System (INIS)

    Srivastava, Amit Kumar; Sharma, Atish; Raval, Tushar

    2009-01-01

    In a long-pulse discharge tokamak, a large number of diagnostic channels are used to understand the complex behavior of the plasma. Different diagnostics demand different types of analog and digital processing for plasma parameter measurement, which leads to variable signal processing requirements for diagnostic measurement. For such requirements, we have developed hardware with reconfigurable electronic devices, which provides a flexible solution for rapid development of measurement systems. Here the analog processing is achieved by a Field Programmable Analog Array (FPAA) integrated circuit, while reconfigurable digital devices (CPLD/FPGA) handle the digital processing. FPAAs provide an ideal integrated platform for implementing low- to medium-complexity analog signal processing. With dynamic reconfigurability, the functionality of the FPAA can be reconfigured in-system by the designer or on the fly by a microprocessor. This feature is quite useful for changing the tuning or the construction of any part of the analog circuit without interrupting operation of the FPAA, thus maintaining system integrity. The hardware operation control logic circuits are configured in the reconfigurable digital devices (CPLD/FPGA) to ensure proper hardware functioning. These reconfigurable devices provide design flexibility and save component space on the board. They also provide the flexibility to change various settings through software. The circuit control commands are either issued by a computer/processor or generated by the circuit itself. (author)

  4. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    Science.gov (United States)

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of

  5. Development of Stand Alone Application Tool for Processing and Quality Measurement of Weld Imperfection Image Captured by μ-Focused Digital Radiography Using MATLAB- Based Graphical User Interface

    Directory of Open Access Journals (Sweden)

    PZ Nadila

    2012-12-01

    Digital radiography is increasingly being applied in the fabrication industry. Compared to film-based radiography, digitally radiographed images can be acquired in less time and with fewer exposures. However, noise can easily occur in the digital image, resulting in a low-quality result. Due to this and to the system's complexity, parameter sensitivity, and environmental effects, the results can be difficult to interpret, even for a radiographer. Therefore, the need for an application tool to improve and evaluate the image is becoming urgent. In this research, a user-friendly tool for image processing and image quality measurement was developed. The resulting tool contains important components needed by radiograph inspectors for analyzing defects and recording the results. The tool was written using the image processing and graphical user interface development environment and compiler (GUIDE) toolbox available in Matrix Laboratory (MATLAB R2008a). The image processing methods applied include contrast adjustment, noise removal, and edge detection. For image quality measurement, mean square error (MSE), peak signal-to-noise ratio (PSNR), modulation transfer function (MTF), normalized signal-to-noise ratio (SNRnorm), sensitivity, and unsharpness were used. The graphical user interface (GUI) was then compiled to build a stand-alone Windows application that enables the tool to be executed independently, without the installation of MATLAB.
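
    For reference, MSE and PSNR are straightforward to compute from two images of the same size; the sketch below (in Python rather than the MATLAB used by the tool) shows the standard definitions for 8-bit images.

    ```python
    import numpy as np

    def mse(reference: np.ndarray, test: np.ndarray) -> float:
        """Mean square error between two equally sized grayscale images."""
        diff = reference.astype(np.float64) - test.astype(np.float64)
        return float(np.mean(diff ** 2))

    def psnr(reference: np.ndarray, test: np.ndarray, max_value: float = 255.0) -> float:
        """Peak signal-to-noise ratio in dB (higher means closer to the reference)."""
        err = mse(reference, test)
        if err == 0:
            return float("inf")     # identical images
        return 10.0 * np.log10(max_value ** 2 / err)

    # Example: compare a clean 8-bit image with a noisy copy
    rng = np.random.default_rng(1)
    clean = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
    noisy = np.clip(clean + rng.normal(0, 5, clean.shape), 0, 255).astype(np.uint8)
    print(f"MSE = {mse(clean, noisy):.2f}, PSNR = {psnr(clean, noisy):.2f} dB")
    ```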

  6. Developing Effective Performance Measures

    Science.gov (United States)

    2014-10-14

    Presentation slides (Kasunic, Carnegie Mellon University, October 14, 2014) on developing effective performance measures. The slides list ways in which performance measurement goes bad: laziness, vanity, narcissism, too many measures, pettiness, and inanity, with narcissism defined as measuring performance from the organization's point of view rather than from …

  7. Bidirectional Classical Stochastic Processes with Measurements and Feedback

    Science.gov (United States)

    Hahne, G. E.

    2005-01-01

    A measurement on a quantum system is said to cause the "collapse" of the quantum state vector or density matrix. An analogous collapse occurs with measurements on a classical stochastic process. This paper addresses the question of describing the response of a classical stochastic process when there is feedback from the output of a measurement to the input, and is intended to give a model for quantum-mechanical processes that occur along a space-like reaction coordinate. The classical system can be thought of in physical terms as two counterflowing probability streams, which stochastically exchange probability currents in a way that the net probability current, and hence the overall probability, suitably interpreted, is conserved. The proposed formalism extends the mathematics of those stochastic processes describable with linear, single-step, unidirectional transition probabilities, known as Markov chains and stochastic matrices. It is shown that a certain rearrangement and combination of the input and output of two stochastic matrices of the same order yields another matrix of the same type. Each measurement causes the partial collapse of the probability current distribution in the midst of such a process, giving rise to calculable, but non-Markov, values for the ensuing modification of the system's output probability distribution. The paper concludes with an analysis of a classical probabilistic version of the so-called grandfather paradox.

  8. Development of FPGA-based digital signal processing system for radiation spectroscopy

    International Nuclear Information System (INIS)

    Lee, Pil Soo; Lee, Chun Sik; Lee, Ju Hahn

    2013-01-01

    We have developed an FPGA-based digital signal processing system that performs both online digital signal filtering and pulse-shape analysis for both particle and gamma-ray spectroscopy. Such functionalities were made possible by the state-of-the-art programmable logic device and the system architecture employed. The system performance, as measured, for example, by the system dead time and the accuracy of pulse-height and rise-time determination, was evaluated with standard alpha- and gamma-ray sources using a CsI(Tl) scintillation detector. The results show that the present system has potential applications in various radiation-related fields such as particle identification, radiography, and radiation imaging. - Highlights: ► An FPGA-based digital processing system was developed for radiation spectroscopy. ► Our digital system has a 14-bit resolution and a 100-MHz sampling rate. ► The FPGA implements the online digital filtering and pulse-shape analysis. ► The pileup rejection is implemented in trigger logic before the digital filtering process. ► Our digital system was verified in alpha-gamma measurements using a CsI detector

  9. Evaluating the effectiveness of British Columbia's environmental assessment process for first nations' participation in mining development

    International Nuclear Information System (INIS)

    Baker, Douglas C.; McLelland, James N.

    2003-01-01

    This paper applies effectiveness as a criterion to measure First Nations' participation in British Columbia's environmental assessment process. Effectiveness is reviewed as a means to measure policy implementation, and an expanded framework is proposed to measure effectiveness. The framework is applied to three case studies in north-central British Columbia to measure the effectiveness of First Nations' participation in the EA process for mining development. All three cases failed to achieve procedural, substantive, and transactive efficacy and thereby failed to meet overall policy effectiveness. The policies used by the British Columbia government, including the relatively recent Environmental Assessment Act (1995), reflect a poor integration of First Nations people in the EA decision-making process with respect to mine development.

  10. Prospects for direct neutron capture measurements on s-process branching point isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Guerrero, C.; Lerendegui-Marco, J.; Quesada, J.M. [Universidad de Sevilla, Dept. de Fisica Atomica, Molecular y Nuclear, Sevilla (Spain); Domingo-Pardo, C. [CSIC-Universidad de Valencia, Instituto de Fisica Corpuscular, Valencia (Spain); Kaeppeler, F. [Karlsruhe Institute of Technology, Institut fuer Kernphysik, Karlsruhe (Germany); Palomo, F.R. [Universidad de Sevilla, Dept. de Ingenieria Electronica, Sevilla (Spain); Reifarth, R. [Goethe-Universitaet Frankfurt am Main, Frankfurt am Main (Germany)

    2017-05-15

    The neutron capture cross sections of several unstable key isotopes acting as branching points in the s-process are crucial for stellar nucleosynthesis studies, but they are very challenging to measure directly due to the difficult production of sufficient sample material, the high activity of the resulting samples, and the actual (n, γ) measurement, where high neutron fluxes and effective background rejection capabilities are required. At present there are about 21 relevant s-process branching point isotopes whose cross sections have not yet been measured over the neutron energy range of interest for astrophysics. However, the situation is changing with some very recent developments and upcoming technologies. This work introduces three techniques that will change the current paradigm in the field: the use of γ-ray imaging techniques in (n, γ) experiments, the production of moderated neutron beams using high-power lasers, and double capture experiments in Maxwellian neutron beams. (orig.)

  11. Measuring similarity between business process models

    NARCIS (Netherlands)

    Dongen, van B.F.; Dijkman, R.M.; Mendling, J.

    2007-01-01

    Quality aspects become increasingly important when business process modeling is used in a large-scale enterprise setting. In order to facilitate storage without redundancy and efficient retrieval of relevant process models in model databases, it is required to develop a theoretical understanding

  12. Development of a new 3D azimuthal LWD caliper measurement

    International Nuclear Information System (INIS)

    Doghmi, M.; Ellis, D. V.

    2003-01-01

    The need for a caliper measurement from logging-while-drilling (LWD) has prompted developments in several measurement domains. Hole size measurements are mainly derived from ultrasonic and electrical devices. However, both techniques have significant limitations. This paper discusses the development of a new azimuthal density-derived caliper measurement. An algorithm and process were developed to use the 16 sector densities measured by the LWD tool to compute 16 borehole radii, providing a good representation of the borehole. As with all physical measurements, some limitations exist for accurate measurement in large washouts and in heavy mud systems. However, the measurements are independent of borehole fluid resistivity and can be acquired in oil-, synthetic-, and water-based mud systems. Comparisons with wireline multifinger caliper logs confirm the measurement is valid in holes with large washouts. Three-dimensional visualization software allows analysis of borehole shape to be carried out on any personal computer. Timely feedback of this information enables changes in drilling and/or mud parameters to improve borehole shape and drilling efficiency. This responsiveness should lead to better well positioning and better petrophysical data. Recent applications have demonstrated the benefits of a true three-dimensional (3D) azimuthal caliper measurement. These benefits include borehole shape analysis for stress orientation; detection of problems such as washouts, ovalization, and spiraling; improvement of cement job design and completion strategies; and optimization of drilling parameters. These applications will be discussed and illustrated in this paper.

  13. Development and preliminary validation of flux map processing code MAPLE

    International Nuclear Information System (INIS)

    Li Wenhuai; Zhang Xiangju; Dang Zhen; Chen Ming'an; Lu Haoliang; Li Jinggang; Wu Yuanbao

    2013-01-01

    The self-reliant flux map processing code MAPLE was developed by China General Nuclear Power Corporation (CGN). The weight coefficient method (WCM), polynomial expansion method (PEM) and thin plate spline (TPS) method were applied to fit the deviation between measured and predicted detector signals over the two-dimensional radial plane, and to interpolate or extrapolate the deviation at non-instrumented locations. Comparison of results in the test cases shows that the TPS method captures the information of curved fitting lines better than the other methods. The measured flux map data of the Lingao Nuclear Power Plant were processed using MAPLE, combined with the SMART code, as validation test cases. The validation results show that the calculation results of MAPLE are reasonable and satisfactory. (authors)
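
    As a sketch of the thin plate spline step (using SciPy's radial basis function interpolator rather than the MAPLE implementation, and with invented detector data), the deviation measured at instrumented locations can be fitted and then evaluated at non-instrumented positions as follows.

    ```python
    import numpy as np
    from scipy.interpolate import Rbf

    # Assumed example data: (x, y) positions of instrumented assemblies in the radial
    # plane and the deviation between measured and predicted detector signals there.
    rng = np.random.default_rng(2)
    x_inst = rng.uniform(-1.0, 1.0, 40)
    y_inst = rng.uniform(-1.0, 1.0, 40)
    deviation = 0.02 * np.sin(2.0 * x_inst) * np.cos(2.0 * y_inst)   # synthetic field

    # Thin plate spline fit of the deviation field.
    tps = Rbf(x_inst, y_inst, deviation, function="thin_plate")

    # Interpolate/extrapolate the deviation at non-instrumented locations.
    x_query = np.array([0.0, 0.5, -0.75])
    y_query = np.array([0.0, -0.5, 0.25])
    print(tps(x_query, y_query))
    ```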

  14. Measuring quality of life in Macedonia - using human development indicators

    Directory of Open Access Journals (Sweden)

    Dimitar Eftimoski

    2006-12-01

    By the end of the 1980s, the central issue of development was focused on the growth of income and not on the growth of quality of life. Therefore, the development strategies were oriented towards production and left no significant space for improving the welfare of individuals. In the beginning of the 1990s, the human development concept emerged, stressing that economic development ultimately should result in growth of the quality of life of individuals, while the goal of the development process was to expand the capabilities of individuals by placing them in the focus of the efforts for development. This paper is focused on the quality of life of individuals. Moreover, in addition to the previous practice in Macedonia of calculating the human development index (HDI) as a measure of quality of life, an attempt will be made to calculate the human poverty index (HPI-2) as a measure of non-income poverty, the gender development index (GDI) as a measure of inequality between men and women, as well as the human development index at the level of aggregated urban and rural municipalities. We hope that this will contribute to improving the quality of decisions made by the state and local authorities in Macedonia when it comes to issues concerning human development.
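
    For context, the HDI combines normalized health, education and income indices; the pre-2010 methodology of the paper's era averaged them arithmetically, while the current UNDP methodology uses a geometric mean. The sketch below shows both calculations with UNDP-style goalposts and invented input values, not official Macedonian statistics.

    ```python
    import math

    def dimension_index(value: float, minimum: float, maximum: float) -> float:
        """Normalize an indicator between its chosen minimum and maximum goalposts."""
        return (value - minimum) / (maximum - minimum)

    # Illustrative inputs (not official Macedonian statistics).
    life_expectancy = 73.5           # years
    education_index = 0.75           # assumed composite schooling index, already in [0, 1]
    gni_per_capita = 9_000.0         # purchasing-power-adjusted income per capita

    health = dimension_index(life_expectancy, 20.0, 85.0)   # UNDP-style goalposts
    income = dimension_index(math.log(gni_per_capita),
                             math.log(100.0), math.log(75_000.0))

    # Pre-2010 HDI: arithmetic mean of the three dimension indices.
    hdi_old = (health + education_index + income) / 3.0
    # Post-2010 HDI: geometric mean of the same indices.
    hdi_new = (health * education_index * income) ** (1.0 / 3.0)
    print(f"HDI (arithmetic): {hdi_old:.3f}, HDI (geometric): {hdi_new:.3f}")
    ```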

  15. Design and Development of transducer for IR radiation measurement

    International Nuclear Information System (INIS)

    Pattarachindanuwong, Surat; Poopat, Bovornchoke; Meethong, Wachira

    2003-06-01

    IR radiation now plays many important roles, for example in the plastics industry, the food industry and medical instrumentation. The effect on objects exposed to IR can depend strongly on the quantity of IR radiation they receive. The objective of this research is therefore to design and develop a transducer for IR radiation measurement, using a quartz halogen lamp as the IR source and a thermopile sensor as the transducer. The thermal conductivity of the transducer and the air flow were also considered in the design and development of the transducer. The study shows that the designed transducer can be applied in high-temperature processes, for example the quality control of welding, non-contact temperature measurement of drying ovens, and the testing of IR sources in medical therapy devices.

  16. TMT approach to observatory software development process

    Science.gov (United States)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate

  17. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer-specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration with a major international engineering company.

  18. Development of synthetic gasoline production process

    Energy Technology Data Exchange (ETDEWEB)

    Imai, T; Fujita, H; Yamada, K; Suzuki, T; Tsuchida, Y

    1986-01-01

    As oil deposits are limited, it is very important to develop techniques for manufacturing petroleum alternatives as substitute energy sources to brighten the outlook for the future. The Research Association for Petroleum Alternatives Development (RAPAD) in Japan is engaged in the research and development of production techniques for light hydrocarbon oils such as gasoline, kerosene, and light oil from synthesis gas (CO, H/sub 2/) obtained from the raw materials of natural gas, coal, etc. Regarding the MTG process of synthesizing gasoline via methanol from synthesis gas and the STG process of directly synthesizing gasoline from synthesis gas, Cosmo Oil Co., Ltd. and Mitsubishi Heavy Industries, Ltd., members of RAPAD, have sought jointly to develop catalysts and processes. As a result of this co-operation, the authors have recently succeeded in developing a new catalyst with a long life span capable of providing a high yield and high selectivity. Additionally, the authors are currently on the verge of putting into effect a unique two-step STG process of synthesizing high octane gasoline via dimethyl ether, referred to as the AMSTG process.

  19. Development of measurement system for gauge block interferometer

    Science.gov (United States)

    Chomkokard, S.; Jinuntuya, N.; Wongkokua, W.

    2017-09-01

    We developed a measurement system for collecting and analyzing the fringe pattern images from a gauge block interferometer. The system is based on a Raspberry Pi, an open-source platform, with Python programming and the OpenCV image manipulation library. The images were recorded by the five-megapixel Raspberry Pi camera. Image noise was suppressed to obtain the best results in the analyses. The low-noise images were processed to find the edges of the fringe patterns using the contour technique for the phase-shift analyses. We tested our system with the phase-shift patterns between a gauge block and a reference plate. The phase-shift patterns were measured by a Twyman-Green type interferometer using a He-Ne laser with the temperature controlled at 20.0 °C. The results of the measurement will be presented and discussed.
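
    A minimal sketch of the kind of contour-based fringe extraction described above, using Python and OpenCV; the specific preprocessing steps, thresholds and the file name are assumptions rather than the authors' actual pipeline.

    ```python
    import cv2

    def extract_fringe_contours(image_path: str):
        """Denoise a fringe image, binarize it, and return the detected fringe contours."""
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        if gray is None:
            raise FileNotFoundError(image_path)
        # Suppress noise before edge/contour extraction.
        smoothed = cv2.GaussianBlur(gray, (5, 5), 0)
        # Adaptive threshold separates bright and dark fringes under uneven illumination.
        binary = cv2.adaptiveThreshold(smoothed, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                       cv2.THRESH_BINARY, 31, 2)
        contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
        # Keep only long contours, which correspond to fringe edges rather than speckle.
        return [c for c in contours if cv2.arcLength(c, False) > 100]

    # Example usage (hypothetical file name):
    # fringes = extract_fringe_contours("gauge_block_fringes.png")
    # print(f"{len(fringes)} fringe contours found")
    ```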

  20. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

    A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  1. From theory to 'measurement' in complex interventions: Methodological lessons from the development of an e-health normalisation instrument

    Directory of Open Access Journals (Sweden)

    Finch Tracy L

    2012-05-01

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. Methods: A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation ‘processes’ were significantly related to staff members’ perceptions of whether or not e-health had become ‘routine’. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes

  2. Integration of Error Compensation of Coordinate Measuring Machines into Feature Measurement: Part I—Model Development

    Science.gov (United States)

    Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario

    2016-01-01

    The development of an error compensation model for coordinate measuring machines (CMMs) and its integration into feature measurement is presented. CMMs are widespread and dependable instruments in industry and laboratories for dimensional measurement. From the tip probe sensor to the machine display, there is a complex transformation of probed point coordinates through the geometrical feature model that makes the assessment of accuracy and uncertainty measurement results difficult. Therefore, error compensation is not standardized, conversely to other simpler instruments. Detailed coordinate error compensation models are generally based on CMM as a rigid-body and it requires a detailed mapping of the CMM’s behavior. In this paper a new model type of error compensation is proposed. It evaluates the error from the vectorial composition of length error by axis and its integration into the geometrical measurement model. The non-explained variability by the model is incorporated into the uncertainty budget. Model parameters are analyzed and linked to the geometrical errors and uncertainty of CMM response. Next, the outstanding measurement models of flatness, angle, and roundness are developed. The proposed models are useful for measurement improvement with easy integration into CMM signal processing, in particular in industrial environments where built-in solutions are sought. A battery of implementation tests are presented in Part II, where the experimental endorsement of the model is included. PMID:27690052

  3. Integration of Error Compensation of Coordinate Measuring Machines into Feature Measurement: Part I—Model Development

    Directory of Open Access Journals (Sweden)

    Roque Calvo

    2016-09-01

    The development of an error compensation model for coordinate measuring machines (CMMs) and its integration into feature measurement is presented. CMMs are widespread and dependable instruments in industry and laboratories for dimensional measurement. From the tip probe sensor to the machine display, there is a complex transformation of probed point coordinates through the geometrical feature model that makes the assessment of accuracy and uncertainty measurement results difficult. Therefore, error compensation is not standardized, conversely to other simpler instruments. Detailed coordinate error compensation models are generally based on CMM as a rigid-body and it requires a detailed mapping of the CMM’s behavior. In this paper a new model type of error compensation is proposed. It evaluates the error from the vectorial composition of length error by axis and its integration into the geometrical measurement model. The non-explained variability by the model is incorporated into the uncertainty budget. Model parameters are analyzed and linked to the geometrical errors and uncertainty of CMM response. Next, the outstanding measurement models of flatness, angle, and roundness are developed. The proposed models are useful for measurement improvement with easy integration into CMM signal processing, in particular in industrial environments where built-in solutions are sought. A battery of implementation tests are presented in Part II, where the experimental endorsement of the model is included.

  4. Measures of quality of process models created in BPMN

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-12-01

    Description, documentation, evaluation and redesign of key processes during their execution should be an essential part of the strategic management of any organization. All organizations live in a dynamically changing environment and must therefore adapt their internal processes to market changes. These processes must be described, and BPMN notation is a suitable way to describe them. Once processes are described in BPMN, they should be checked to ensure their expected quality. A system (which could be automated) based on the mathematical expression of qualitative characteristics of process models (i.e., measures of the quality of process models) can support such process checks. The research team is trying to design such a tool and bring it into practical use. The aim of this publication is to describe the proposed system, based on measures of the quality of process models, and to answer the associated scientific questions.
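
    As an illustration of what such a measure can look like, the sketch below computes two metrics often cited for BPMN-like models, model size and Cardoso's control-flow complexity (CFC), on a toy process representation; the paper's own set of measures is not specified in the abstract.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Gateway:
        kind: str                      # "XOR", "OR", or "AND"
        outgoing: int                  # number of outgoing sequence flows (fan-out)

    @dataclass
    class ProcessModel:
        tasks: List[str] = field(default_factory=list)
        gateways: List[Gateway] = field(default_factory=list)

    def size(model: ProcessModel) -> int:
        """Model size: total number of nodes (tasks plus gateways)."""
        return len(model.tasks) + len(model.gateways)

    def control_flow_complexity(model: ProcessModel) -> int:
        """Cardoso's CFC: an XOR-split adds its fan-out, an OR-split adds
        2**fan-out - 1, and an AND-split adds 1."""
        cfc = 0
        for g in model.gateways:
            if g.kind == "XOR":
                cfc += g.outgoing
            elif g.kind == "OR":
                cfc += 2 ** g.outgoing - 1
            elif g.kind == "AND":
                cfc += 1
        return cfc

    # Example: a small order-handling process with one XOR-split and one AND-split
    model = ProcessModel(tasks=["receive order", "check stock", "ship", "invoice"],
                         gateways=[Gateway("XOR", 2), Gateway("AND", 2)])
    print(size(model), control_flow_complexity(model))   # prints: 6 3
    ```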

  5. Application of quantitative autoradiography to the measurement of biochemical processes in vivo

    International Nuclear Information System (INIS)

    Sokoloff, L.

    1985-01-01

    Quantitative autoradiography makes it possible to measure the concentrations of isotopes in tissues of animals labeled in vivo. In a few cases, the administration of a judiciously selected labeled chemical compound and a properly designed procedure has made it possible to use this capability to measure the rate of a chemical process in animals in vivo. Emission tomography, and particularly positron emission tomography, provides a means to extend this capability to man and to assay the rates of biochemical processes in human tissues in vivo. It does not, however, obviate the need to adhere to established principles of chemical and enzyme kinetics and tracer theory. Generally, all such methods, whether to be used in man with positron emission tomography or in animals with autoradiography, must first be developed by research in animals with autoradiography, because it is only in animals that the measurements needed to validate the basic assumptions of the methods can be tested and evaluated

  6. The development of application technology for image processing in nuclear facilities

    International Nuclear Information System (INIS)

    Lee, Jong Min; Lee, Yong Bum; Kim, Woog Ki; Sohn, Surg Won; Kim, Seung Ho; Hwang, Suk Yeoung; Kim, Byung Soo

    1991-01-01

    The object of this project is to develop application technology for image processing in nuclear facilities, where image signals are used to enhance the reliability and safety of operation, reduce operator radiation exposure, and automate operation processing. We have studied such image processing applications in nuclear facilities as non-tactile measurement, remote and automatic inspection, remote control, and enhanced analysis of visual information. On this basis, an automation system and a real-time image processing system were developed. Nuclear power now accounts for over 50% of our country's electric power supply, so technological support for state-of-the-art technology is required in the nuclear industry and its related fields. In particular, image processing technology is indispensable for enhancing the reliability and safety of operation and for automating processes in places such as nuclear power plants and radioactive environments. It is important that image processing technology be linked to nuclear engineering so as to enhance the reliability and safety of nuclear operation and to decrease the dose rate. (Author)

  7. Radioactive Dry Process Material Treatment Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. J.; Hung, I. H.; Kim, K. K. (and others)

    2007-06-15

    The project 'Radioactive Dry Process Material Treatment Technology Development' aims at normal operation of the experiments at the DUPIC fuel development facility (DFDF) and safe operation of the facility through technology developments such as remote operation, maintenance and repair of the facility, treatment of various high-level process wastes, and trapping of volatile process gases. The DFDF can accommodate highly active nuclear materials and is now used for fabrication of oxide fuel by a dry process characterized by proliferation resistance. During the second stage, from March 2005 to February 2007, we carried out technology development for remote maintenance and safe operation of the DFDF, development of treatment technology for process off-gas, and development of treatment technology for PWR cladding hulls; the results are described in this report.

  8. Automated processing of measuring information and control processes of eutrophication in water for household purpose, based on artificial neural networks

    Directory of Open Access Journals (Sweden)

    О.М. Безвесільна

    2006-04-01

    The possibilities of applying information and computer technologies for the automated processing of measurement information on algae growth (eutrophication) in household reservoirs are considered. The input data for studying eutrophication processes are video images of water samples, which are used to determine the geometric characteristics, number and biomass of the algae. Methods of digital video-image processing and mathematical tools based on artificial neural networks are proposed for processing the measurement information.

  9. Software development for the RF measurement and analysis of RFQ accelerator

    International Nuclear Information System (INIS)

    Fu Shinian

    2002-01-01

    In a high-current RFQ accelerator, it is required to tightly control the beam losses and beam emittance growth. For this reason, it is necessary to accurately measure and correctly analyze the field distribution and mode components, and eventually to tune the RF field to reach its design values. LabVIEW is a widely used software platform for automatic measurement and data processing. The author will present the code development on this platform for the RFQ measurement and analysis, including some applications of the codes.

  10. Software development for the RF measurement and analysis of RFQ accelerator

    CERN Document Server

    Fu Shinian

    2002-01-01

    In a high-current RFQ accelerator, it is required to tightly control the beam losses and beam emittance growth. For this reason, it is necessary to accurately measure and correctly analyze the field distribution and mode components, and eventually to tune the RF field to reach its design values. LabVIEW is a widely used software platform for automatic measurement and data processing. The author will present the code development on this platform for the RFQ measurement and analysis, including some applications of the codes.

  11. Software development for the RF measurement and analysis of RFQ accelerator

    International Nuclear Information System (INIS)

    Fu Shinian

    2002-01-01

    In a high-current RFQ accelerator, it is required to tightly control the beam losses and beam emittance growth. For this reason, it is necessary to accurately measure and correctly analyze the field distribution and mode components, and eventually to tune the RF field to reach its design values. LabVIEW is a widely used software platform for automatic measurement and data processing. The authors present their code development on this platform for the RFQ measurement and analysis, including some applications of the codes.

  12. Windows-based sodium liquid high-speed measuring system software development

    International Nuclear Information System (INIS)

    Kolokol'tsev, M.V.

    2005-01-01

    This work describes the creation of software that captures data from the sodium liquid parameter measuring system, processes and displays the information in real time, and also retrieves, visualizes and documents the information in the post-startup period. A non-standard solution is described: the creation of a high-speed data capture system based on Windows and relatively inexpensive hardware components. A technical description (classes, interface elements) of the developed and deployed software components implementing data capture and post-startup information visualization is given. (author)

  13. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    Science.gov (United States)

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)
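
    As background on the statistical process control techniques mentioned, the sketch below computes the control limits of a standard individuals (X) chart from a series of process measurements; the productivity data are invented for illustration.

    ```python
    import numpy as np

    def individuals_chart_limits(samples):
        """Control limits for an individuals (X) chart.

        Uses the average moving range and the standard d2 constant for subgroups
        of size 2 (d2 = 1.128), so the limits are mean +/- 3 * (MR-bar / 1.128).
        """
        x = np.asarray(samples, dtype=float)
        center = x.mean()
        mr_bar = np.abs(np.diff(x)).mean()
        sigma_hat = mr_bar / 1.128
        return center - 3 * sigma_hat, center, center + 3 * sigma_hat

    # Invented productivity data: pages produced per revision cycle
    data = [42, 39, 45, 41, 38, 44, 40, 43, 37, 46]
    lcl, center, ucl = individuals_chart_limits(data)
    print(f"LCL = {lcl:.1f}, center = {center:.1f}, UCL = {ucl:.1f}")
    print("points outside limits:", [v for v in data if v < lcl or v > ucl])
    ```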

  14. The MINERVA Software Development Process

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.

  15. Thickness measurement by using cepstrum ultrasonic signal processing

    International Nuclear Information System (INIS)

    Choi, Young Chul; Yoon, Chan Hoon; Choi, Heui Joo; Park, Jong Sun

    2014-01-01

    Ultrasonic thickness measurement is a non-destructive method for measuring the local thickness of a solid element, based on the time taken for an ultrasound wave to return to the surface. When an element is very thin, it is difficult to measure its thickness with the conventional ultrasonic method, because the method measures the time delay using the peak of a pulse, and the pulses overlap. To solve this problem, we propose a method for measuring thickness by using the power cepstrum and the minimum variance cepstrum. Because cepstrum processing can divide the ultrasound signal into an impulse train and a transfer function, where the period of the impulse train is the traversal time, the thickness can be measured exactly. To verify the proposed method, we performed experiments with steel and acrylic plates of variable thickness. The conventional method is not able to estimate the thickness because of the overlapping pulses. However, the cepstrum ultrasonic signal processing that divides a pulse into an impulse and a transfer function can measure the thickness exactly.
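
    For reference, the power cepstrum is the inverse Fourier transform of the logarithm of the power spectrum; an echo delayed by the round-trip (traversal) time shows up as a cepstral peak at that delay, from which thickness = velocity x delay / 2. The sketch below demonstrates this on a synthetic two-echo signal; the pulse shape, sampling rate and material velocity are illustrative assumptions, and the minimum variance cepstrum of the paper is not implemented here.

    ```python
    import numpy as np

    fs = 100e6                                    # sampling rate [Hz] (assumed)
    c_steel = 5900.0                              # longitudinal velocity in steel [m/s] (assumed)
    thickness_true = 1.0e-3                       # 1 mm plate
    delay_true = 2.0 * thickness_true / c_steel   # round-trip time of the back-wall echo

    # Synthetic received signal: a ringing transducer pulse plus an attenuated echo.
    n = 512
    t = np.arange(n) / fs
    pulse = np.exp(-t / 100e-9) * np.sin(2 * np.pi * 10e6 * t)
    signal = pulse.copy()
    d = int(round(delay_true * fs))               # echo delay in samples
    signal[d:] += 0.6 * pulse[:n - d]

    # Power cepstrum: inverse FFT of the log power spectrum.
    log_power = np.log(np.abs(np.fft.fft(signal)) ** 2 + 1e-20)
    cepstrum = np.abs(np.fft.ifft(log_power))

    # The echo shows up as a cepstral peak at its round-trip delay (quefrency).
    start = 15                                    # skip the low-quefrency envelope content
    peak = start + int(np.argmax(cepstrum[start:n // 2]))
    delay_est = peak / fs
    print(f"estimated thickness: {c_steel * delay_est / 2 * 1e3:.3f} mm "
          f"(true: {thickness_true * 1e3:.3f} mm)")
    ```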

  16. Development of functionally-oriented technological processes of electroerosive processing

    Science.gov (United States)

    Syanov, S. Yu

    2018-03-01

    The stages of developing functionally oriented technological processes for electroerosive processing are described, from the separation of part surfaces and their service functions to the determination of the electroerosive process parameters that provide not only the quality parameters of the surface layer but also the required operational properties.

  17. Light measurement model to assess software development process improvement Modelo liviano de medidas para evaluar la mejora de procesos de desarrollo de software MLM-PDS

    Directory of Open Access Journals (Sweden)

    Diana Vásquez

    2010-12-01

    Companies developing software in Colombia face a number of problems, such as building software in an artisanal, empirical and disorganized way. It is therefore necessary for these companies to implement projects to improve their development processes, because ensuring the quality of their products by improving their software processes is a step they must take to be able to compete in the market. Implementing process improvement models is not enough to say whether a company is actually obtaining benefits; one of the first actions in an improvement project is to be able to determine the current status of the process. Only by measuring is it possible to know the state of a process objectively, and only through measurement is it possible to plan strategies and solutions about which improvements to make, depending on the objectives of the organization. This paper proposes a light model of measures to assess the software development process, which seeks to help Colombian software development companies determine whether the process of implementing improvements is effective in achieving the objectives and goals set for it, through the use of measures to evaluate the improvement of their development processes. This allows the current practices of the company to be characterized, identifying weaknesses, strengths and capabilities of the processes carried out within it, and thus controlling or preventing the causes of low quality or of deviations in cost or planning. Software development companies in Colombia face a series of problems such as building software in an artisanal, empirical and disorganized way. For this reason, they need to implement projects to improve their development processes, since ensuring product quality through the improvement of their software processes is a step they must take to be in a position to compete in the national and international market. Implementing models …

  18. Decision Gate Process for Assessment of a Technology Development Portfolio

    Science.gov (United States)

    Kohli, Rajiv; Fishman, Julianna; Hyatt, Mark

    2012-01-01

    The NASA Dust Management Project (DMP) was established to provide technologies (to TRL 6 development level) required to address adverse effects of lunar dust to humans and to exploration systems and equipment, which will reduce life cycle cost and risk, and will increase the probability of sustainable and successful lunar missions. The technology portfolio of DMP consisted of different categories of technologies whose final product is either a technology solution in itself, or one that contributes toward a dust mitigation strategy for a particular application. A Decision Gate Process (DGP) was developed to assess and validate the achievement and priority of the dust mitigation technologies as the technologies progress through the development cycle. The DGP was part of continuous technology assessment and was a critical element of DMP risk management. At the core of the process were technology-specific criteria developed to measure the success of each DMP technology in attaining the technology readiness levels assigned to each decision gate. The DGP accounts for both categories of technologies and qualifies the technology progression from technology development tasks to application areas. The process provided opportunities to validate performance, as well as to identify non-performance in time to adjust resources and direction. This paper describes the overall philosophy of the DGP and the methodology for implementation for DMP, and describes the method for defining the technology evaluation criteria. The process is illustrated by example of an application to a specific DMP technology.

  19. The Development of Analogical Reasoning Processes.

    Science.gov (United States)

    Sternberg, Robert J.; Rifkin, Bathsheva

    1979-01-01

    Two experiments were conducted to test the generalizability to children of a theory of analogical reasoning processes, originally proposed for adults, and to examine the development of analogical reasoning processes in terms of five proposed sources of cognitive development. (MP)

  20. Development of time-resolved optical measurement and diagnostic system for parameters of high current and pulsed electron beam

    International Nuclear Information System (INIS)

    Jiang Xiaoguo; Wang Yuan; Yang Guojun; Xia Liansheng; Li Hong; Zhang Zhuo; Liao Shuqing; Shi Jinshui

    2013-01-01

    Beam parameter measurement is the most important work in the study of the linear induction accelerator (LIA), since the beam parameters are needed to evaluate the character of the beam. The demands on beam parameter measurement grow as the accelerator develops: the measurements require higher time resolution, higher spatial resolution, a larger dynamic range and more intuitive presentation of the data. Measurement technologies for the beam spot, beam emittance and beam energy have been developed over the past several years, and high-performance equipment such as high-speed framing cameras has recently become available. Under these conditions, an integrated optical measurement and diagnostic system for the beam parameters has been developed based on several principles. The system features a time resolution of up to 2 ns, high sensitivity and a large dynamic range. A processing program was written for the data, and local real-time processing is achieved. The measurement and diagnostic system has provided complete and accurate data for the debugging work and has been put into application. (authors)

  1. Development of AMS procedure for measurement of 93Zr

    Science.gov (United States)

    Lu, Wenting; Collon, Philippe; Kashiv, Yoav; Bowers, Matthew; Robertson, Daniel; Schmitt, Christopher

    2011-10-01

    The procedure for measuring 93Zr (t1/2 = 1.5 Ma) by AMS is currently being developed at the Nuclear Science Lab at the University of Notre Dame, and we report on the first experiments performed in this direction. AMS detection of 93Zr can potentially be applied to address astrophysical and environmental issues: (1) the measurement of the 92Zr(n,γ)93Zr reaction cross-section at nucleosynthesis s-process relevant temperatures, (2) the search for potential live 93Zr from a supernova in deep-sea sediments, (3) hydrological and radioactive waste tracing. The measurement of 93Zr requires adequate separation from its stable isobar 93Nb. We are currently working on optimizing this separation by using the Gas-Filled Magnet technique with additional multiple dE measurements in a focal plane ionization chamber.

  2. Development of Software for Measurement and Analysis of Solar Radiation

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Abul Adli Anuar; Noor Ezati Shuib

    2015-01-01

    This software is under development using LabVIEW, to be used with StellarNet spectrometer systems that communicate with the computer via USB. LabVIEW has capabilities in hardware interfacing, graphical user interfaces and mathematical calculation, including array manipulation and processing. The software reads data from the StellarNet spectrometer in real time and then processes it for analysis. Several measurements and analyses of solar radiation have been carried out. Solar radiation consists mainly of infrared, visible light and ultraviolet. From solar radiation spectrum data, information on the weather and on the suitability of plants can be gathered and analyzed. Furthermore, optimization of the utilization of solar radiation and related safety precautions can be planned. Using this software, more research and development on the utilization and safety of solar radiation can be explored. (author)

  3. Development and validation of the social information processing application: a Web-based measure of social information processing patterns in elementary school-age boys.

    Science.gov (United States)

    Kupersmidt, Janis B; Stelter, Rebecca; Dodge, Kenneth A

    2011-12-01

    The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and ethnically diverse sample of 244 boys ages 8 through 12 (M = 9.4) from public elementary schools in 3 states. The SIP-AP includes 8 videotaped vignettes, filmed from the first-person perspective, that depict common misunderstandings among boys. Each vignette shows a negative outcome for the victim and ambiguous intent on the part of the perpetrator. Boys responded to 16 Web-based questions representing the 5 social information processing mechanisms, after viewing each vignette. Parents and teachers completed measures assessing boys' antisocial behavior. Confirmatory factor analyses revealed that a model positing the original 5 cognitive mechanisms fit the data well when the items representing prosocial cognitions were included on their own factor, creating a 6th factor. The internal consistencies for each of the 16 individual cognitions as well as for the 6 cognitive mechanism scales were excellent. Boys with elevated scores on 5 of the 6 cognitive mechanisms exhibited more antisocial behavior than boys whose scores were not elevated. These findings highlight the need for further research on the measurement of prosocial cognitions or cognitive strengths in boys in addition to assessing cognitive deficits. Findings suggest that the SIP-AP is a reliable and valid tool for use in future research of social information processing skills in boys.

  4. Development and evaluation of dosimeters from locally available perspex for high dose measurement in industrial radiation processing. Final report for the period December 1985 - December 1989

    International Nuclear Information System (INIS)

    Amin, R.

    1989-11-01

    The objective of the study was to find, develop and evaluate suitable low cost perspex materials to be used as routine dosemeters for high dose measurements, particularly in industrial radiation processing. Red, amber and white perspex materials of local origin were investigated for their dosimetric properties and evaluated against Harwell red perspex, Fricke and ethanol-monochlorobenzene dosemeters. 5 refs, 13 figs, 5 tabs

  5. Development and evaluation of dosimeters from locally available perspex for high dose measurement in industrial radiation processing. Final report for the period December 1985 - December 1989

    Energy Technology Data Exchange (ETDEWEB)

    Amin, R [Atomic Energy Research Establishment, Dhaka (Bangladesh). Inst. of Food and Radiation Biology

    1989-11-01

    The objective of the study was to find, develop and evaluate suitable low cost perspex materials to be used as routine dosemeters for high dose measurements, particularly in industrial radiation processing. Red, amber and white perspex materials of local origin were investigated for their dosimetric properties and evaluated against Harwell red perspex, Fricke and ethanol-monochlorobenzene dosemeters. 5 refs, 13 figs, 5 tabs.

  6. Active chatter suppression with displacement-only measurement in turning process

    Science.gov (United States)

    Ma, Haifeng; Wu, Jianhua; Yang, Liuqing; Xiong, Zhenhua

    2017-08-01

    Regenerative chatter is a major hindrance to achieving high quality and high production rates in machining processes. Various active controllers have been proposed to mitigate chatter. However, most existing controllers were developed on the basis of multi-state feedback of the system, and state observers were usually needed. Moreover, model parameters of the machining process (mass, damping and stiffness) were required by existing active controllers. In this study, an active sliding mode controller, which employs a dynamic output feedback sliding surface for the unmatched condition and an adaptive law for disturbance estimation, is designed, analyzed, and validated for chatter suppression in the turning process. Only a displacement measurement is required by this approach; other sensors and state observers are not needed. Moreover, it facilitates rapid implementation, since the designed controller is established without using model parameters of the turning process. Theoretical analysis, numerical simulations and experiments on a computer numerical control (CNC) lathe are presented. They show that chatter can be substantially attenuated and the chatter-free region significantly expanded with the presented method.
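    To make the regenerative mechanism concrete, the toy Python simulation below models turning chatter as a single-degree-of-freedom oscillator whose cutting force depends on the difference between the current displacement and the displacement one spindle revolution earlier, and adds a simple active damping law that uses only the displacement measurement (velocity estimated by differencing). This is an illustration of the problem setting, not the authors' sliding-mode controller, and all parameter values are invented.

      import numpy as np

      m, c, k = 1.0, 20.0, 4.0e6      # modal mass (kg), damping (N s/m), stiffness (N/m)
      Kf_b = 1.0e5                    # cutting stiffness times chip width (N/m), hypothetical
      T = 0.02                        # spindle revolution period (s)
      dt = 1.0e-5                     # integration step (s)
      steps = int(1.0 / dt)
      delay = int(T / dt)

      def simulate(gain):
          """Semi-implicit Euler simulation; returns the late-time vibration amplitude (m)."""
          x = np.zeros(steps)
          v = 0.0
          for i in range(1, steps):
              x_delayed = x[i - 1 - delay] if i - 1 >= delay else 0.0
              v_est = (x[i - 1] - x[i - 2]) / dt if i >= 2 else 0.0  # from displacement only
              force = -Kf_b * (x[i - 1] - x_delayed) + 1.0           # regenerative force + excitation
              u = -gain * v_est                                      # active damping actuation
              a = (force + u - c * v - k * x[i - 1]) / m
              v += a * dt
              x[i] = x[i - 1] + v * dt
          return float(np.max(np.abs(x[steps // 2:])))

      print("uncontrolled amplitude (m):", simulate(0.0))
      print("controlled amplitude (m):  ", simulate(200.0))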

  7. Research and development project of regional consortiums in fiscal 1998. Research and development of regional consortium energy (development of measuring technology to aid energy conservation in electronic device manufacturing processes (design and trial production of IMI) (Report on the result in the first year)); 1998 nendo chiiki consortium energy kenkyu kaihatsu. Denshi kikirui seizo process no sho energy shien keisoku seigyo gijutsu no kaihatsu (IMI no sekkei to shisaku) (dai 1 nendo)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This paper summarizes the development of intelligent micro instruments (IMI) inaugurated in fiscal 1998 as the wide-area consortium project for the Tama area. Research and development will be carried out on the following items: IMI substrate elements utilizing micro machining technology, applicable to micro sensors and micro probes, semiconductor process sensors, electronic device measuring probes, signal processing and communication circuits for wireless sensing. This paper describes the achievements during fiscal 1998. Technologies were transferred from the Mechanical Engineering Laboratory of the Agency of Industrial Science and Technology on silicon micro machining and PZT piezoelectric thin film formation. An IMI research laboratory was installed at the Tokyo Metropolitan University. In developing the IMI substrate elements, different beams applicable to sensors and probes were fabricated on a trial basis, and their mechanical properties were measured. For the semiconductor process sensors, discussions were given on micronization on a chlorine ion analyzer. In developing the electronic device measuring probes, the target was placed on measurement of in-situ characteristics of IC chips on a wafer. A prototype transmitting and receiving circuit board was fabricated for developing the wireless sensing. (NEDO)

  8. Evaluation and development plan of NRTA measurement methods for the Rokkasho Reprocessing Plant

    International Nuclear Information System (INIS)

    Li, T.K.; Hakkila, E.A.; Flosterbuer, S.F.

    1995-01-01

    Near-real-time accounting (NRTA) has been proposed as a safeguards method at the Rokkasho Reprocessing Plant (RRP), a large-scale commercial facility for reprocessing spent fuel from boiling water and pressurized water reactors. NRTA for RRP requires material balance closures every month. To develop a more effective and practical NRTA system for RRP, we have evaluated NRTA measurement techniques and systems that might be implemented in both the main process and the co-denitration process areas at RRP to analyze the concentrations of plutonium in solutions and mixed oxide powder. Based on the comparative evaluation, including performance, reliability, design criteria, operation methods, maintenance requirements, and estimated costs for each possible measurement method, recommendations for development were formulated. This paper discusses the evaluations and reports on the recommended NRTA development plan for potential implementation at RRP.

  9. Measuring attitudes towards the dying process: A systematic review of tools.

    Science.gov (United States)

    Groebe, Bernadette; Strupp, Julia; Eisenmann, Yvonne; Schmidt, Holger; Schlomann, Anna; Rietz, Christian; Voltz, Raymond

    2018-04-01

    At the end of life, anxious attitudes concerning the dying process are common in patients in palliative care. Measurement tools can identify vulnerabilities, resources and the need for subsequent treatment to relieve suffering and support well-being. The aim was to systematically review available tools measuring attitudes towards dying, their operationalization, the method of measurement and their methodological quality, including generalizability to different contexts. A systematic review was conducted according to the PRISMA Statement, with the methodological quality of tools assessed by standardized review criteria. MEDLINE, PsycINFO, PsyndexTests and the Health and Psychosocial Instruments were searched from their inception to April 2017. A total of 94 identified studies reported the development and/or validation of 44 tools. Of these, 37 were questionnaires and 7 were alternative measurement methods (e.g. projective measures). In 34 of the 37 questionnaires, the emotional evaluation (e.g. anxiety) towards dying is measured. Dying is operationalized in general items (n = 20), in several specific aspects of dying (n = 34) and as the dying of others (n = 14). The methodological quality of tools was reported inconsistently. Nine tools reported good internal consistency. Of the 37 tools, 4 were validated in a clinical sample (e.g. terminal cancer; Huntington disease), indicating questionable generalizability to clinical contexts for most tools. Many tools exist to measure attitudes towards the dying process using different endpoints. This overview can serve as a decision framework on which tool to apply in which context. For clinical application, only few tools were available. Further validation of existing tools and potential alternative methods in various populations is needed.

  10. Understanding flexible and distributed software development processes

    OpenAIRE

    Agerfalk, Par J.; Fitzgerald, Brian

    2006-01-01

    The minitrack on Flexible and Distributed Software Development Processes addresses two important and partially intertwined current themes in software development: process flexibility and globally distributed software development.

  11. Unified Approach in the DSS Development Process

    Directory of Open Access Journals (Sweden)

    2007-01-01

    The structure of today's decision support environment has become very complex due to a new generation of Business Intelligence applications and technologies like Data Warehouses, OLAP (On-Line Analytical Processing) and Data Mining. In this respect the DSS development process is not simple and needs an adequate methodology or framework able to manage different tools and platforms to meet managers' requirements. The DSS development process must be viewed as a unified and iterative set of activities and operations. New techniques based on the Unified Process (UP) methodology and UML (Unified Modeling Language) seem to be appropriate for DSS development using prototyping and RAD (Rapid Application Development) techniques. In this paper we present a conceptual framework for developing and integrating Decision Support Systems using the Unified Process methodology and UML.

  12. A methodology for development of biocatalytic processes

    DEFF Research Database (Denmark)

    Lima Ramos, Joana

    are available. The first case study presents a rational approach for defining a development strategy for multi-enzymatic processes. The proposed methodology requires a profound and structured knowledge of the multi-enzyme systems, integrating chemistry, biological and process engineering. In order to suggest ... ). These process metrics can often be attained by improvements in the reaction chemistry, the biocatalyst, and/or by process engineering, which often requires a complex process development strategy. Interestingly this complexity, which arises from the need for integration of biological and process technologies ... and their relationship with the overall process is not clear. The work described in this thesis presents a methodological approach for early stage development of biocatalytic processes, understanding and dealing with the reaction, biocatalyst and process constraints. When applied, this methodology has a decisive role ...

  13. Development of a new flux map processing code for moveable detector system in PWR

    Energy Technology Data Exchange (ETDEWEB)

    Li, W.; Lu, H.; Li, J.; Dang, Z.; Zhang, X. [China Nuclear Power Technology Research Institute, 47 F/A Jiangsu Bldg., Yitian Road, Futian District, Shenzhen 518026 (China); Wu, Y.; Fan, X. [Information Technology Center, China Guangdong Nuclear Power Group, Shenzhen 518000 (China)

    2013-07-01

    This paper presents an introduction to the development of the flux map processing code MAPLE, developed by the China Nuclear Power Technology Research Institute (CNPPJ) of the China Guangdong Nuclear Power Group (CGN). The method used to obtain the three-dimensional 'measured' power distribution from the measurement signals is also described. Three methods, namely the Weight Coefficient Method (WCM), the Polynomial Expand Method (PEM) and the Thin Plate Spline (TPS) method, have been applied to fit the deviation between measured and predicted results on the two-dimensional radial plane. The measured flux map data of the LINGAO nuclear power plant (NPP) are processed using MAPLE as a test case to compare the effectiveness of the three methods, combined with the 3D neutronics code COCO. Assembly power distribution results show that the MAPLE results are reasonable and satisfactory. More verification and validation of the MAPLE code will be carried out in the future. (authors)
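    For readers unfamiliar with the last of these techniques, the short Python sketch below illustrates thin-plate-spline fitting of the deviation between measured and predicted assembly powers over a 2D radial plane. It uses scipy's RBFInterpolator as a stand-in; the detector positions and deviation values are invented placeholders, not MAPLE data.

      import numpy as np
      from scipy.interpolate import RBFInterpolator

      rng = np.random.default_rng(0)
      xy_instrumented = rng.uniform(-1.0, 1.0, size=(40, 2))   # instrumented assembly positions
      deviation = (0.03 * np.sin(2.0 * xy_instrumented[:, 0])
                   + 0.02 * xy_instrumented[:, 1])             # measured-minus-predicted power

      # thin plate spline fit of the deviation surface over the radial plane
      tps = RBFInterpolator(xy_instrumented, deviation, kernel='thin_plate_spline')

      # evaluate the fitted deviation at uninstrumented assembly positions
      xy_uninstrumented = rng.uniform(-1.0, 1.0, size=(5, 2))
      print(tps(xy_uninstrumented))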

  14. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    Science.gov (United States)

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
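    As a minimal illustration of the AHP computation mentioned above, the sketch below derives criteria weights from a pairwise comparison matrix via its principal eigenvector and checks Saaty's consistency ratio. The 4x4 matrix is an invented example, not the paper's school-inspection model.

      import numpy as np

      A = np.array([[1,   3,   5,   7],
                    [1/3, 1,   3,   5],
                    [1/5, 1/3, 1,   3],
                    [1/7, 1/5, 1/3, 1]], dtype=float)   # pairwise comparisons of 4 criteria

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                       # principal eigenvalue index
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                          # normalized criteria weights

      n = A.shape[0]
      lambda_max = eigvals.real[k]
      ci = (lambda_max - n) / (n - 1)                   # consistency index
      ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]               # Saaty's tabulated random index
      cr = ci / ri                                      # consistency ratio, should be < 0.1

      print("weights:", np.round(weights, 3), "CR:", round(cr, 3))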

  15. Onboard Optical Navigation Measurement Processing in GEONS

    Data.gov (United States)

    National Aeronautics and Space Administration — Optical Navigation (OpNav) measurements derived from spacecraft-based images are a powerful data type in the precision orbit determination process.  OpNav...

  16. EUV mask process specifics and development challenges

    Science.gov (United States)

    Nesladek, Pavel

    2014-07-01

    EUV lithography is currently the favorite and most promising candidate among the next generation lithography (NGL) technologies. A decade ago, NGL was expected to be needed for the 45 nm technology node. Due to the introduction of immersion 193 nm lithography, double/triple patterning and further techniques, the capabilities of 193 nm lithography were greatly improved, so it is expected to be used successfully down to 10 nm logic, depending on business decisions of the end user. Subsequent technology nodes will require EUV or the alternative DSA technology. Manufacturing, and especially process development, for EUV technology requires a significant number of unique processes, in several cases performed on dedicated tools. Currently several of these tools, e.g. the EUV AIMS or an actinic reflectometer, are not yet available on site. Process development is therefore done using external services and tools, with an impact on the single unit process development timeline and uncertainty in the estimation of process performance. Compromises in process development, caused by assumptions about similarities between optical and EUV masks made in experiment planning and by the omission of tests, are further challenges to unit process development. Increased defect risk and uncertainty in process qualification are just two examples which can impact mask quality and process development. The aim of this paper is to identify critical aspects of EUV mask manufacturing with respect to defects on the mask, with a focus on mask cleaning and defect repair, and to discuss the impact of the EUV-specific requirements on the experiments needed.

  17. Development of process data capturing, analysis and controlling for thermal spray techniques - SprayTracker

    Science.gov (United States)

    Kelber, C.; Marke, S.; Trommler, U.; Rupprecht, C.; Weis, S.

    2017-03-01

    Thermal spraying processes are becoming increasingly important in high-technology areas, such as automotive engineering and medical technology. The method offers the advantage of a local layer application with different materials and high deposition rates. Challenges in the application of thermal spraying result from the complex interaction of different influencing variables, which can be attributed to the properties of different materials, operating equipment supply, electrical parameters, flow mechanics, plasma physics and automation. In addition, spraying systems are subject to constant wear. Due to the process specification and the high demands on the produced coatings, innovative quality assurance tools are necessary. A central aspect, which has not yet been considered, is the data management in relation to the present measured variables, in particular the spraying system, the handling system, working safety devices and additional measuring sensors. Both the recording of all process-characterizing variables, their linking and evaluation as well as the use of the data for the active process control presuppose a novel, innovative control system (hardware and software) that was to be developed within the scope of the research project. In addition, new measurement methods and sensors are to be developed and qualified in order to improve the process reliability of thermal spraying.

  18. Features of the Manufacturing Vision Development Process

    DEFF Research Database (Denmark)

    Dukovska-Popovska, Iskra; Riis, Jens Ove; Boer, Harry

    2005-01-01

    This paper discusses the key features of the process of Manufacturing Vision Development, a process that enables companies to develop their future manufacturing concept. The basis for the process is a generic five-phase methodology (Riis and Johansen, 2003) developed as a result of ten years of action research. The methodology recommends wide participation of people from different hierarchical and functional positions, who engage in a relatively short, playful and creative process and come up with a vision (concept) for the future manufacturing system in the company. Based on three case studies of companies going through the initial phases of the methodology, this research identified the key features of the Manufacturing Vision Development process. The paper elaborates the key features by defining them, discussing how and when they can appear, and how they influence the process.

  19. Adapting the unified software development process for user interface development

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2006-01-01

    In this paper we describe how existing software development processes, such as the Rational Unified Process, can be adapted in order to allow disciplined and more efficient development of user interfaces. The main objective of this paper is to demonstrate that standard modeling environments, based on the

  20. Bisimulation on Markov Processes over Arbitrary Measurable Spaces

    DEFF Research Database (Denmark)

    Bacci, Giorgio; Bacci, Giovanni; Larsen, Kim Guldstrand

    2014-01-01

    We introduce a notion of bisimulation on labelled Markov Processes over generic measurable spaces in terms of arbitrary binary relations. Our notion of bisimulation is proven to coincide with the coalgebraic definition of Aczel and Mendler in terms of the Giry functor, which associates with a mea...

  1. FY 2000 report on the results of the research and development project for the photon-aided instrumentation and processing technologies. Development of high-efficiency production process technologies; 2000 nendo photon keisoku kako gijutsu seika hokokusho. Kokoritsu seisan process gijutsu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    Described herein are the FY 2000 results of the development of photon-aided instrumentation and processing technologies, aimed at improving the efficiency of production processes that consume large amounts of energy, e.g., those for welding, joining, surface treatment and granulation for producing fine particles. The program for production of functional composite compounds by microscopic processing technologies prepares electrically resistant films and dielectric films by in-situ mixing of two types of ultrafine particles. The program for in-situ measuring technology aims at measuring the contents of the constituent components of fine particles, 30 nm or less in size, to an accuracy of 10% by emission spectroscopy after converting them into a plasma. The program for high-power, all-solid-state laser technology is developing the excitation chamber for a high-power, all-solid-state slab-type laser in order to realize energy-efficient laser-aided processing. The program for tightly-focusing, all-solid-state laser technology develops highly uniform crystals by growing GLBO crystals for producing high-power ultraviolet laser beams, is developing techniques for producing wavelength-converting elements, including the GLBO crystal package, and develops a wavelength conversion method based on fourth-harmonic generation with all-solid-state laser beams as the fundamental wave, achieving a high harmonic power of 23 W for generating high-power ultraviolet laser beams at high efficiency. (NEDO)

  2. Basic Auditory Processing Skills and Phonological Awareness in Low-IQ Readers and Typically Developing Controls

    Science.gov (United States)

    Kuppen, Sarah; Huss, Martina; Fosker, Tim; Fegan, Natasha; Goswami, Usha

    2011-01-01

    We explore the relationships between basic auditory processing, phonological awareness, vocabulary, and word reading in a sample of 95 children, 55 typically developing children, and 40 children with low IQ. All children received nonspeech auditory processing tasks, phonological processing and literacy measures, and a receptive vocabulary task.…

  3. Development of A Low-Cost FPGA-Based Measurement System for Real-Time Processing of Acoustic Emission Data: Proof of Concept Using Control of Pulsed Laser Ablation in Liquids.

    Science.gov (United States)

    Wirtz, Sebastian F; Cunha, Adauto P A; Labusch, Marc; Marzun, Galina; Barcikowski, Stephan; Söffker, Dirk

    2018-06-01

    Today, the demand for continuous monitoring of valuable or safety-critical equipment is increasing in many industrial applications due to safety and economic requirements. Therefore, reliable in-situ measurement techniques are required, for instance in Structural Health Monitoring (SHM) as well as process monitoring and control. Here, current challenges are related to the processing of sensor data with a high data rate and low latency. In particular, measurement and analysis of Acoustic Emission (AE) are widely used for passive, in-situ inspection. Advantages of AE are related to its sensitivity to different micro-mechanical mechanisms at the material level. However, online processing of AE waveforms is computationally demanding, and the related equipment is typically bulky, expensive, and not well suited for permanent installation. The contribution of this paper is the development of a Field Programmable Gate Array (FPGA)-based measurement system using a ZedBoard development kit with a Zynq-7000 system-on-chip for embedded implementation of suitable online processing algorithms. This platform comprises a dual-core Advanced Reduced Instruction Set Computer Machine (ARM) architecture running a Linux operating system and FPGA fabric. An FPGA-based hardware implementation of the discrete wavelet transform is realized to accelerate processing of the AE measurements. Key features of the system are low cost, small form factor, and low energy consumption, which make it suitable to serve as a field-deployed measurement and control device. For verification of the functionality, a novel automatically realized adjustment of the working distance during pulsed laser ablation in liquids is established as an example. A sample rate of 5 MHz is achieved at 16 bit resolution.
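    The Python sketch below (using PyWavelets purely for illustration) shows the kind of discrete wavelet transform processing that such hardware accelerates: an AE waveform is decomposed into wavelet bands and reduced to per-band energies that a monitoring or control loop can act on. The wavelet choice, decomposition depth and synthetic burst are assumptions, not details of the paper's FPGA implementation.

      import numpy as np
      import pywt

      fs = 5_000_000                                  # 5 MHz sampling rate, as in the paper
      t = np.arange(0, 2e-3, 1 / fs)                  # 2 ms record
      burst = np.exp(-((t - 1e-3) / 5e-5) ** 2) * np.sin(2 * np.pi * 300e3 * t)
      signal = burst + 0.05 * np.random.default_rng(1).standard_normal(t.size)

      # multi-level discrete wavelet decomposition: [cA5, cD5, cD4, cD3, cD2, cD1]
      coeffs = pywt.wavedec(signal, wavelet='db4', level=5)
      band_energy = [float(np.sum(c ** 2)) for c in coeffs]
      total = sum(band_energy)
      for name, e in zip(['A5', 'D5', 'D4', 'D3', 'D2', 'D1'], band_energy):
          print(f"{name}: {e / total:.3f}")           # relative energy per wavelet band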

  4. Safety guides development process in Spain

    International Nuclear Information System (INIS)

    Butragueno, J.L.; Perello, M.

    1979-01-01

    Safety guides have become a major factor in the licensing process of nuclear power plants and related nuclear fuel cycle facilities. As experience validates better and better engineering methodologies and procedures, their results are consolidated in the form of standards, guides, and similar documents. This paper presents the current Spanish experience with nuclear standards and safety guide development. The process for developing a standard or safety guide is shown. An up-to-date list of issued nuclear safety guides and of those under development is included, and comments are made on the future role of nuclear standards in the licensing process. (author)

  5. Facial Emotion and Identity Processing Development in 5- to 15-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Patrick eJohnston

    2011-02-01

    Most developmental studies of emotional face processing to date have focussed on infants and very young children. Additionally, studies that examine emotional face processing in older children do not distinguish development in emotion and identity face processing from more generic age-related cognitive improvement. In this study, we developed a paradigm that measures processing of facial expression in comparison to facial identity and complex visual stimuli. Three matching tasks were developed (i.e., facial emotion matching, facial identity matching and butterfly wing matching) to include stimuli of a similar level of discriminability and to be equated for task difficulty in earlier samples of young adults. Ninety-two children aged 5 to 15 years and a new group of 24 young adults completed these three matching tasks. Young children were highly adept at the butterfly wing task relative to their performance on both face-related tasks. More importantly, in older children, development of facial emotion discrimination ability lagged behind that of facial identity discrimination.

  6. Robot development for nuclear material processing

    International Nuclear Information System (INIS)

    Pedrotti, L.R.; Armantrout, G.A.; Allen, D.C.; Sievers, R.H. Sr.

    1991-07-01

    The Department of Energy is seeking to modernize its special nuclear material (SNM) production facilities and concurrently reduce radiation exposures and the process and incidental radioactive waste generated. As part of this program, a team led by Lawrence Livermore National Laboratory (LLNL) is developing and adapting generic and specific applications of commercial robotic technologies to SNM pyrochemical processing and other operations. A working gantry robot within a sealed processing glove box and a telerobot control test bed are manifestations of this effort. This paper describes the development challenges and progress in adapting processing, robotic, and nuclear safety technologies to the application. 3 figs

  7. Application of computer picture processing to dynamic strain measurement under electromagnetic field

    International Nuclear Information System (INIS)

    Yagawa, G.; Soneda, N.

    1987-01-01

    For the structural design of fusion reactors, it is very important to ensure the structural integrity of components under various dynamic loading conditions due to solid-electromagnetic field interaction, earthquakes, MHD effects and so on. As one of the experimental approaches to assessing dynamic fracture, we consider strain measurement near a crack tip under a transient electromagnetic field, which in general involves several experimental difficulties. The authors have developed a strain measurement method using a picture processing technique. In this method, the locations of marks printed on the surface of a specimen are determined by picture processing. The displacement field is interpolated from the mark displacements using finite elements. Finally, the strain distribution is calculated by differentiating the displacement field. In the present study, the method is improved and automated to apply to the measurement of dynamic strain distribution under an electromagnetic field. The effects of dynamic loading on the strain distribution are then investigated by comparing the dynamic results with the static ones. (orig./GL)
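    A compact sketch of the strain-recovery step described above is given below: the mark displacements are treated as nodal displacements of a 4-node quadrilateral element, the displacement field is interpolated with bilinear shape functions, and the in-plane strains follow by differentiating that field at the element centre. The coordinates and displacement values are invented for illustration; the original work uses its own finite-element interpolation.

      import numpy as np

      def quad4_strain(nodes, disp, xi=0.0, eta=0.0):
          """Small strains (eps_xx, eps_yy, gamma_xy) of a 4-node quad at local point (xi, eta)."""
          # derivatives of the bilinear shape functions with respect to xi and eta
          dN_dxi = 0.25 * np.array([[-(1 - eta), (1 - eta), (1 + eta), -(1 + eta)],
                                    [-(1 - xi), -(1 + xi), (1 + xi),  (1 - xi)]])
          J = dN_dxi @ nodes                   # 2x2 Jacobian of the isoparametric mapping
          dN_dx = np.linalg.solve(J, dN_dxi)   # shape function derivatives w.r.t. x and y
          grad = dN_dx @ disp                  # displacement gradient, rows = d/dx, d/dy
          return grad[0, 0], grad[1, 1], grad[0, 1] + grad[1, 0]

      # mark positions (mm) and measured mark displacements (mm), counter-clockwise node order
      nodes = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 2.0], [0.0, 2.0]])
      disp = np.array([[0.000, 0.000], [0.004, 0.000], [0.005, 0.002], [0.001, 0.002]])
      print(quad4_strain(nodes, disp))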

  8. Development and Dissemination of the El Centro Health Disparities Measures Library.

    Science.gov (United States)

    Mitrani, Victoria Behar; O'Day, Joanne E; Norris, Timothy B; Adebayo, Oluwamuyiwa Winifred

    2017-09-01

    This report describes the development and dissemination of a library of English measures, with Spanish translations, on constructs relevant to social determinants of health and behavioral health outcomes. The El Centro Measures Library is a product of the Center of Excellence for Health Disparities Research: El Centro, a program funded by the National Institute on Minority Health and Health Disparities of the U.S. National Institutes of Health. The library is aimed at enhancing capacity for minority health and health disparities research, particularly for Hispanics living in the United States and abroad. The open-access library of measures (available through www.miami.edu/sonhs/measureslibrary) contains brief descriptions of each measure, scoring information (where available), links to related peer-reviewed articles, and measure items in both languages. Links to measure websites where commercially available measures can be purchased are included, as is contact information for measures that require author permission. Links to several other measures libraries are hosted on the library website. Other researchers may contribute to the library. El Centro investigators began the library by electing to use a common set of measures across studies to assess demographic information, culture-related variables, proximal outcomes of interest, and major outcomes. The collection was expanded to include other health disparity research studies. In 2012, a formal process was developed to organize, expand, and centralize the library in preparation for a gradual process of dissemination to the national and international community of researchers. The library currently contains 61 measures encompassing 12 categories of constructs. Thus far, the library has been accessed 8,883 times (unique page views as generated by Google Analytics), and responses from constituencies of users and measure authors have been favorable. With the paucity of availability and accessibility of translated

  9. Measurements of scattering processes in negative ion-atom collisions

    International Nuclear Information System (INIS)

    Kvale, T.J.

    1992-01-01

    This Technical Progress Report describes the progress made on the research objectives during the past twelve months. This research project is designed to provide measurements of various scattering processes which occur in H- collisions with atomic (specifically, noble gas and atomic hydrogen) targets at intermediate energies. These processes include elastic scattering, single- and double-electron detachment, and target excitation/ionization. For the elastic and target-inelastic processes, where the H- is scattered intact, the experimental technique of Ion Energy-Loss Spectroscopy (IELS) will be employed to identify the final target state(s). In most of the above processes, cross sections are unknown both experimentally and theoretically. The measurements in progress will provide either experimentally determined cross sections or upper limits to those cross sections. In either case, these measurements will be stringent tests of our understanding of energetic negative ion-atom collisions. This series of experiments required the construction of a new facility, and the initial ion beam was accelerated through the apparatus in April 1991.

  10. Development of Urban Driving Cycle with GPS Data Post Processing

    Directory of Open Access Journals (Sweden)

    Peter Lipar

    2016-08-01

    This paper presents a GIS-based methodology for urban area driving cycle construction. The approach reaches beyond the frame of usual driving cycle development methods and takes a different perspective on data collection. Rather than planning a dedicated data collection campaign, the approach is based on post-processing of available in-vehicle measurement data, using Geographic Information Systems to manage the large database and extract only representative, geographically bounded individual trip data. With such post-processing, the dataset was carefully reduced to data that describe representative driving in the Ljubljana urban area. The selected method for driving cycle development is based on searching for the best combination of microtrips while minimizing the difference between two feature vectors, one based on the generated cycle and the other on the database. Accounting for a large random sample of actual trip data, our approach enables more representative area-specific driving cycle development than previously used techniques.
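    The sketch below illustrates, in simplified form, the selection idea just described: candidate cycles are assembled from randomly drawn microtrips and the combination whose feature vector is closest to that of the whole database is kept. The synthetic microtrips and the two features used (mean speed and idle share) are placeholders, not the Ljubljana dataset or the authors' exact feature set.

      import numpy as np

      rng = np.random.default_rng(42)

      def features(speed):
          """Feature vector of a speed trace (km/h): mean speed and share of idle samples."""
          return np.array([speed.mean(), np.mean(speed < 0.5)])

      # synthetic database of microtrips: 1 Hz speed traces with random length and mean speed
      microtrips = [np.clip(rng.normal(rng.uniform(5, 50), 10, rng.integers(30, 300)), 0, None)
                    for _ in range(200)]
      target = features(np.concatenate(microtrips))   # representative statistics of the database

      best_idx, best_err = None, np.inf
      for _ in range(2000):                           # random search over microtrip combinations
          idx = rng.choice(len(microtrips), size=15, replace=False)
          err = np.linalg.norm(features(np.concatenate([microtrips[i] for i in idx])) - target)
          if err < best_err:
              best_idx, best_err = idx, err

      print("selected microtrips:", sorted(best_idx.tolist()), "feature error:", round(best_err, 3))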

  11. Process Fragment Libraries for Easier and Faster Development of Process-based Applications

    Directory of Open Access Journals (Sweden)

    David Schumm

    2011-01-01

    The term “process fragment” is recently gaining momentum in business process management research. We understand a process fragment as a connected and reusable process structure, which has relaxed completeness and consistency criteria compared to executable processes. We claim that process fragments allow for an easier and faster development of process-based applications. As evidence to this claim we present a process fragment concept and show a sample collection of concrete, real-world process fragments. We present advanced application scenarios for using such fragments in development of process-based applications. Process fragments are typically managed in a repository, forming a process fragment library. On top of a process fragment library from previous work, we discuss the potential impact of using process fragment libraries in cross-enterprise collaboration and application integration.

  12. Impact of informal institutions on the development integration processes

    Directory of Open Access Journals (Sweden)

    Sidorova Alexandra, M.

    2015-06-01

    The paper deals with the impact of informal institutions on the definition of the vector of integration processes and on the development of integration processes in the countries of the Customs Union and Ukraine. The degree of scientific development of the phenomenon in different economic schools is determined in this article. Economic mentality is a basic informal institution which determines the degree of effectiveness of integration processes. This paper examines the nature, characteristics and effects of economic mentality on people's economic activities. The ethnometric method allows the economic mentality to be quantified, which enables a deeper understanding and analysis of the formation and functioning of political and economic systems, especially business and management, and the establishment of contacts with other cultures. The modern Belarusian economic mentality was measured on the basis of Hofstede's international methodology and compared with the economic mentality of Russia, Ukraine and Kazakhstan. With the help of cluster analysis, the congruence of the economic mentalities of the Customs Union countries and Ukraine was determined. The economic mentality of these countries was also compared with that of other countries in order to identify the main types of economic culture.

  13. Tolerance analysis in manufacturing using process capability ratio with measurement uncertainty

    DEFF Research Database (Denmark)

    Mahshid, Rasoul; Mansourvar, Zahra; Hansen, Hans Nørgaard

    2017-01-01

    Tolerance analysis provides valuable information regarding the performance of a manufacturing process. It allows determining the maximum possible variation of a quality feature in production. Previous research has focused on the application of tolerance analysis to the design of mechanical assemblies ... In this paper, a new statistical analysis was applied to manufactured products to assess achieved tolerances when the process is known, using the capability ratio and expanded uncertainty. The analysis has benefits for process planning, determining actual precision limits, process optimization, and troubleshooting a malfunctioning existing part. The capability measure is based on a number of measurements performed on the part's quality variable. Since the ratio relies on measurements, any possible measurement error has a notable negative impact on the results. Therefore, measurement uncertainty was used in combination with process ...
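    As one common way of folding expanded measurement uncertainty into a capability check (a guard-banding sketch under assumed numbers, not necessarily the authors' exact formulation), the snippet below narrows the specification interval by the expanded uncertainty U at each limit before forming the usual Cp and Cpk ratios.

      import numpy as np

      def capability(measurements, lsl, usl, expanded_uncertainty=0.0):
          """Cp and Cpk with the tolerance band reduced by the expanded uncertainty U."""
          mu, sigma = np.mean(measurements), np.std(measurements, ddof=1)
          lo = lsl + expanded_uncertainty          # guard-banded specification limits
          hi = usl - expanded_uncertainty
          cp = (hi - lo) / (6 * sigma)
          cpk = min(hi - mu, mu - lo) / (3 * sigma)
          return cp, cpk

      rng = np.random.default_rng(7)
      x = rng.normal(10.00, 0.01, size=50)         # 50 measurements of a feature (mm), invented
      print("ignoring U:  Cp=%.2f Cpk=%.2f" % capability(x, 9.95, 10.05))
      print("with U=5 um: Cp=%.2f Cpk=%.2f" % capability(x, 9.95, 10.05, 0.005))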

  14. Field nondestructive assay measurements as applied to process inventories

    International Nuclear Information System (INIS)

    Westsik, G.A.

    1979-08-01

    An annual process equipment holdup inventory measurement program for a plutonium processing plant was instituted by Rockwell Hanford Operations (Rockwell) at Richland, Washington. The inventories, performed in 1977 and 1978, were designed to improve plutonium accountability and control. The inventory method used field nondestructive assay (NDA) measurement techniques with portable electronics and sodium iodide detectors. Access to and movement of plutonium in work areas was curtailed during the inventory process using administrative controls. Comparison of the two annual inventories showed good reproducibility of results within the calculated error ranges. For items where no plutonium movement occurred and which contained greater than 20 grams plutonium, the average measurement difference between the two inventories was 22%. The procedures and equipment used and the operational experience from the inventories are described

  15. Uranium enrichment measurement by X- and γ-ray spectrometry with the 'URADOS' process

    International Nuclear Information System (INIS)

    Morel, Jean; Etcheverry, Michel; Riazuelo, Gilles

    1998-01-01

    The methods used for uranium enrichment measurement in general require prior instrument calibration with several standards. It is possible to avoid the constraints involved in calibration by considering the complex spectral region of the XKα lines. This spectral region is sufficiently limited that the variation of the detector efficiency response is small enough to allow a self-calibration. Processing this region is critical and requires taking into account three elemental images, one corresponding to 235U, one to 238U and one to the X-ray fluorescence induced in the sample by radiation above 100 keV. A process called 'URADOS' based on this principle has been developed. Six uranium oxide standards with different enrichments and infinite thicknesses were counted several times to test this process; other samples, some highly enriched, were also used. The results obtained are compared to the declared values. From these measurements, it has been possible to improve the photon emission probability values.

  16. Risk perception and information processing: the development and validation of a questionnaire to assess self-reported information processing.

    Science.gov (United States)

    Smerecnik, Chris M R; Mesters, Ilse; Candel, Math J J M; De Vries, Hein; De Vries, Nanne K

    2012-01-01

    The role of information processing in understanding people's responses to risk information has recently received substantial attention. One limitation of this research concerns the unavailability of a validated questionnaire of information processing. This article presents two studies in which we describe the development and validation of the Information-Processing Questionnaire to meet that need. Study 1 describes the development and initial validation of the questionnaire. Participants were randomized to either a systematic processing or a heuristic processing condition after which they completed a manipulation check and the initial 15-item questionnaire and again two weeks later. The questionnaire was subjected to factor reliability and validity analyses on both measurement times for purposes of cross-validation of the results. A two-factor solution was observed representing a systematic processing and a heuristic processing subscale. The resulting scale showed good reliability and validity, with the systematic condition scoring significantly higher on the systematic subscale and the heuristic processing condition significantly higher on the heuristic subscale. Study 2 sought to further validate the questionnaire in a field study. Results of the second study corresponded with those of Study 1 and provided further evidence of the validity of the Information-Processing Questionnaire. The availability of this information-processing scale will be a valuable asset for future research and may provide researchers with new research opportunities. © 2011 Society for Risk Analysis.

  17. Inclusive measurements on diffractive processes in ep collisions

    International Nuclear Information System (INIS)

    Janssen, Xavier

    2007-01-01

    Measurements from the H1 and ZEUS collaborations of the diffractive deep-inelastic scattering process, ep → eXY, where Y is a proton or a low-mass proton excitation, are presented for photon virtualities Q² above 2.2 GeV² and for small squared four-momentum transfer |t| at the proton vertex. Diffractive parton distribution functions and their uncertainties are determined from a next-to-leading-order DGLAP QCD analysis. Combining measurements of the inclusive diffractive deep-inelastic scattering process with an analysis of diffractive dijet production allows a very sensitive determination of both quark and gluon distributions. (author)

  18. In-process hold-up as a measure of safeguards significance

    International Nuclear Information System (INIS)

    Hamlin, A.G.

    1983-01-01

    This paper examines the use of the in-process hold-up itself, as a measure of safeguards significance. It is argued that for any process plant it is possible to define design limits for in-process hold-up, outside which the plant will not operate, or will operate in a detectably abnormal manner. It follows, therefore, that if the in-process hold-up can be derived at frequent intervals by input/output analysis from the start of the campaign, the only diversion that can be made from it during that campaign is limited to the quantity necessary to move the apparent in-process hold-up from its normal operating condition to the upper limiting condition. It also follows that detection of this diversion is as positive for protracted diversion as for abrupt diversion. If that part of the in-process inventory that is only measurable by input/output analysis has an upper operating limit that differs from its normal operating limit by less than a significant safeguards quantity of the material in question, the IAEA's criteria for both quantity and timeliness can be met by a combination of input/output analysis to determine in-process hold-up during the campaign, together with a material balance over the campaign. The paper examines the possibility of applying this measure to process plants in general, discusses means of minimizing the in-process inventory that must be determined by input/output analysis, and the performance required of the input and output analysis. It concludes that with current precision of measurement and with one input and one output batch per day, each measured, the method would be satisfactory for a campaign lasting nearly a year and involving 6 tonnes of plutonium. The paper examines the considerable advantages in verification that would arise from limiting safeguards analyses to the two points of input and output. (author)
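    The book-keeping argued for above can be illustrated with a few lines of Python: the apparent in-process hold-up is tracked from the start of the campaign by input/output analysis and flagged when it drifts above the upper design limit outside which the plant cannot operate normally. The batch values and the limit are invented for illustration only.

      # invented batch data: measured plutonium (kg) per input and output batch, one pair per day
      inputs_kg  = [1.00, 0.98, 1.02, 1.01, 0.99]
      outputs_kg = [0.20, 0.95, 1.00, 1.05, 0.97]
      holdup_limit = 0.95          # assumed upper design limit on in-process hold-up (kg)

      holdup = 0.0
      for day, (inp, out) in enumerate(zip(inputs_kg, outputs_kg), start=1):
          holdup += inp - out      # apparent in-process hold-up from input/output analysis
          status = "ALERT: above operating limit" if holdup > holdup_limit else "normal"
          print(f"day {day}: apparent hold-up = {holdup:.2f} kg ({status})")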

  19. New product development processes for ICT-for-development projects

    CSIR Research Space (South Africa)

    McAlister, BN

    2012-08-01

    Full Text Available in developing regions of the world is increasing rapidly. A number of methods and practices have been used by organizations to develop and deliver such ICT solutions, but a need exists to formalize product development processes for use in the ICT...

  20. Measurement techniques in dry-powdered processing of spent nuclear fuels

    International Nuclear Information System (INIS)

    Bowers, D. L.; Hong, J.-S.; Kim, H.-D.; Persiani, P. J.; Wolf, S. F.

    1999-01-01

    High-performance liquid chromatography (HPLC) with inductively coupled plasma mass spectrometry (ICPMS) detection, α-spectrometry (α-S), and γ-spectrometry (γ-S) were used for the determination of nuclide content in five samples excised from a high-burnup fuel rod taken from a pressurized water reactor (PWR). The samples were prepared for analysis by dissolution of dry-powdered samples. The measurement techniques required no separation of the plutonium, uranium, and fission products. The sample preparation and analysis techniques showed promise for in-line analysis of highly-irradiated spent fuels in a dry-powdered process. The analytical results allowed the determination of fuel burnup based on 148Nd, Pu, and U content. A goal of this effort is to develop the HPLC-ICPMS method for direct fissile material accountancy in the dry-powdered processing of spent nuclear fuel.
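    The 148Nd-based burnup determination mentioned above follows a standard recipe, sketched below under assumed numbers: the number of fissions is the measured 148Nd content divided by its effective fission yield, and burnup is expressed as the fissioned fraction of the initial heavy metal (FIMA). The atom densities and the effective yield in the sketch are assumptions for illustration, not values from the paper.

      N_nd148 = 2.0e18   # measured 148Nd atoms per gram of dissolved sample (assumed value)
      N_heavy = 2.3e21   # measured residual U + Pu atoms per gram of sample (assumed value)
      Y_nd148 = 0.0169   # effective cumulative fission yield of 148Nd (assumption, ~1.7 %)

      fissions = N_nd148 / Y_nd148
      # initial heavy-metal atoms approximated by residual heavy atoms plus fissioned atoms
      fima_percent = 100.0 * fissions / (N_heavy + fissions)
      # ~200 MeV per fission corresponds to roughly 9.4 GWd per tonne of heavy metal per at% FIMA
      burnup_gwd_per_t = 9.4 * fima_percent
      print(f"burnup = {fima_percent:.2f} at% FIMA, about {burnup_gwd_per_t:.0f} GWd/tHM")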

  1. Developing a TPACK measurement instrument for 21st century pre-service teachers

    Directory of Open Access Journals (Sweden)

    Teemu Valtonen

    2015-11-01

    Future skills, so-called 21st century skills, emphasise collaboration, creativity, critical thinking, problem-solving and especially ICT skills (Voogt & Roblin, 2012). Teachers have to be able to use various pedagogical approaches and ICT in order to support the development of their students' 21st century skills (Voogt & Roblin, 2012). These skills, particularly ICT skills, pose challenges for teachers and teacher education. This paper focuses on developing an instrument for measuring pre-service teachers' knowledge related to ICT in the context of 21st century skills. Technological Pedagogical Content Knowledge (TPACK; Mishra & Koehler, 2006) was used as the theoretical framework for designing the instrument. While the TPACK framework is actively used, the instruments used to measure it have proven challenging. This paper outlines the results of the development process of the TPACK-21 instrument. A new assessment instrument was compiled and tested on pre-service teachers in Study 1 (N=94). Based on these results, the instrument was further developed and tested in Study 2 (N=267). The data of both studies were analysed using multiple quantitative methods in order to evaluate the psychometric properties of the instruments. The results provide insight into the challenges of the development process itself and also suggest new solutions to overcome these difficulties.

  2. Guidelines for competency development and measurement in rehabilitation psychology postdoctoral training.

    Science.gov (United States)

    Stiers, William; Barisa, Mark; Stucky, Kirk; Pawlowski, Carey; Van Tubbergen, Marie; Turner, Aaron P; Hibbard, Mary; Caplan, Bruce

    2015-05-01

    This study describes the results of a multidisciplinary conference (the Baltimore Conference) that met to develop consensus guidelines for competency specification and measurement in postdoctoral training in rehabilitation psychology. Forty-six conference participants were chosen to include representatives of rehabilitation psychology training and practice communities, representatives of psychology accreditation and certification bodies, persons involved in medical education practice and research, and consumers of training programs (students). Consensus education and training guidelines were developed that specify the key competencies in rehabilitation psychology postdoctoral training, and structured observation checklists were developed for their measurement. This study continues the development of more than 50 years of thinking about education and training in rehabilitation psychology and builds on the existing work to further advance the development of guidelines in this area. The conference developed aspirational guidelines for competency specification and measurement in rehabilitation psychology postdoctoral training (i.e., for studying the outcomes of these training programs). Structured observation of trainee competencies allows examination of actual training outcomes in relation to intended outcomes and provides a methodology for studying how program outcomes are related to program structures and processes so that program improvement can occur. Best practices in applying program evaluation research methods to the study of professional training programs are discussed. (c) 2015 APA, all rights reserved.

  3. Self-Calibrated In-Process Photogrammetry for Large Raw Part Measurement and Alignment before Machining.

    Science.gov (United States)

    Mendikute, Alberto; Yagüe-Fabra, José A; Zatarain, Mikel; Bertelsen, Álvaro; Leizea, Ibai

    2017-09-09

    Photogrammetry methods are being used more and more as a 3D technique for large scale metrology applications in industry. Optical targets are placed on an object and images are taken around it, where measuring traceability is provided by precise off-process pre-calibrated digital cameras and scale bars. According to the 2D target image coordinates, target 3D coordinates and camera views are jointly computed. One of the applications of photogrammetry is the measurement of raw part surfaces prior to its machining. For this application, post-process bundle adjustment has usually been adopted for computing the 3D scene. With that approach, a high computation time is observed, leading in practice to time consuming and user dependent iterative review and re-processing procedures until an adequate set of images is taken, limiting its potential for fast, easy-to-use, and precise measurements. In this paper, a new efficient procedure is presented for solving the bundle adjustment problem in portable photogrammetry. In-process bundle computing capability is demonstrated on a consumer grade desktop PC, enabling quasi real time 2D image and 3D scene computing. Additionally, a method for the self-calibration of camera and lens distortion has been integrated into the in-process approach due to its potential for highest precision when using low cost non-specialized digital cameras. Measurement traceability is set only by scale bars available in the measuring scene, avoiding the uncertainty contribution of off-process camera calibration procedures or the use of special purpose calibration artifacts. The developed self-calibrated in-process photogrammetry has been evaluated both in a pilot case scenario and in industrial scenarios for raw part measurement, showing a total in-process computing time typically below 1 s per image up to a maximum of 2 s during the last stages of the computed industrial scenes, along with a relative precision of 1/10,000 (e.g. 0.1 mm error in 1 m) with
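    As an illustration of the bundle adjustment at the core of such a system, the sketch below forms, for every (camera, target) observation, the residual between the observed image coordinates and the projection of the 3D target through the camera pose, and lets scipy's least_squares refine poses and target coordinates jointly. A full self-calibrating implementation would also include the focal length, principal point and lens distortion in the parameter vector and constrain the scale with the scale bars; the scene, intrinsics and noise levels below are synthetic assumptions.

      import numpy as np
      from scipy.optimize import least_squares
      from scipy.spatial.transform import Rotation

      def project(points, rvec, tvec, f=2000.0, c=(1000.0, 750.0)):
          """Pinhole projection of Nx3 points for one camera pose (axis-angle rvec, translation tvec)."""
          cam = points @ Rotation.from_rotvec(rvec).as_matrix().T + tvec
          return f * cam[:, :2] / cam[:, 2:3] + np.asarray(c)

      def residuals(params, n_cams, n_pts, cam_idx, pt_idx, observed_uv):
          poses = params[:6 * n_cams].reshape(n_cams, 6)   # per-camera [rvec | tvec]
          pts = params[6 * n_cams:].reshape(n_pts, 3)      # target coordinates
          res = []
          for c_i in range(n_cams):
              sel = cam_idx == c_i
              uv = project(pts[pt_idx[sel]], poses[c_i, :3], poses[c_i, 3:])
              res.append((uv - observed_uv[sel]).ravel())
          return np.concatenate(res)

      # synthetic scene: 20 optical targets seen by 3 cameras
      rng = np.random.default_rng(3)
      pts_true = rng.uniform(-0.5, 0.5, (20, 3)) + [0.0, 0.0, 5.0]
      poses_true = np.array([[0, 0, 0, 0, 0, 0],
                             [0, 0.2, 0, -1, 0, 0.2],
                             [0, -0.2, 0, 1, 0, 0.2]], dtype=float)
      cam_idx = np.repeat(np.arange(3), 20)
      pt_idx = np.tile(np.arange(20), 3)
      obs = np.vstack([project(pts_true, p[:3], p[3:]) for p in poses_true]) \
            + rng.normal(0, 0.3, (60, 2))                  # observed pixels with noise

      x0 = np.concatenate([(poses_true + rng.normal(0, 0.01, poses_true.shape)).ravel(),
                           (pts_true + rng.normal(0, 0.02, pts_true.shape)).ravel()])
      sol = least_squares(residuals, x0, args=(3, 20, cam_idx, pt_idx, obs))
      print("RMS reprojection error [px]:", np.sqrt(np.mean(sol.fun ** 2)))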

  4. Process Skill Assessment Instrument: Innovation to measure student’s learning result holistically

    Science.gov (United States)

    Azizah, K. N.; Ibrahim, M.; Widodo, W.

    2018-01-01

    Science process skills (SPS) are very important skills for students. However, it is undeniable that SPS are not a main concern in primary school learning. This research aimed to develop a valid, practical, and effective assessment instrument to measure students' SPS. The assessment instruments comprise a worksheet and a test. This development research used a one-group pre-test post-test design. Data were obtained by validation, observation, and testing to investigate the validity, practicality, and effectiveness of the instruments. Results showed that the validity of the assessment instruments is very good, the reliability is categorized as reliable, student SPS activities have a high percentage, and there is a significant improvement in students' SPS scores. It can be concluded that the assessment instruments for SPS are valid, practical, and effective to be used to measure students' SPS.

  5. Development an Instrument to Measure University Students' Attitude towards E-Learning

    Science.gov (United States)

    Mehra, Vandana; Omidian, Faranak

    2012-01-01

    The study of student's attitude towards e-learning can in many ways help managers better prepare in light of e-learning for the future. This article describes the process of the development of an instrument to measure university students' attitude towards e-learning. The scale was administered to 200 University students from two countries (India…

  6. A Measurable Model of the Creative Process in the Context of a Learning Process

    Science.gov (United States)

    Ma, Min; Van Oystaeyen, Fred

    2016-01-01

    The authors' aim was to arrive at a measurable model of the creative process by putting creativity in the context of a learning process. The authors aimed to provide a rather detailed description of how creative thinking fits in a general description of the learning process without trying to go into an analysis of a biological description of the…

  7. Development of interface technology between unit processes in E-Refining process

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S. H.; Lee, H. S.; Kim, J. G. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    The pyroprocess is composed mainly of four subprocesses: electrolytic reduction, electrorefining, electrowinning, and waste salt regeneration/solidification. The electrorefining process, one of the main pyroprocess steps used to recover useful elements from spent fuel, is under development by the Korea Atomic Energy Research Institute as a sub-process of the pyrochemical treatment of spent PWR fuel. The CERS (Continuous ElectroRefining System) is composed of unit processes such as an electrorefiner, a salt distiller, a melting furnace for the U-ingot and a U-chlorinator (UCl3 making equipment), as shown in Fig. 1. In this study, the interface technology between unit processes in the E-Refining system is investigated and developed for the establishment of an integrated E-Refining operation system as part of integrated pyroprocessing.

  8. Campbell and moment measures for finite sequential spatial processes

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette)

    2006-01-01

    We define moment and Campbell measures for sequential spatial processes, prove a Campbell-Mecke theorem, and relate the results to their counterparts in the theory of point processes. In particular, we show that any finite sequential spatial process model can be derived as the vector

  9. Measurement of the total proANP product in mammals by processing independent analysis

    DEFF Research Database (Denmark)

    Hunter, Ingrid; Rehfeld, Jens Frederik; Gøtze, Jens Peter

    2011-01-01

    (proANP) and its products irrespective of variable post-translational processing. The processing-independent assay (PIA) was developed raising mono-specific antibodies against the C-terminus of sequence 1-16 in proANP. The assay procedure included plasma extraction followed by tryptic cleavage, which...... releases the assay epitope from the N-terminal region. The PIA was tested in elderly patients with symptoms of heart failure (n=450), in pigs with acute myocardial infarction (n=21), and in normal dogs and dogs with heart failure (n=77). The epitope specificity permitted reliable measurement in man, dog...

  10. Dosimetric systems developed in Brazil for the radiation processes quality control

    International Nuclear Information System (INIS)

    Galante, Ana Maria Sisti; Campos, Leticia Lucente

    2011-01-01

    In order to apply new technologies to the industrial processing of materials with economy, efficiency, speed and high quality, ionizing radiation has been used in medicine, archaeology, chemistry, food preservation and other areas. For this reason, the dosimetry field seeks to improve current dosimeters and to develop new materials for application in the quality control of these processes. In Brazil, research in the dosimetry area is advancing rapidly, providing many different dosimetric systems. Chemical dosimetry is the most widely used technique in routine dosimetry, which requires fast and accurate responses. This technique involves the determination of absorbed dose by measuring radiation-induced chemical changes in the materials. Different dosimetric systems were developed at IPEN for application in radiation processing quality control, and all of them present excellent results; the low cost of these materials allows a more effective dose control, and therefore a larger area or volume can be monitored. (author).

  11. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  12. Development process of subjects society

    Directory of Open Access Journals (Sweden)

    A. V. Reshetnichenko

    2014-08-01

    Full Text Available The background is the defining role of people in the development of society and the almost complete absence of scientifically managed processes capable of ensuring the progressive development of individuals, social communities, nations, and civilization in general. In order to overcome the subjectivist, psychologizing, hyper-politicized and utilitarian approaches inherent in the methodology of knowledge, the authors propose a three-tier system of the business processes of society. The conceptual core of the approach consists in identifying the logical-mathematical laws of subjects at the primary, secondary and higher levels of development, as well as the mechanisms of their formation and practice. Solving these tasks allowed the authors to reveal the structure of both the ascending and descending processes of economic society. Thus, the analysis of individual carriers of upward change, such as the «individual», «individuality», «person» and «personality», showed that their activity is conditioned by «anthropometric», «ethnic», «demographic» and «ideological» mechanisms. The nature of the common carriers of downward change was revealed using the correlatively related «groups» and «communities», whose activity is due to «vitalistic», «educational», «professional» and «stratification» mechanisms. To disclose the nature and organization of the secondary and higher levels of economic society, the authors introduce the categories of «citizen», «generalist», «human space» and «human galactic», whose formation and development are driven by «status», «personological», «humanocentric», «institutional», «cluster», «contamination» and other mechanisms. Among the main achievements of the work, the authors consider the possibility of further development and practical implementation of new quality management processes of economic society based on multimodal dialectical logic.

  13. Advanced measurement and analysis of surface textures produced by micro-machining processes

    International Nuclear Information System (INIS)

    Bordatchev, Evgueni V; Hafiz, Abdullah M K

    2014-01-01

    Surface texture of a part or a product has significant effects on its functionality, physical-mechanical properties and visual appearance. In particular for miniature products, the implication of surface quality becomes critical owing to the presence of geometrical features with micro/nano-scale dimensions. Qualitative and quantitative assessments of surface texture are carried out predominantly by profile parameters, which are often insufficient to address the contribution of constituent spatial components with varied amplitudes and wavelengths. In this context, this article presents a novel approach for advanced measurement and analysis of profile average roughness (Ra) and its spatial distribution at different wavelength intervals. The applicability of the proposed approach was verified for three different surface topographies prepared by grinding, laser micro-polishing and micro-milling processes. From the measurement and analysis results, the Ra(λ) spatial distribution was found to be an effective measure of revealing the contributions of various spatial components within specific wavelength intervals towards formation of the entire surface profile. In addition, the approach was extended to the measurement and analysis of the areal average roughness Sa(λ) spatial distribution within different wavelength intervals. Besides, the proposed method was demonstrated to be a useful technique in developing a functional correlation between a manufacturing process and its corresponding surface profile. (paper)
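
    The band-limited roughness idea above can be illustrated numerically. The sketch below is not the authors' implementation: it splits an evenly sampled profile into wavelength bands with a simple FFT band-pass filter and computes Ra within each band; the profile, sampling step and band edges are hypothetical.

    # Minimal sketch (not the authors' implementation): band-limited average
    # roughness Ra(lambda) of an evenly sampled surface profile, using a plain
    # FFT band-pass decomposition.
    import numpy as np

    def ra_per_band(z, dx, band_edges):
        """Return Ra computed from the profile content within each wavelength band.

        z          : 1-D array of profile heights (same units as Ra)
        dx         : sampling step along the profile
        band_edges : list of (lambda_min, lambda_max) wavelength intervals
        """
        z = np.asarray(z, dtype=float) - np.mean(z)
        freq = np.fft.rfftfreq(z.size, d=dx)        # spatial frequencies (1/length)
        spectrum = np.fft.rfft(z)
        wavelength = np.full_like(freq, np.inf)
        wavelength[1:] = 1.0 / freq[1:]             # skip the DC term
        results = {}
        for lam_min, lam_max in band_edges:
            mask = (wavelength >= lam_min) & (wavelength < lam_max)
            band_profile = np.fft.irfft(np.where(mask, spectrum, 0.0), n=z.size)
            results[(lam_min, lam_max)] = np.mean(np.abs(band_profile))  # Ra of this band
        return results

    # Hypothetical example: a 10 mm trace sampled every 1 um, split into three bands (mm).
    x = np.arange(0, 10_000) * 1e-3
    profile = 0.2 * np.sin(2 * np.pi * x / 0.08) + 0.05 * np.random.randn(x.size)
    print(ra_per_band(profile, dx=1e-3, band_edges=[(0.01, 0.1), (0.1, 1.0), (1.0, 10.0)]))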

  14. Integrating ergonomics into the product development process

    DEFF Research Database (Denmark)

    Broberg, Ole

    1997-01-01

    and production engineers regarding information sources in problem solving, communication pattern, perception of ergonomics, motivation and requests to support tools and methods. These differences and the social and organizational contexts of the development process must be taken into account when considering......A cross-sectional case study was performed in a large company producing electro-mechanical products for industrial application. The purpose was to elucidate conditions and strategies for integrating ergonomics into the product development process thereby preventing ergonomic problems at the time...... of manufacture of new products. In reality the product development process is not a rational problem solving process and does not proceed in a sequential manner as described in engineering models. Instead it is a complex organizational process involving uncertainties, iterative elements and negotiation between...

  15. Development of the new data transmission and processing equipment for radiation surveillance

    International Nuclear Information System (INIS)

    Suzuki, Shintaro; Takahashi, Kouichi; Suganami, Jun; Kawai, Toshiaki

    2004-01-01

    In the Mito Atomic Energy Office, which belongs to the Ministry of Education, Culture, Sports, Science and Technology, as part of the environmental safety measures for the nuclear institutions in the Ibaraki area, regular surveillance is performed of the environmental monitoring data measured at the Japan Nuclear Cycle Development Institute (JNC) and the Japan Atomic Energy Research Institute (JAERI), which are the main facilities in the Oarai and Tokai areas. For the purpose of strengthening environmental radiation surveillance, the data transmission and processing equipment for radiation surveillance was updated in fiscal year 2003, and the new equipment has been in actual operation since March 2004. In this paper, the features and functions of the new data transmission and processing equipment are introduced. (author)

  16. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using....... The comparison of model prediction uncertainties with reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column......, the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce most critical uncertainties....

  17. FY 1998 achievement report on the photon measuring/processing technology (R and D of the photon measuring/processing technology); 1998 nendo foton keisoku kako gijutsu seika hokokusho. Foton keisoku kako gijutsu no kenkyu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    In this project, a survey was made of recent technology trends in photon (laser) measurement, processing and generation, and of the possibility of applying photon technology to fields other than measurement and processing, in order to clarify the technical subjects for establishing and commercializing photon technology. Also, for the purpose of reducing energy cost by improving the performance of laser processing devices, prolonging their life and reducing operational cost, the following were developed: (1) a high-efficiency laser processing device; (2) a high-conversion-efficiency laser diode. In (1), a laser generating device with a Yb:YAG crystal as the oscillating medium was trial-manufactured, and a power of 35 W and an optical-optical conversion efficiency of 7.1% were obtained. A comparison was also made between the Yb:YAG laser and the Nd:YAG laser, and it was made clear that, as an industrial-use high-power laser, the Nd:YAG laser has the advantage. In (2), technology for the simultaneous uniform growth of multiple high-conversion-efficiency LD crystal wafers and technology for their evaluation were developed. Namely, highly uniform crystal wafers with variations among wafers of ±4% were obtained using the introduced high-efficiency crystal growth device and high-efficiency thin film evaluation device. (NEDO)

  18. Optimization of a micro-scale, high throughput process development tool and the demonstration of comparable process performance and product quality with biopharmaceutical manufacturing processes.

    Science.gov (United States)

    Evans, Steven T; Stewart, Kevin D; Afdahl, Chris; Patel, Rohan; Newell, Kelcy J

    2017-07-14

    In this paper, we discuss the optimization and implementation of a high throughput process development (HTPD) tool that utilizes commercially available micro-liter sized column technology for the purification of multiple clinically significant monoclonal antibodies. Chromatographic profiles generated using this optimized tool are shown to overlay with comparable profiles from the conventional bench-scale and clinical manufacturing scale. Further, all product quality attributes measured are comparable across scales for the mAb purifications. In addition to supporting chromatography process development efforts (e.g., optimization screening), comparable product quality results at all scales make this tool an appropriate scale model to enable purification and product quality comparisons of HTPD bioreactor conditions. The ability to perform up to 8 chromatography purifications in parallel with reduced material requirements per run creates opportunities for gathering more process knowledge in less time. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  19. DEVELOPMENT AN INSTRUMENT TO MEASURE UNIVERSITY STUDENTS' ATTITUDE TOWARDS E-LEARNING

    OpenAIRE

    Vandana MEHRA; Faranak OMIDIAN

    2012-01-01

    The study of students' attitudes towards e-learning can in many ways help managers better prepare for the future in light of e-learning. This article describes the process of developing an instrument to measure university students' attitudes towards e-learning. The scale was administered to 200 university students from two countries (India and Iran). The 83-item attitude-towards-e-learning scale was developed on six domains: Perceived usefulness; Intention to adopt e-learning; Ease o...

  20. The measurement problem on classical diffusion process: inverse method on stochastic processes

    International Nuclear Information System (INIS)

    Bigerelle, M.; Iost, A.

    2004-01-01

    In a high number of diffusive systems, measurements are processed to calculate material parameters such as diffusion coefficients, or to verify the accuracy of mathematical models. However, the precision of the parameter determination, or of the model's relevance, depends on the location of the measurement itself. The aim of this paper is first to analyse, for a one-dimensional system, the precision of the measurement in relation to its location by an inverse-problem algorithm, and secondly to examine the physical meaning of the results. Statistical mechanics considerations show that, beyond a time-distance criterion, the measurement becomes uncertain whatever the initial conditions. The criterion proves that this chaotic mode is related to the production of anti-entropy at a mesoscopic scale, which is in violation of quantum theory about measurement
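
    As a loose, toy illustration of the inverse problem discussed above (not the authors' algorithm or criterion), the sketch below recovers a diffusion coefficient D from noisy concentration readings taken at a single, arbitrarily chosen location, using the one-dimensional semi-infinite solution C/C0 = erfc(x / (2·sqrt(D·t))); all numerical values are invented.

    # Toy inverse-problem sketch: estimate D by least squares from noisy
    # concentration readings at one fixed location x_meas.
    import numpy as np
    from scipy.special import erfc
    from scipy.optimize import curve_fit

    D_true = 1e-9                        # m^2/s, hypothetical
    x_meas = 2e-4                        # measurement location, m
    times = np.linspace(1e2, 1e5, 40)    # s

    def model(t, D):
        # Semi-infinite 1-D diffusion with a constant-concentration boundary.
        return erfc(x_meas / (2.0 * np.sqrt(D * t)))

    rng = np.random.default_rng(2)
    data = model(times, D_true) + 0.01 * rng.standard_normal(times.size)

    D_fit, cov = curve_fit(model, times, data, p0=[5e-10])
    print(f"estimated D = {D_fit[0]:.2e} m^2/s, std = {np.sqrt(cov[0, 0]):.1e}")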

  1. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  2. Extending the agile development process to develop acceptably secure software

    NARCIS (Netherlands)

    Ben Othmane, L.; Angin, P.; Weffers, H.T.G.; Bhargava, B.

    2013-01-01

    The agile software development approach makes developing secure software challenging. Existing approaches for extending the agile development process, which enables incremental and iterative software development, fall short of providing a method for efficiently ensuring the security of the software

  3. Development of computer-controlled ultrasonic image processing system for severe accidents research

    International Nuclear Information System (INIS)

    Koo, Kil Mo; Kang, Kyung Ho; Kim, Jong Tai; Kim, Jong Whan; Cho, Young Ro; Ha, Kwang Soon; Park, Rae Jun; Kim, Sang Baik; Kim, Hee Dong; Sim, Chul Moo

    2000-07-01

    In order to verify the in-vessel corium cooling mechanism, the LAVA (Lower-plenum Arrested Vessel Attack) experiment is being performed as a first-stage proof-of-principle test. The aims of this study are to find a gap formation between the corium melt and the reactor lower head vessel, to verify the principle of the gap formation and to analyze the effect of the gap formation on the thermal behavior of the corium melt and the lower plenum. This report aims at developing a computer-controlled image signal processing system which is able to improve visualization and to measure the gap distribution as a 3-dimensional planar image using a time-domain signal analysis method, as a part of the ultrasonic pulse echo methods, and a computerized position control system. An image signal processing system is developed by independently developing an ultrasonic image signal processing technique and a PC-controlled position control system and then combining both systems

  4. Development of computer-controlled ultrasonic image processing system for severe accidents research

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Kil Mo; Kang, Kyung Ho; Kim, Jong Tai; Kim, Jong Whan; Cho, Young Ro; Ha, Kwang Soon; Park, Rae Jun; Kim, Sang Baik; Kim, Hee Dong; Sim, Chul Moo

    2000-07-01

    In order to verify the in-vessel corium cooling mechanism, the LAVA (Lower-plenum Arrested Vessel Attack) experiment is being performed as a first-stage proof-of-principle test. The aims of this study are to find a gap formation between the corium melt and the reactor lower head vessel, to verify the principle of the gap formation and to analyze the effect of the gap formation on the thermal behavior of the corium melt and the lower plenum. This report aims at developing a computer-controlled image signal processing system which is able to improve visualization and to measure the gap distribution as a 3-dimensional planar image using a time-domain signal analysis method, as a part of the ultrasonic pulse echo methods, and a computerized position control system. An image signal processing system is developed by independently developing an ultrasonic image signal processing technique and a PC-controlled position control system and then combining both systems.

  5. Design and measurement of signal processing system for cavity beam position monitor

    International Nuclear Information System (INIS)

    Wang Baopeng; Leng Yongbin; Yu Luyang; Zhou Weimin; Yuan Renxian; Chen Zhichu

    2013-01-01

    In this paper, in order to process the output signals of a cavity beam position monitor (CBPM), we develop a signal processing system based on a digital intermediate-frequency receiver architecture, which consists of a radio frequency (RF) front end and a high-speed data acquisition board. The beam position resolution of the CBPM signal processing system is better than 1 μm. Two signal processing algorithms, fast Fourier transform (FFT) and digital down converter (DDC), are evaluated offline using the MATLAB platform; with a CW input signal at -16 dBm, they achieve position resolutions of 0.31 μm and 0.10 μm, respectively. The DDC algorithm, chosen for its good compatibility, is downloaded into the FPGA to realize online measurement, reaching a position resolution of 0.49 μm due to truncation error. The whole system works well and the performance meets the design target. (authors)
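
    For orientation only, the following sketch shows the generic digital down-converter idea referred to above: mix the sampled intermediate-frequency signal to baseband with a complex local oscillator, low-pass filter it, and read out amplitude and phase. It is not the authors' FPGA or MATLAB code; the sample rate, intermediate frequency and filter cutoff are assumptions.

    # Illustrative DDC sketch on a synthetic IF signal (values are assumptions).
    import numpy as np
    from scipy.signal import firwin, lfilter

    fs = 100e6          # hypothetical sample rate, Hz
    f_if = 20e6         # hypothetical intermediate frequency, Hz
    t = np.arange(4096) / fs
    signal = 0.5 * np.cos(2 * np.pi * f_if * t + 0.3) + 0.01 * np.random.randn(t.size)

    # Mix to baseband with a complex local oscillator.
    lo = np.exp(-2j * np.pi * f_if * t)
    baseband = signal * lo

    # Low-pass filter to reject the 2*f_if image; the cutoff is an assumption.
    taps = firwin(numtaps=128, cutoff=2e6, fs=fs)
    filtered = lfilter(taps, 1.0, baseband)

    # Discard the filter transient, then read amplitude and phase.
    steady = filtered[256:]
    amplitude = 2 * np.mean(np.abs(steady))   # factor 2 restores the real-signal amplitude
    phase = np.angle(np.mean(steady))
    print(f"amplitude ~ {amplitude:.3f}, phase ~ {phase:.3f} rad")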

  6. Development of a calorimetric system for electron beam dosimetry in radiation processing

    International Nuclear Information System (INIS)

    Banados P, H.E.

    1994-01-01

    A calorimetric system for electron beam dosimetry in radiation processing was developed. The system is composed of a graphite core calorimeter, the temperature measuring and electrical calibrating instrumentation, a microcomputer and the software for the system automation. The research aimed at the optimization of the project parameters, the development of advanced methodologies for calibrating the temperature sensor, the determination of the thermal capacity as a function of the temperature, the measurement of the absorbed dose, and the development of the software needed for the system operation. The operating range extends from 0.1 kGy to 30 kGy. The uncertainty in the measurement of the absorbed dose was estimated to be ± 1.8% at the 95% confidence level. Comparative tests of the absorbed dose measurements were made using the IPEN electron accelerator. The results obtained showed an excellent agreement between the absorbed dose determined by the calorimeter and the absorbed dose calculated from the nominal power delivered by the accelerator. (author). 67 refs, 63 figs, 2 tabs
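
    The core relation behind such a graphite calorimeter can be stated compactly: the absorbed dose equals the specific heat capacity of the core times the radiation-induced temperature rise, assuming quasi-adiabatic conditions. The sketch below uses an illustrative room-temperature value for graphite's specific heat; the numbers are not taken from the paper.

    # Minimal sketch of the basic calorimetric relation: dose (Gy = J/kg) from
    # the core temperature rise, assuming no heat loss during the measurement.
    def absorbed_dose_gray(delta_t_kelvin, specific_heat_j_per_kg_k=710.0):
        """Dose = c_p * delta_T for a quasi-adiabatic graphite core."""
        return specific_heat_j_per_kg_k * delta_t_kelvin

    # A 2.0 K rise in a graphite core (c_p ~ 710 J/(kg*K) near room temperature)
    # corresponds to roughly 1.4 kGy.
    print(absorbed_dose_gray(2.0))   # -> 1420.0 Gy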

  7. Development of an integrated control and measurement system

    International Nuclear Information System (INIS)

    Manges, W.W.

    1984-03-01

    This thesis presents a tutorial on the issues involved in the development of a minicomputer-based, distributed intelligence data acquisition and process control system to support complex experimental facilities. The particular system discussed in this thesis is under development for the Atomic Vapor Laser Isotope Separation (AVLIS) Program at the Oak Ridge Gaseous Diffusion Plant (ORGDP). In the AVLIS program, we were careful to integrate the computer sections of the implementation into the instrumentation system rather than adding them as an appendage. We then addressed the reliability and availability of the system as a separate concern. Thus, our concept of an integrated control and measurement (ICAM) system forms the basis for this thesis. This thesis details the logic and philosophy that went into the development of this system and explains why the commercially available turn-key systems generally are not suitable. Also, the issues involved in the specification of the components for such an integrated system are emphasized

  8. Process developments in gasoil hydrotreating

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, R.C.; Kinley, D.H.; Wood, M.A. [Davy Process Technology Limited, London (United Kingdom)

    1997-07-01

    Changing demand patterns and legislation increase the pressure upon hydrotreating capacities at many refineries. To meet these pressures, improvements have been and will be necessary not only in catalysts, but also in the hydrotreating process. On the basis of its hydrogenation experience, Davy Process Technology has developed and tested a number of concepts aimed at improving the effectiveness of the basic process - enabling economic deep desulfurisation and opening up the potential for an integrated HDS/HDA flowsheet using sulphur-tolerant HDA catalysts.

  9. Development of the effectiveness measure for an advanced alarm system using signal detection theory

    International Nuclear Information System (INIS)

    Park, J.K.; Choi, S.S.; Hong, J.H.; Chang, S.H.

    1997-01-01

    Since the many alarms that are activated during major process deviations or accidents in nuclear power plants can have negative effects on operators, various types of advanced alarm systems that can select the alarms important for the identification of a process deviation have been developed to reduce the operator's workload. However, an irrelevant selection of important alarms could distract the operator from correctly identifying the process deviation. Therefore, to evaluate the effectiveness of an advanced alarm system, a tradeoff between the alarm reduction rate (how many alarms are reduced?) and the informativeness (how many important alarms that are conducive to identifying the process deviation are provided?) of the advanced alarm system should be considered. In this paper, a new measure is proposed to evaluate the effectiveness of an advanced alarm system with regard to the identification of process deviations. Here, the effectiveness measure is the combination of an informativeness measure and the reduction rate; the informativeness measure captures the information processing performed by the advanced alarm system, including wrong rejection and wrong acceptance, and it can be calculated using signal detection theory (SDT). The effectiveness of the prototype alarm system was evaluated using a loss of coolant accident (LOCA) scenario, and the validity of the effectiveness measure was investigated from two types of operator response: the identification accuracy and the operator's preference for the identification of LOCA
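
    To make the idea concrete, the sketch below computes the classical SDT sensitivity index d' from a hit rate and a false-alarm rate and combines it with the alarm reduction rate. The multiplicative combination and all numbers are illustrative assumptions, not the paper's exact formulation.

    # Illustrative sketch: SDT informativeness (d') combined with the alarm
    # reduction rate. Not the paper's exact effectiveness measure.
    from statistics import NormalDist

    def d_prime(hit_rate, false_alarm_rate, eps=1e-3):
        """Sensitivity index d' = z(hit rate) - z(false-alarm rate)."""
        z = NormalDist().inv_cdf
        hit_rate = min(max(hit_rate, eps), 1 - eps)              # avoid infinite z-scores
        false_alarm_rate = min(max(false_alarm_rate, eps), 1 - eps)
        return z(hit_rate) - z(false_alarm_rate)

    def effectiveness(hit_rate, false_alarm_rate, reduction_rate):
        """Hypothetical combined score: informativeness weighted by alarm reduction."""
        return d_prime(hit_rate, false_alarm_rate) * reduction_rate

    # Example: 90 % of important alarms kept, 10 % of unimportant ones wrongly kept,
    # and 70 % of the raw alarm flood suppressed.
    print(effectiveness(hit_rate=0.90, false_alarm_rate=0.10, reduction_rate=0.70))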

  10. Advances in the Process Development of Biocatalytic Processes

    DEFF Research Database (Denmark)

    Tufvesson, Pär; Lima Ramos, Joana; Al-Haque, Naweed

    2013-01-01

    Biocatalysis is already established in chemical synthesis on an industrial scale, in particular in the pharmaceutical sector. However, the wider implementation of biocatalysis is currently hindered by the extensive effort required to develop a competitive process. In order that resources spent...

  11. PROGame: A process framework for serious game development for motor rehabilitation therapy.

    Science.gov (United States)

    Amengual Alcover, Esperança; Jaume-I-Capó, Antoni; Moyà-Alcover, Biel

    2018-01-01

    Serious game development for rehabilitation therapy is becoming increasingly popular because of the motivational advantages that these types of applications provide. Consequently, the need for a common process framework for this category of software development has become increasingly evident. The goal is to guarantee that products are developed and validated by following a coherent and systematic method that leads to high-quality serious games. This paper introduces a new process framework for the development of serious games for motor rehabilitation therapy. We introduce the new model and demonstrate its application for the development of a serious game for the improvement of the balance and postural control of adults with cerebral palsy. The development of this application has been facilitated by two technological transfer contracts and is being exploited by two different organizations. According to clinical measurements, patients using the application improved from high fall risk to moderate fall risk. We believe that our development strategy can be useful not only for motor rehabilitation therapy, but also for the development of serious games in many other rehabilitation areas.

  12. Measuring the Amount of Mechanical Vibration During Lathe Processing

    Directory of Open Access Journals (Sweden)

    Štefánia SALOKYOVÁ

    2015-06-01

    Full Text Available The article provides basic information regarding the measurement and evaluation of mechanical vibration during the processing of material by turning. Lathe processing can be characterized as the removal of material by precisely defined tools. The results of the experimental part are the values of the vibration acceleration amplitude measured by a piezoelectric sensor on the bearing housing of the lathe. A set of new findings and conclusions is formulated based on the analysis of the created graphical dependencies.

  13. Psychosocial work characteristics of personal care and service occupations: a process for developing meaningful measures for a multiethnic workforce.

    Science.gov (United States)

    Hoppe, Annekatrin; Heaney, Catherine A; Fujishiro, Kaori; Gong, Fang; Baron, Sherry

    2015-01-01

    Despite their rapid increase in number, workers in personal care and service occupations are underrepresented in research on psychosocial work characteristics and occupational health. Some of the research challenges stem from the high proportion of immigrants in these occupations. Language barriers, low literacy, and cultural differences as well as their nontraditional work setting (i.e., providing service for one person in his/her home) make generic questionnaire measures inadequate for capturing salient aspects of personal care and service work. This study presents strategies for (1) identifying psychosocial work characteristics of home care workers that may affect their occupational safety and health and (2) creating survey measures that overcome barriers posed by language, low literacy, and cultural differences. We pursued these aims in four phases: (Phase 1) Six focus groups to identify the psychosocial work characteristics affecting the home care workers' occupational safety and health; (Phase 2) Selection of questionnaire items (i.e., questions or statements to assess the target construct) and first round of cognitive interviews (n = 30) to refine the items in an iterative process; (Phase 3) Item revision and second round of cognitive interviews (n = 11); (Phase 4) Quantitative pilot test to ensure the scales' reliability and validity across three language groups (English, Spanish, and Chinese; total n = 404). Analysis of the data from each phase informed the nature of subsequent phases. This iterative process ensured that survey measures not only met the reliability and validity criteria across groups, but were also meaningful to home care workers. This complex process is necessary when conducting research with nontraditional and multilingual worker populations.

  14. Developments on uranium enrichment processes in France

    International Nuclear Information System (INIS)

    Frejacques, C.; Gelee, M.; Massignon, D.; Plurien, P.

    1977-01-01

    Gaseous diffusion has so far been the main source of supply for enriched uranium and it is only recently that the gas centrifuge came into the picture. Numerous other isotope separation processes have been considered or are being assessed, and there is nothing to exclude the future use of a new process. Developments on likely new processes have been carried out by many organizations both governmental and private. The French Commissariat a l'energie atomique, besides their very extensive endeavours already devoted to gaseous diffusion, have studied and developed the gas centrifuge, chemical exchange, aerodynamic and selective photoexcitation processes. The gaseous diffusion process, selected by Eurodif for the Tricastin plant, and which will also be used by Coredif, is discussed in another paper in these Proceedings. This process is the technico-economic yardstick on which our comparisons are based. Within the limits of their development level, processes are compared on the basis of the separative work cost components: specific investment, specific power consumption and power cost, and specific operating and maintenance costs. (author)

  15. Assessment and Development of Engineering Design Processes

    DEFF Research Database (Denmark)

    Ulrikkeholm, Jeppe Bjerrum

    , the engineering companies need to have efficient engineering design processes in place, so they can design customised product variants faster and more efficiently. It is however not an easy task to model and develop such processes. To conduct engineering design is often a highly iterative, ill-defined and complex...... the process can be fully understood and eventually improved. Taking its starting point in this proposition, the outcome of the research is an operational 5-phased procedure for assessing and developing engineering design processes through integrated modelling of product and process, designated IPPM......, and eventually the results are discussed, overall conclusions are made and future research is proposed. The results produced throughout the research project are developed in close collaboration with the Marine Low Speed business unit within the company MAN Diesel & Turbo. The business unit is the world market...

  16. Do the Czech Production Plants Measure the Performance of Energy Processes?

    Directory of Open Access Journals (Sweden)

    Zuzana Tučková

    2016-04-01

    Full Text Available The research focused on the current situation of performance measurement of energy processes in Czech production plants. The results are backed up by previous research aimed at the use of performance measurement methods across the whole organizational structure of the plants. Although most big industrial companies declared the use of modern performance measurement methods, the previous research showed that this is not entirely true. The bigger differences were found in the energy area, i.e. energy processes. The authors compared the energy concepts of the European Union (EU) and the Czech Republic (CZ), which are very different and do not give managers a clear basis for decisions on the process management strategy for energy processes in their companies. The next step included an analysis of the energy departments. A significant part of the energy processes in the production plants is still not mapped, described and summarized into a single methodical manual for management and performance measurement.

  17. Developing a community-based flood resilience measurement standard

    Science.gov (United States)

    Keating, Adriana; Szoenyi, Michael; Chaplowe, Scott; McQuistan, Colin; Campbell, Karen

    2015-04-01

    literature on resilience in the area of disaster risk (see corresponding abstract of another session). The research gap, which was also highlighted in the 2012 National Academies of Sciences Paper (Disasters, Committee on Science and Public Policy, & Academies, 2012), is the lack of a consistent way to measure resilience, which is a complex systems concept, across different communities and over time. Without this measurement, evaluating the impact of projects, programs and policies on a community's resilience cannot be consistently made. In turn, the relative costs and benefits of potential interventions cannot be properly assessed to determine those which ought to be prioritized. The measurement of resilience contains both theoretical and practical components, but much of the research to date has been limited to the theoretical realm. There is a need for a set of indicators that can be systematically collected in the field to practically measure resilience. This presentation will examine both the theoretical and practical challenges this involves, and how this is being approached through a unique alliance between the research community, a private partner and field practitioners. We aim to help build consistency amongst those working on assessing and prioritizing effective resilience strategies. The Alliance between research partners and NGOs will be highlighted to show how such collaborations can support a continuous learning process in communities and contribute to improved flood resilience at community level and beyond. This includes the development and use of innovative evaluation tools that can aid communities in prioritizing projects and policies as well as demonstrating effectiveness to donors.

  18. Talking Back to the Media Ideal: The Development and Validation of the Critical Processing of Beauty Images Scale

    Science.gov (United States)

    Engeln-Maddox, Renee; Miller, Steven A.

    2008-01-01

    This article details the development of the Critical Processing of Beauty Images Scale (CPBI) and studies demonstrating the psychometric soundness of this measure. The CPBI measures women's tendency to engage in critical processing of media images featuring idealized female beauty. Three subscales were identified using exploratory factor analysis…

  19. Broadband Outdoor Radiometer Calibration Process for the Atmospheric Radiation Measurement Program

    Energy Technology Data Exchange (ETDEWEB)

    Dooraghi, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-09-01

    The Atmospheric Radiation Measurement program (ARM) maintains a fleet of monitoring stations to aid in the improved scientific understanding of the basic physics related to radiative feedback processes in the atmosphere, particularly the interactions among clouds and aerosols. ARM obtains continuous measurements and conducts field campaigns to provide data products that aid in the improvement and further development of climate models. All of the measurement campaigns include a suite of solar measurements. The Solar Radiation Research Laboratory at the National Renewable Energy Laboratory supports ARM's full suite of stations in a number of ways, including troubleshooting issues that arise as part of the data-quality reviews; managing engineering changes to the standard setup; and providing calibration services and assistance to the full fleet of solar-related instruments, including pyranometers, pyrgeometers, pyrheliometers, as well as the temperature/relative humidity probes, multimeters, and data acquisition systems that are used in the calibrations performed at the Southern Great Plains Radiometer Calibration Facility. This paper discusses all aspects related to the support provided to the calibration of the instruments in the solar monitoring fleet.

  20. Development of acidic processes for decontaminating LMFBR components

    Energy Technology Data Exchange (ETDEWEB)

    Hill, E F [Rockwell International, Atomics International Division, Canoga Park (United States); Colburn, R P; Lutton, J M; Maffei, H P [Hanford Engineering Development Laboratory, Richland (United States)

    1978-08-01

    The objective of the DOE decontamination program is to develop a well characterized chemical decontamination process for application to LMFBR primary system components that subsequently permits contact maintenance and allows requalification of the components for reuse in reactors. The paper describes the subtasks of deposit characterization, development of requalification and process acceptance criteria, development of process evaluation techniques and studies which led to a new acidic process for decontaminating 304 stainless steel hot leg components.

  1. Development of acidic processes for decontaminating LMFBR components

    International Nuclear Information System (INIS)

    Hill, E.F.; Colburn, R.P.; Lutton, J.M.; Maffei, H.P.

    1978-01-01

    The objective of the DOE decontamination program is to develop a well characterized chemical decontamination process for application to LMFBR primary system components that subsequently permits contact maintenance and allows requalification of the components for reuse in reactors. The paper describes the subtasks of deposit characterization, development of requalification and process acceptance criteria, development of process evaluation techniques and studies which led to a new acidic process for decontaminating 304 stainless steel hot leg components

  2. Development of Dual-Retrieval Processes in Recall: Learning, Forgetting, and Reminiscence

    Science.gov (United States)

    Brainerd, C. J.; Aydin, C.; Reyna, V. F.

    2012-01-01

    We investigated the development of dual-retrieval processes with a low-burden paradigm that is suitable for research with children and neurocognitively impaired populations (e.g., older adults with mild cognitive impairment or dementia). Rich quantitative information can be obtained about recollection, reconstruction, and familiarity judgment by defining a Markov model over simple recall tasks like those that are used in clinical neuropsychology batteries. The model measures these processes separately for learning, forgetting, and reminiscence. We implemented this procedure in some developmental experiments, whose aims were (a) to measure age changes in recollective and nonrecollective retrieval during learning, forgetting, and reminiscence and (b) to measure age changes in content dimensions (e.g., taxonomic relatedness) that affect the two forms of retrieval. The model provided excellent fits in all three domains. Concerning (a), recollection, reconstruction, and familiarity judgment all improved during the child-to-adolescent age range in the learning domain, whereas only recollection improved in the forgetting domain, and the processes were age-invariant in the reminiscence domain. Concerning (b), although some elements of the adult pattern of taxonomic relatedness effects were detected by early adolescence, the adult pattern differs qualitatively from corresponding patterns in children and adolescents. PMID:22778491

  3. Controlling the Instructional Development Process. Training Development and Research Center Project Number Fifteen.

    Science.gov (United States)

    Sleezer, Catherine M.; Swanson, Richard A.

    Process control is a way of training managers in business and industry to plan, monitor, and communicate the instructional development process of training projects. Two simple and useful tools that managers use in controlling the process of instructional development are the Process Control Planning Sheet and the Process Control Record. The Process…

  4. MEASURING PRODUCTIVITY OF SOFTWARE DEVELOPMENT TEAMS

    Directory of Open Access Journals (Sweden)

    Goparaju Purna Sudhakar

    2012-02-01

    Full Text Available This paper gives an exhaustive literature review of the techniques and models available to measure the productivity of software development teams. Definition of productivity, measuring individual programmer's productivity, and measuring software development team productivity are discussed. Based on the literature review it was found that software productivity measurement can be done using SLOC (Source Lines of Code), function points, use case points, object points, and feature points. Secondary research findings indicate that the team size, response time, task complexity, team climate and team cohesion have an impact on software development team productivity. A list of factors affecting software development team productivity is studied and reviewed.
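
    The size-based measures listed above all reduce to the same generic ratio: delivered size (SLOC, function points, use case points, and so on) per unit of effort. A minimal sketch with invented figures:

    # Generic size-based productivity ratio; the figures are made up for illustration.
    def productivity(size_delivered, person_months):
        """Output size per person-month (SLOC/PM, FP/PM, use-case points/PM, ...)."""
        if person_months <= 0:
            raise ValueError("person_months must be positive")
        return size_delivered / person_months

    team_sloc = 24_000          # lines of code delivered in the release
    effort_pm = 30              # total effort, person-months
    print(f"{productivity(team_sloc, effort_pm):.0f} SLOC per person-month")   # -> 800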

  5. Development of a New Fundamental Measuring Technique for the Accurate Measurement of Gas Flowrates by Means of Laser Doppler Anemometry

    Science.gov (United States)

    Dopheide, D.; Taux, G.; Krey, E.-A.

    1990-01-01

    In the Physikalisch-Technische Bundesanstalt (PTB), a research test facility for the accurate measurement of gas (volume and mass) flowrates has been set up in the last few years on the basis of a laser Doppler anemometer (LDA) with a view to directly measuring gas flowrates with a relative uncertainty of only 0.1%. To achieve this, it was necessary to develop laser Doppler anemometry into a precision measuring technique and to carry out detailed investigations on stationary low-turbulence nozzle flow. The process-computer controlled test facility covers the flowrate range from 100 to 4000 m3/h (~0.03 - 1.0 m3/s), any flowrate being measured directly, immediately and without staggered arrangement of several flow meters. After the development was completed, several turbine-type gas meters were calibrated and international comparisons carried out. The article surveys the most significant aspects of the work and provides an outlook on future developments with regard to the miniaturization of optical flow and flowrate sensors for industrial applications.
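
    The underlying LDA relation can be illustrated with the standard dual-beam (fringe-model) formula, v = f_D · λ / (2·sin(θ/2)); the wavelength, beam intersection angle and Doppler frequency below are illustrative and not taken from the PTB facility.

    # Sketch of the dual-beam LDA velocity relation with illustrative values.
    import math

    def lda_velocity(doppler_freq_hz, wavelength_m, full_angle_rad):
        """v = f_D * lambda / (2 * sin(theta/2)) for a dual-beam LDA."""
        return doppler_freq_hz * wavelength_m / (2.0 * math.sin(full_angle_rad / 2.0))

    # He-Ne laser (632.8 nm), 10 degree beam intersection angle, 2.75 MHz Doppler burst.
    v = lda_velocity(2.75e6, 632.8e-9, math.radians(10.0))
    print(f"velocity ~ {v:.2f} m/s")    # ~ 9.98 m/s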

  6. Development of coal partial hydropyrolysis process

    Energy Technology Data Exchange (ETDEWEB)

    Hideaki Yabe; Takafumi Kawamura; Kohichiroh Gotoh; Akemitsu Akimoto [Nippon Steel Corporation, Chiba (Japan)

    2005-07-01

    Coal partial hydropyrolysis process aims at co-production of high yield of light oil such as BTX and naphthalene and synthesis gas from a low rank coal under a mild hydropyrolysis condition. The characteristic of this process is in the two-staged entrained hydropyrolysis reactor composed of the reformer and gasifier. This reactor arrangement gives us high heat efficiency of this process. So far, in order to evaluate the process concept a small-scale basic experiment and a 1 t/day process development unit study were carried out. The experimental results showed that coal volatiles were partially hydrogenated to increase the light oil and hydrocarbon gases at the condition of partial hydropyrolysis such as pressure of 2-3 MPa, temperature of 700-900°C and hydrogen concentration of 30-50%. This process has a possibility of producing efficiently and economically liquid and gas products as chemicals and fuel for power generation. As a further development in the period of 2003 to 2008, a 20 t/day pilot plant study named ECOPRO (efficient co-production with coal flash hydropyrolysis technology) has been started to establish the process technologies for commercialization. 12 refs., 6 figs., 3 tabs.

  7. Development of NPP control room operators's mental workload measurement system using bioelectric signals

    International Nuclear Information System (INIS)

    Shim, Bong Sik; Oh, In Seok; Lee, Hyun Cheol; Cha, Kyung Ho; Lee, Dong Ha

    1996-09-01

    This study developed a mental workload measurement system based on the relations between mental workload and the physiological responses of human operators. The measurement system was composed of a telemetry system for the EEG, EOG, ECG and respiration pattern of the subjects, an A/D converter, and the physiological signal processing programs (written in LabVIEW). The signal processing programs transformed the physiological signals into scores indicating the mental workload status of the subjects and recorded the mental workload scores in the form of a database table. The AcqKnowledge and LabVIEW programs additionally transformed the mental workload score database and the operator behavior database so that both databases were consolidated into one. 94 figs., 57 refs. (Author)

  8. Effect of measurement conditions on three-dimensional roughness values, and development of measurement standard

    International Nuclear Information System (INIS)

    Fabre, A; Brenier, B; Raynaud, S

    2011-01-01

    Friction and corrosion behaviour and the fatigue lifetime of mechanical components are influenced by their boundary and subsurface properties. The surface integrity of mechanical components is studied in order to improve their service behaviour. Roughness is one of the main geometrical properties to be qualified and quantified. Components can be obtained using a complex process: forming, machining and treatment can be combined to realize parts with complex shapes. Three-dimensional roughness is then needed to characterize these parts with complex shapes and textured surfaces. With contact or non-contact measurements (contact stylus, confocal microprobe, interferometer), three-dimensional roughness is quantified by calculating the pertinent parameters defined by the international standard PR EN ISO 25178-2:2008. An analysis will identify the influence of measurement conditions on three-dimensional parameters. The purpose of this study is to analyse the variation of roughness results using contact stylus or optical apparatus. The second aim of this work is to develop a measurement standard well adapted to qualifying contact and non-contact apparatus.

  9. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  10. Batch statistical process control of a fluid bed granulation process using in-line spatial filter velocimetry and product temperature measurements.

    Science.gov (United States)

    Burggraeve, A; Van den Kerkhof, T; Hellings, M; Remon, J P; Vervaet, C; De Beer, T

    2011-04-18

    Fluid bed granulation is a batch process, which is characterized by the processing of raw materials for a predefined period of time, consisting of a fixed spraying phase and a subsequent drying period. The present study shows the multivariate statistical modeling and control of a fluid bed granulation process based on in-line particle size distribution (PSD) measurements (using spatial filter velocimetry) combined with continuous product temperature registration using a partial least squares (PLS) approach. Via the continuous in-line monitoring of the PSD and product temperature during granulation of various reference batches, a statistical batch model was developed allowing the real-time evaluation and acceptance or rejection of future batches. Continuously monitored PSD and product temperature process data of 10 reference batches (X-data) were used to develop a reference batch PLS model, regressing the X-data versus the batch process time (Y-data). Two PLS components captured 98.8% of the variation in the X-data block. Score control charts in which the average batch trajectory and upper and lower control limits are displayed were developed. Next, these control charts were used to monitor 4 new test batches in real-time and to immediately detect any deviations from the expected batch trajectory. By real-time evaluation of new batches using the developed control charts and by computation of contribution plots of deviating process behavior at a certain time point, batch losses or reprocessing can be prevented. Immediately after batch completion, all PSD and product temperature information (i.e., a batch progress fingerprint) was used to estimate some granule properties (density and flowability) at an early stage, which can improve batch release time. Individual PLS models relating the computed scores (X) of the reference PLS model (based on the 10 reference batches) to the density and flowability, respectively, as Y-matrix were developed. The scores of the 4 test
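
    A conceptual sketch of the monitoring approach described above is given below, using synthetic data rather than the authors' granulation measurements: a PLS model regresses in-line measurements (X) on batch time (Y) for reference batches, score control charts are built from the reference score trajectories (mean ± 3 SD per time point), and a new batch is flagged wherever its scores leave the limits.

    # Conceptual batch-monitoring sketch with synthetic data (not the paper's model).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n_ref, n_time, n_vars = 10, 60, 5                 # reference batches, time points, sensors

    time = np.tile(np.arange(n_time, dtype=float), n_ref)     # Y: batch time
    X_ref = np.vstack([
        np.column_stack([np.linspace(0, 1, n_time) ** (k + 1) + 0.02 * rng.standard_normal(n_time)
                         for k in range(n_vars)])
        for _ in range(n_ref)
    ])                                                        # X: stand-ins for PSD/temperature traces

    pls = PLSRegression(n_components=2).fit(X_ref, time.reshape(-1, 1))
    scores_ref = pls.transform(X_ref).reshape(n_ref, n_time, 2)

    # Control limits for the first score, per time point (mean +/- 3 sigma).
    mean_t = scores_ref[:, :, 0].mean(axis=0)
    sd_t = scores_ref[:, :, 0].std(axis=0, ddof=1)
    upper, lower = mean_t + 3 * sd_t, mean_t - 3 * sd_t

    # Monitor a new (test) batch sample by sample; an in-control batch should
    # show few or no deviating time points.
    X_new = np.column_stack([np.linspace(0, 1, n_time) ** (k + 1) + 0.02 * rng.standard_normal(n_time)
                             for k in range(n_vars)])
    scores_new = pls.transform(X_new)[:, 0]
    out_of_control = (scores_new > upper) | (scores_new < lower)
    print("deviating time points:", np.flatnonzero(out_of_control))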

  11. Decision Gate Process for Assessment of a NASA Technology Development Portfolio

    Science.gov (United States)

    Kohli, Rajiv; Fishman, Julianna L.; Hyatt, Mark J.

    2012-01-01

    The NASA Dust Management Project (DMP) was established to provide technologies (to Technology Readiness Level (TRL) 6) required to address adverse effects of lunar dust to humans and to exploration systems and equipment, to reduce life cycle cost and risk, and to increase the probability of sustainable and successful lunar missions. The technology portfolio of DMP consisted of different categories of technologies whose final product was either a technology solution in itself, or one that contributes toward a dust mitigation strategy for a particular application. A Decision Gate Process (DGP) was developed to assess and validate the achievement and priority of the dust mitigation technologies as the technologies progress through the development cycle. The DGP was part of continuous technology assessment and was a critical element of DMP risk management. At the core of the process were technology-specific criteria developed to measure the success of each DMP technology in attaining the technology readiness levels assigned to each decision gate. The DGP accounts for both categories of technologies and qualifies the technology progression from technology development tasks to application areas. The process provided opportunities to validate performance, as well as to identify non-performance in time to adjust resources and direction. This paper describes the overall philosophy of the DGP and the methodology for implementation for DMP, and describes the method for defining the technology evaluation criteria. The process is illustrated by example of an application to a specific DMP technology.

  12. The Model of Measuring the Process and Operating Effectiveness in Enterprises

    Directory of Open Access Journals (Sweden)

    Radosław Ryńca

    2009-12-01

    Full Text Available Process management means that employees are not responsible for the realization of particular functions, but for achieving the specified results of these actions. In the literature, one can find examples of using the process approach in companies. Many firms have already identified their processes and are trying to manage them. That is why it is necessary to use a tool which enables the measurement and assessment of these processes. In the article, the author presents a proposal for measuring process and operating effectiveness.

  13. Key Features of the Manufacturing Vision Development Process

    DEFF Research Database (Denmark)

    Dukovska-Popovska, Iskra; Riis, Jens Ove; Boer, Harry

    2005-01-01

    of action research. The methodology recommends wide participation of people from different hierarchical and functional positions, who engage in a relatively short, playful and creative process and come up with a vision (concept) for the future manufacturing system in the company. Based on three case studies......This paper discusses the key features of the process of Manufacturing Vision Development, a process that enables companies to develop their future manufacturing concept. The basis for the process is a generic five-phase methodology (Riis and Johansen 2003) developed as a result of ten years...... of companies going through the initial phases of the methodology, this research identified the key features of the Manufacturing Vision Development process. The paper elaborates the key features by defining them, discussing how and when they can appear, and how they influence the process....

  14. Itataia project - Development of the process

    International Nuclear Information System (INIS)

    Coelho, S.V.

    1987-01-01

    A process for treating the phosphorus uraniferous ore from the Itataia-CE mine in Brazil was developed, establishing the basic flow chart for the recovery of two products: uranium concentrate and phosphoric acid. The developed process consists of physical concentration, chemical separation and solvent extraction, and it presented, in laboratory and pilot scales, recovery levels which assure the project viability technically and economically. The consolidation of the project and the description of the installations are presented by a documentary film. (M.C.K.) [pt

  15. Itataia project - Development of the process

    International Nuclear Information System (INIS)

    Coelho, S.V.

    1987-01-01

    A process for treating the phosphorous uraniferous ore from the Itataia-CE mine in Brazil was developed, establishing the basic flow chart for the recovery of two products: uranium concentrate and phosphoric acid. The developed process consists of physical concentration, chemical separation and solvent extraction, and it presented, in laboratory and pilot scales, recovery levels which assure the project viability technically and economically. The consolidation of the project and the description of the installations are presented by a documentary film. (M.C.K.) [pt

  16. Advancing implementation science through measure development and evaluation: a study protocol.

    Science.gov (United States)

    Lewis, Cara C; Weiner, Bryan J; Stanick, Cameo; Fischer, Sarah M

    2015-07-22

    Significant gaps related to measurement issues are among the most critical barriers to advancing implementation science. Three issues motivated the study aims: (a) the lack of stakeholder involvement in defining pragmatic measure qualities; (b) the dearth of measures, particularly for implementation outcomes; and (c) unknown psychometric and pragmatic strength of existing measures. Aim 1: Establish a stakeholder-driven operationalization of pragmatic measures and develop reliable, valid rating criteria for assessing the construct. Aim 2: Develop reliable, valid, and pragmatic measures of three critical implementation outcomes, acceptability, appropriateness, and feasibility. Aim 3: Identify Consolidated Framework for Implementation Research and Implementation Outcome Framework-linked measures that demonstrate both psychometric and pragmatic strength. For Aim 1, we will conduct (a) interviews with stakeholder panelists (N = 7) and complete a literature review to populate pragmatic measure construct criteria, (b) Q-sort activities (N = 20) to clarify the internal structure of the definition, (c) Delphi activities (N = 20) to achieve consensus on the dimension priorities, (d) test-retest and inter-rater reliability assessments of the emergent rating system, and (e) known-groups validity testing of the top three prioritized pragmatic criteria. For Aim 2, our systematic development process involves domain delineation, item generation, substantive validity assessment, structural validity assessment, reliability assessment, and predictive validity assessment. We will also assess discriminant validity, known-groups validity, structural invariance, sensitivity to change, and other pragmatic features. For Aim 3, we will refine our established evidence-based assessment (EBA) criteria, extract the relevant data from the literature, rate each measure using the EBA criteria, and summarize the data. The study outputs of each aim are expected to have a positive impact
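
    As one concrete example of the inter-rater reliability assessments mentioned in Aim 1, the sketch below computes Cohen's kappa for two raters assigning categorical codes; the rating data are invented and the statistic is a common choice, not necessarily the one the protocol specifies.

    # Cohen's kappa for two raters on categorical codes (illustrative data only).
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """kappa = (p_observed - p_chance) / (1 - p_chance) for two raters."""
        assert len(rater_a) == len(rater_b) and rater_a
        n = len(rater_a)
        p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        categories = set(rater_a) | set(rater_b)
        p_chance = sum(freq_a[c] / n * freq_b[c] / n for c in categories)
        return (p_obs - p_chance) / (1 - p_chance)

    a = ["pragmatic", "pragmatic", "not", "pragmatic", "not", "pragmatic"]
    b = ["pragmatic", "not",       "not", "pragmatic", "not", "pragmatic"]
    print(round(cohens_kappa(a, b), 2))   # -> 0.67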

  17. About Solar Radiation Intensity Measurements and Data Processing

    Directory of Open Access Journals (Sweden)

    MICH-VANCEA Claudiu

    2012-10-01

    Full Text Available Measuring the intensity of solar radiation is one of the directions of investigation necessary for the implementation of photovoltaic systems in a particular geographical area. This can be done by using specific measuring equipment (pyranometer sensors based on the thermal or photovoltaic principle). In this paper a method is presented for measuring solar radiation (which has two main components - direct radiation and diffuse radiation) with sensors based on the photovoltaic principle. The data are processed for positioning solar panels in order to maximize their efficiency.
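
    The two components mentioned above combine into the global irradiance on a horizontal plane through the standard relation GHI = DNI·cos(θz) + DHI; the sketch below applies it with illustrative values.

    # Global horizontal irradiance from direct-normal and diffuse components.
    import math

    def global_horizontal_irradiance(dni_w_m2, dhi_w_m2, solar_zenith_deg):
        """GHI = DNI * cos(zenith) + DHI, with the beam term clipped at night."""
        cos_z = max(math.cos(math.radians(solar_zenith_deg)), 0.0)
        return dni_w_m2 * cos_z + dhi_w_m2

    print(global_horizontal_irradiance(dni_w_m2=800.0, dhi_w_m2=120.0, solar_zenith_deg=35.0))
    # ~ 775 W/m^2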

  18. The Development of 1Balance: A Connected Medical Device for Measuring Human Balance

    Directory of Open Access Journals (Sweden)

    Heikki Sjöman

    2018-05-01

    Full Text Available Prototyping (iterative loops of design–build–test) is a proven method of efficiently developing new products. Developing products not only quickly, but that are also fit for purpose, implies engaging the end users and iterating the technology at hand. However, there is currently little research on how engineering design can approach developing connected devices. The purpose of this paper is to distinguish and discuss design approaches that are suitable for connected devices. Internet of Things devices consist of both the physical products themselves and the data that is coming out of the products, which we define as the external and internal data, respectively. They both can be prototyped separately, but since the data acquired can influence the design of the device and vice versa, we propose to link these two together in the product development process. This issue becomes more apparent when designing networks of sensors, e.g., for complex artificial intelligence (AI) databases. We explain the principle by describing the development of 1Balance through six different prototypes for human balance measurement. Technologically quantifying balance is an underused approach for objectively evaluating the state of a human's performance. The authors have developed a mobile application for monitoring balance as a physiological signal (amount of sway) via a compact wireless inertial measurement unit (IMU) sensor strapped to the body of the subject for the duration of the measurement. We describe the design process for developing this connected medical device, as well as how the acquired data was used to improve the design of the product. In conclusion, we propose conceptually connecting the external and internal data prototyping loops.
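
    One simple way to turn the IMU signal into an "amount of sway" number, shown below as a sketch rather than the 1Balance algorithm, is the RMS of the detrended horizontal acceleration over the measurement window; the sampling rate and noise levels are invented.

    # Minimal sketch: RMS sway from anteroposterior (AP) and mediolateral (ML)
    # acceleration traces of a trunk-mounted IMU.
    import numpy as np

    def sway_rms(accel_ap, accel_ml):
        """RMS sway from detrended AP and ML acceleration traces."""
        ap = np.asarray(accel_ap, float) - np.mean(accel_ap)   # remove gravity/offset
        ml = np.asarray(accel_ml, float) - np.mean(accel_ml)
        return float(np.sqrt(np.mean(ap ** 2 + ml ** 2)))

    # Hypothetical 30 s recording at 100 Hz.
    rng = np.random.default_rng(1)
    ap = 0.05 * rng.standard_normal(3000)
    ml = 0.04 * rng.standard_normal(3000)
    print(f"sway RMS ~ {sway_rms(ap, ml):.3f} m/s^2")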

  19. Development of a Measuring System Based on LabVIEW for Angular Stiffness of Integrative Flexible Joint

    International Nuclear Information System (INIS)

    Liu, C J; Wan, D A

    2006-01-01

    In order to meet the needs of the development of integrative flexible joints, this paper presents a high-precision measuring system for angular stiffness testing of an integrative flexible joint. The main parts of the system are a PC, a precision motorized goniometric stage, a precision motorized rotary stage and a high-accuracy torque sensor. The measuring and control program is developed on the LabVIEW platform. The measuring system has a theoretical angular resolution of 0.00032 deg. (about 1'') in determining the angular displacement of the joint around its equatorial axis, and a torque accuracy of 0.005 mN · m. The developed program, which presents a friendly GUI, implements data acquisition, data processing and the measuring procedure automatically. In comparison with other measuring devices with similar purposes, this device improves measuring efficiency and accuracy distinctly, while having the advantages of simple configuration, low cost and high stability.
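    Angular stiffness is the slope of the torque-versus-deflection curve recorded by the torque sensor and goniometric stage, so a least-squares fit over the measured pairs yields the stiffness value. The sketch below is illustrative only; the data values are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical torque-deflection data from the goniometric stage and torque sensor:
# joint rotation in radians, reaction torque in N*m.
angle_rad = np.array([0.0, 0.5e-3, 1.0e-3, 1.5e-3, 2.0e-3])
torque_nm = np.array([0.0, 1.1e-3, 2.2e-3, 3.2e-3, 4.3e-3])

# Angular stiffness = slope of the torque-angle curve (linear least-squares fit).
stiffness, intercept = np.polyfit(angle_rad, torque_nm, 1)
print(f"angular stiffness ~ {stiffness:.3f} N*m/rad")
```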

  20. Personal computer interface for temperature measuring in the cutting process with turning

    International Nuclear Information System (INIS)

    Trajchevski, Neven; Filipovski, Velimir; Kuzinonovski, Mikolaj

    2004-01-01

    The development of computer-aided research systems for investigating the characteristics of the surface layer creates conditions for decreasing measurement uncertainty. Especially important is the fact that the use of open, self-made measuring systems meets the demand for total control of the research process. This paper describes an original personal computer interface used in the newly built computer-aided research system for temperature measurement in machining with turning. The interface consists of an optically coupled linear isolation amplifier and an analog-to-digital (A/D) converter. It is designed to measure the thermo-voltage generated by the natural thermocouple formed by the workpiece and the cutting tool. This is achieved by digitizing the thermo-voltage into data that are transmitted to the personal computer. The interface realization is a result of the research activity of the Faculty of Mechanical Engineering and the Faculty of Electrical Engineering in Skopje.

  1. Development of measures to evaluate youth advocacy for obesity prevention

    OpenAIRE

    Millstein, Rachel A.; Woodruff, Susan I.; Linton, Leslie S.; Edwards, Christine C.; Sallis, James F.

    2016-01-01

    Background Youth advocacy has been successfully used in substance use prevention but is a novel strategy in obesity prevention. As a precondition for building an evidence base for youth advocacy for obesity prevention, the present study aimed to develop and evaluate measures of youth advocacy mediator, process, and outcome variables. Methods The Youth Engagement and Action for Health (YEAH!) program (San Diego County, CA) engaged youth and adult group leaders in advocacy for school and neighb...

  2. Performance Measurement of Location Enabled e-Government Processes: A Use Case on Traffic Safety Monitoring

    Science.gov (United States)

    Vandenbroucke, D.; Vancauwenberghe, G.

    2016-12-01

    The European Union Location Framework (EULF), as part of the Interoperable Solutions for European Public Administrations (ISA) Programme of the EU (EC DG DIGIT), aims to enhance the interactions between governments, businesses and citizens by embedding location information into e-Government processes. The challenge remains to find scientifically sound and at the same time practicable approaches to estimate or measure the impact of location enablement of e-Government processes on the performance of the processes. A method has been defined to estimate process performance in terms of variables describing the efficiency, effectiveness, as well as the quality of the output of the work processes. A series of use cases have been identified, corresponding to existing e-Government work processes in which location information could bring added value. In a first step, the processes are described by means of BPMN (Business Process Model and Notation) to better understand the process steps, the actors involved, the spatial data flows, as well as the required input and the generated output. In a second step the processes are assessed in terms of the (sub-optimal) use of location information and the potential enhancement of the process by better integrating location information and services. The process performance is measured ex ante (before using location enabled e-Government services) and ex post (after the integration of such services) in order to estimate and measure the impact of location information. The paper describes the method for performance measurement and highlights how the method is applied to one use case, i.e. the process of traffic safety monitoring. The use case is analysed and assessed in terms of location enablement and its potential impact on process performance. The results of applying the methodology on the use case revealed that performance is highly impacted by factors such as the way location information is collected, managed and shared throughout the

  3. Management of Talent Development Process in Sport

    OpenAIRE

    SEVİMLİ, Dilek

    2015-01-01

    In the development of elite athletes, talent identification and education is a complex and multidimensional process. It is difficult to predict future performance given the increasing technical, tactical, conditioning and psychological demands of a sport. Factors such as children’s developmental stages and levels, gender, athlete development programs, social support, the quality of coaches, and access to equipment and facilities can affect the talent development process. Phases of ...

  4. Classroom processes and positive youth development: conceptualizing, measuring, and improving the capacity of interactions between teachers and students.

    Science.gov (United States)

    Pianta, Robert C; Hamre, Bridget K

    2009-01-01

    The National Research Council's (NRC) statement and description of features of settings that have value for positive youth development have been of great importance in shifting discourse toward creating programs that capitalize on youth motivations toward competence and connections with others. This assets-based approach to promote development is consistent with the Classroom Assessment Scoring System (CLASS) framework for measuring and improving the quality of teacher-student interactions in classroom settings. This chapter highlights the similarities between the CLASS and NRC systems and describes the CLASS as a tool for standardized measurement and improvement of classrooms and their effects on children. It argues that the next important steps to be taken in extending the CLASS and NRC frameworks involve reengineering assessments of teacher and classroom quality and professional development around observations of teachers' performance. This might include using observations in policies regarding teacher quality or a "highly effective teacher" that may emanate from the reauthorization of No Child Left Behind and moving away from a course or workshop mode of professional development to one that ties supports directly to teachers' practices in classroom settings.

  5. Measurement of spatial correlation functions using image processing techniques

    International Nuclear Information System (INIS)

    Berryman, J.G.

    1985-01-01

    A procedure for using digital image processing techniques to measure the spatial correlation functions of composite heterogeneous materials is presented. Methods for eliminating undesirable biases and warping in digitized photographs are discussed. Fourier transform methods and array processor techniques for calculating the spatial correlation functions are treated. By introducing a minimal set of lattice-commensurate triangles, a method of sorting and storing the values of three-point correlation functions in a compact one-dimensional array is developed. Examples are presented at each stage of the analysis using synthetic photographs of cross sections of a model random material (the penetrable sphere model) for which the analytical form of the spatial correlation functions is known. Although results depend somewhat on magnification and on relative volume fraction, it is found that photographs digitized with 512 x 512 pixels generally have sufficiently good statistics for most practical purposes. To illustrate the use of the correlation functions, bounds on conductivity for the penetrable sphere model are calculated with a general numerical scheme developed for treating the singular three-dimensional integrals which must be evaluated.
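    A common way to obtain the two-point correlation function of a digitized two-phase image is an FFT autocorrelation; the sketch below illustrates that general idea (it is not the author's Fourier/array-processor implementation, and the random test image merely stands in for a digitized cross-section).

```python
import numpy as np

def two_point_correlation(binary_image):
    """Two-point correlation S2 of a (periodic) binary image via FFT autocorrelation:
    S2[dx, dy] is the probability that two points separated by (dx, dy) both fall in phase 1."""
    f = np.fft.fft2(binary_image.astype(float))
    autocorr = np.fft.ifft2(f * np.conj(f)).real
    return autocorr / binary_image.size

# Synthetic stand-in for a digitized 512 x 512 cross section (random binary field).
rng = np.random.default_rng(0)
img = (rng.random((512, 512)) < 0.3).astype(int)
s2 = two_point_correlation(img)
print(s2[0, 0])   # equals the area fraction of phase 1
print(s2[0, 5])   # correlation at a lag of 5 pixels along one axis
```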

  6. Optimization of Power Consumption for Centrifugation Process Based on Attenuation Measurements

    Science.gov (United States)

    Salim, M. S.; Abd Malek, M. F.; Sabri, Naseer; Omar, M. Iqbal bin; Mohamed, Latifah; Juni, K. M.

    2013-04-01

    The main objective of this research is to produce a mathematical model that allows the electrical power consumption of the centrifugation process to be decreased based on attenuation measurements. The centrifugation time for the desired separation efficiency may be measured to determine the power consumed by a laboratory centrifuge. Power consumption is one of several parameters that affect system reliability and productivity. Attenuation measurements of a wave propagated through the blood sample during the centrifugation process were used indirectly to measure the power consumption of the device. A mathematical model for power consumption was derived and used to modify the speed profile of the centrifuge controller. The model derived from attenuation measurements successfully reduced the power consumption of the centrifugation process while keeping the separation efficiency high; 18 kW·h per month was saved for 100 daily device operations using the proposed model.

  7. Optimization of Power Consumption for Centrifugation Process Based on Attenuation Measurements

    International Nuclear Information System (INIS)

    Salim, M S; Iqbal bin Omar, M; Malek, M F Abd; Mohamed, Latifah; Sabri, Naseer; Juni, K M

    2013-01-01

    The main objective of this research is to produce a mathematical model that allows the electrical power consumption of the centrifugation process to be decreased based on attenuation measurements. The centrifugation time for the desired separation efficiency may be measured to determine the power consumed by a laboratory centrifuge. Power consumption is one of several parameters that affect system reliability and productivity. Attenuation measurements of a wave propagated through the blood sample during the centrifugation process were used indirectly to measure the power consumption of the device. A mathematical model for power consumption was derived and used to modify the speed profile of the centrifuge controller. The model derived from attenuation measurements successfully reduced the power consumption of the centrifugation process while keeping the separation efficiency high; 18 kW·h per month was saved for 100 daily device operations using the proposed model.

  8. INFLUENCE OF TRAINING PROCESS ON DEVELOPMENT OF EXPLOSIVE STRENGTH OF LEGS AT PIONEER BASKETBALL PLAYERS

    Directory of Open Access Journals (Sweden)

    Vukašin Badža

    2011-03-01

    Full Text Available The research tries to answer the question of whether a training process lasting two years influenced the development of explosive leg strength (one of the basic physical abilities) in pioneer basketball players. The sample in the research consists of basketball players from the „Danubius - Vojvodina Srbijagas” team from Novi Sad. The group counts 20 players who participated in the training process and on whom the measurements were conducted. Data were collected over two years, and measurements were taken three times during this period. The tests used in this research are: long jump, high jump and 20 m run. The obtained data will be processed statistically, using ANOVA for repeated measures and descriptive statistics. The research results will be presented in tables and discussed in the text.

  9. Development of a Scale-up Tool for Pervaporation Processes

    Directory of Open Access Journals (Sweden)

    Holger Thiess

    2018-01-01

    Full Text Available In this study, an engineering tool for the design and optimization of pervaporation processes is developed based on physico-chemical modelling coupled with laboratory/mini-plant experiments. The model incorporates the solution-diffusion mechanism, polarization effects (concentration and temperature), axial dispersion, pressure drop and the temperature drop in the feed channel due to vaporization of the permeating components. The permeance, being the key model parameter, was determined via dehydration experiments on a mini-plant scale for the binary mixtures ethanol/water and ethyl acetate/water. A second set of experimental data was utilized for the validation of the model for two chemical systems. The industrially relevant ternary mixture, ethanol/ethyl acetate/water, was investigated close to its azeotropic point and compared to a simulation conducted with the determined binary permeance data. Experimental and simulation data proved to agree very well for the investigated process conditions. In order to test the scalability of the developed engineering tool, large-scale data from an industrial pervaporation plant used for the dehydration of ethanol were compared to a process simulation conducted with the validated physico-chemical model. Since the membranes employed at both mini-plant and industrial scale were of the same type, the permeance data could be transferred. The comparison of the measured and simulated data proved the scalability of the derived model.
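    The permeance is described above as the key parameter of the solution-diffusion approach. As a hedged illustration of the standard partial-flux expression of that mechanism (the paper's full model additionally covers polarization, dispersion, and pressure and temperature drops), the sketch below uses invented parameter values:

```python
def pervaporation_flux(permeance, x_feed, gamma, p_sat, y_perm, p_perm):
    """Partial flux of one component across a pervaporation membrane using the
    usual solution-diffusion driving force:
        J_i = Q_i * (x_i * gamma_i * p_i_sat - y_i * p_permeate)
    with permeance Q_i in mol/(m^2 s Pa) and pressures in Pa."""
    return permeance * (x_feed * gamma * p_sat - y_perm * p_perm)

# Illustrative numbers only (not the permeances measured in the study):
j_water = pervaporation_flux(permeance=2.0e-6, x_feed=0.10, gamma=2.3,
                             p_sat=12.3e3, y_perm=0.95, p_perm=1.0e3)
print(f"water flux ~ {j_water:.3e} mol/(m^2 s)")
```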

  10. Plasma assisted measurements of alkali metal concentrations in pressurized combustion processes

    Energy Technology Data Exchange (ETDEWEB)

    Hernberg, R.; Haeyrinen, V. [Tampere Univ. of Technology (Finland). Dept. of Physics

    1996-12-01

    The plasma assisted method for continuous measurement of alkali concentrations in product gas flows of pressurized energy processes will be tested and applied at the 1.6 MW PFBC/G facility at Delft University of Technology in the Netherlands. During the reporting period the alkali measuring device has been tested under pressurized conditions at VTT Energy, DMT, Foster-Wheeler Energia and ABB Carbon. Measurements in Delft will be performed during 1996 after installation of the hot gas filter. The original plan for measurements in Delft has been postponed due to schedule delays in Delft. The results are expected to give information about the influence of different process conditions on the generation of alkali vapours, the comparison of different methods for alkali measurement and the specific performance of our system. This will be the first test of the plasma assisted measurement method in a gasification process. The project belongs to the Joule II extension program under contract JOU2-CT93-0431. (author)

  11. Development of a prototype acquisition and data processing system based on FPGA

    International Nuclear Information System (INIS)

    Romero, L; Bellino, P

    2012-01-01

    We present the first stage of the expansion and improvement of a signal acquisition system based on FPGA. This system will acquire and process signals from nuclear detectors working in both pulse and current mode. The aim of this development is to unify all the existing systems for physical measurements in nuclear facilities and reactors (author)

  12. Development of hybrid fluid jet/float polishing process

    Science.gov (United States)

    Beaucamp, Anthony T. H.; Namba, Yoshiharu; Freeman, Richard R.

    2013-09-01

    On one hand, the "float polishing" process consists of a tin lap having many concentric grooves, cut from a flat by single point diamond turning. This lap is rotated above a hydrostatic bearing spindle of high rigidity, damping and rotational accuracy. The optical surface thus floats above a thin layer of abrasive particles. But whilst surface texture can be smoothed to ~0.1nm rms (as measured by atomic force microscopy), this process can only be used on flat surfaces. On the other hand, the CNC "fluid jet polishing" process consists of pumping a mixture of water and abrasive particles to a converging nozzle, thus generating a polishing spot that can be moved along a tool path with tight track spacing. But whilst tool path feed can be moderated to ultra-precisely correct form error on freeform optical surfaces, surface finish improvement is generally limited to ~1.5nm rms (with fine abrasives). This paper reports on the development of a novel finishing method, that combines the advantages of "fluid jet polishing" (i.e. freeform corrective capability) with "float polishing" (i.e. super-smooth surface finish of 0.1nm rms or less). To come up with this new "hybrid" method, computational fluid dynamic modeling of both processes in COMSOL is being used to characterize abrasion conditions and adapt the process parameters of experimental fluid jet polishing equipment, including: (1) geometrical shape of nozzle, (2) position relative to the surface, (3) control of inlet pressure. This new process is aimed at finishing of next generation X-Ray / Gamma Ray focusing optics.

  13. Measurement system of bubbly flow using ultrasonic velocity profile monitor and video data processing unit

    International Nuclear Information System (INIS)

    Aritomi, Masanori; Zhou, Shirong; Nakajima, Makoto; Takeda, Yasushi; Mori, Michitsugu; Yoshioka, Yuzuru.

    1996-01-01

    The authors have been developing a measurement system for bubbly flow in order to clarify its multi-dimensional flow characteristics and to offer a data base to validate numerical codes for multi-dimensional two-phase flow. In this paper, the measurement system combining an ultrasonic velocity profile monitor with a video data processing unit is proposed, which can measure simultaneously velocity profiles in both gas and liquid phases, a void fraction profile for bubbly flow in a channel, and an average bubble diameter and void fraction. Furthermore, the proposed measurement system is applied to measure flow characteristics of a bubbly countercurrent flow in a vertical rectangular channel to verify its capability. (author)

  14. Development and validation of resource flexibility measures for manufacturing industry

    Directory of Open Access Journals (Sweden)

    Gulshan Chauhan

    2014-01-01

    Full Text Available Purpose: Global competition and ever-changing customer demands have forced manufacturing organizations to rapidly adjust to complexities, uncertainties, and changes. Therefore, flexibility in manufacturing resources is necessary to respond cost-effectively and rapidly to changing production needs and requirements. The ability of manufacturing resources to be dynamically reallocated from one stage of a production process to another in response to shifting bottlenecks is recognized as resource flexibility. This paper aims to develop and validate resource flexibility measures for the manufacturing industry that could be used by managers/practitioners in assessing and improving the status of resource flexibility for the optimum utilization of resources. Design/methodology/approach: The study involves a survey carried out in the Indian manufacturing industry using a questionnaire to assess the status of various aspects of resource flexibility and their relationships. A questionnaire was specially designed covering various parameters of resource flexibility. Its reliability was checked by computing Cronbach’s alpha (0.8417). The relative weightage of the various measures was found by using the Analytical Hierarchy Process (AHP). Pearson’s correlation analysis was carried out to find relationships between the various parameters. Findings: From a detailed review of the literature on resource flexibility, 17 measures of resource flexibility and 47 variables were identified. The questionnaire included questions on all these measures and parameters. ‘Ability of machines to perform a diverse set of operations’ and ‘ability of workers to work on different machines’ emerged as important measures, with contributing weightages of 20.19% and 17.58% respectively. All the measures were found to be significantly correlated with overall resource flexibility except ‘training of workers’, as shown by Pearson’s coefficient of correlation. This indicates that
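    The relative weightages reported above come from the Analytical Hierarchy Process, which derives priority weights from a pairwise-comparison matrix. A minimal sketch of that step follows; the 3 x 3 matrix is hypothetical and much smaller than the 17-measure comparison used in the study.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix:
    the principal eigenvector, normalised to sum to 1."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    weights = np.abs(principal)
    return weights / weights.sum()

# Hypothetical comparison of three flexibility measures on Saaty's 1-9 scale.
A = np.array([[1.0, 2.0, 3.0],
              [1/2, 1.0, 2.0],
              [1/3, 1/2, 1.0]])
print(ahp_weights(A))   # e.g. ~[0.54, 0.30, 0.16]
```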

  15. Integrating Usability Evaluations into the Software Development Process

    DEFF Research Database (Denmark)

    Lizano, Fulvio

    This thesis addresses the integration of usability evaluations into the software development process. The integration here is contextualized in terms of how to include usability evaluation as an activity in the software development lifecycle. Even though usability evaluations are considered as relevant and strategic human–computer interaction (HCI) activities in the software development process, there are obstacles that limit the complete, effective and efficient integration of this kind of testing into the software development process. Two main obstacles are the cost of usability evaluations and the software developers' resistance to accepting users’ opinions regarding the lack of usability in their software systems. The ‘cost obstacle’ refers to the constraint of conducting usability evaluations in the software process due to the significant amount of resources required by this type of testing.

  16. Brownian Motion as a Limit to Physical Measuring Processes

    DEFF Research Database (Denmark)

    Niss, Martin

    2016-01-01

    In this paper, we examine the history of the idea that noise presents a fundamental limit to physical measuring processes. This idea had its origins in research aimed at improving the accuracy of instruments for electrical measurements. Out of these endeavors, the Swedish physicist Gustaf A. Ising...

  17. Organizational Development: Values, Process, and Technology.

    Science.gov (United States)

    Margulies, Newton; Raia, Anthony P.

    The current state-of-the-art of organizational development is the focus of this book. The five parts into which the book is divided are as follows: Part One--Introduction (Organizational Development in Perspective--the nature, values, process, and technology of organizational development); Part Two--The Components of Organizational Developments…

  18. Measuring Child Development and Learning

    Science.gov (United States)

    Raikes, Abbie

    2017-01-01

    The Sustainable Development Goal's "Education 2030" agenda includes an explicit focus on early childhood development. Target 4.2 states that all children are "developmentally on track" at the start of school. What does it mean for a child to be developmentally on track, and how should it be measured, especially in an…

  19. Happy software developers solve problems better: psychological measurements in empirical software engineering.

    Science.gov (United States)

    Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states-emotions and moods-deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint.

  20. Process development and fabrication for sphere-pac fuel rods

    International Nuclear Information System (INIS)

    Welty, R.K.; Campbell, M.H.

    1981-06-01

    Uranium fuel rods containing sphere-pac fuel have been fabricated for in-reactor tests and demonstrations. A process for the development, qualification, and fabrication of acceptable sphere-pac fuel rods is described. Special equipment to control fuel contamination by moisture or air and the equipment layout needed for rod fabrication are described, and tests for assuring the uniformity of the fuel column are discussed. Fuel retainers required for sphere-pac fuel column stability and the instrumentation to measure fuel column smear density are described. Results of sphere-pac fuel rod fabrication campaigns are reviewed and recommended improvements for high-throughput production are noted.

  1. Development of modified FT (MFT) process

    Energy Technology Data Exchange (ETDEWEB)

    Jinglai Zhou; Zhixin Zhang; Wenjie Shen [Institute of Coal Chemistry, Taiyuan (China)] [and others]

    1995-12-31

    A two-stage Modified FT (MFT) process has been developed for producing high-octane gasoline from coal-based syngas. The main R&D is focused on the development of catalysts and process technologies. Duration tests were completed in a single-tube reactor, a pilot plant (100 T/Y), and an industrial demonstration plant (2000 T/Y). A series of satisfactory results has been obtained in terms of operating reliability of the equipment, performance of the catalysts, purification of the coal-based syngas, optimum operating conditions, properties of the gasoline, economics, etc. Further scale-up to a commercial plant is being considered.

  2. Evaluation of Heat Flux Measurement as a New Process Analytical Technology Monitoring Tool in Freeze Drying.

    Science.gov (United States)

    Vollrath, Ilona; Pauli, Victoria; Friess, Wolfgang; Freitag, Angelika; Hawe, Andrea; Winter, Gerhard

    2017-05-01

    This study investigates the suitability of heat flux measurement as a new technique for monitoring product temperature and critical end points during freeze drying. The heat flux sensor is tightly mounted on the shelf and measures non-invasively (no contact with the product) the heat transferred from shelf to vial. Heat flux data were compared to comparative pressure measurement, thermocouple readings, and Karl Fischer titration as current state of the art monitoring techniques. The whole freeze drying process including freezing (both by ramp freezing and controlled nucleation) and primary and secondary drying was considered. We found that direct measurement of the transferred heat enables more insights into thermodynamics of the freezing process. Furthermore, a vial heat transfer coefficient can be calculated from heat flux data, which ultimately provides a non-invasive method to monitor product temperature throughout primary drying. The end point of primary drying determined by heat flux measurements was in accordance with the one defined by thermocouples. During secondary drying, heat flux measurements could not indicate the progress of drying as monitoring the residual moisture content. In conclusion, heat flux measurements are a promising new non-invasive tool for lyophilization process monitoring and development using energy transfer as a control parameter. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
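    The vial heat transfer coefficient mentioned above is commonly defined as the measured heat flux divided by the shelf-to-product temperature difference; the sketch below uses that textbook definition with invented numbers, not the paper's data or its exact formulation.

```python
def vial_heat_transfer_coefficient(heat_flux, t_shelf, t_product):
    """Kv [W/(m^2 K)] from a measured heat flux q [W/m^2] and the
    shelf-to-product temperature difference [K]."""
    return heat_flux / (t_shelf - t_product)

# Illustrative primary-drying values (assumed, not taken from the study):
print(vial_heat_transfer_coefficient(heat_flux=120.0, t_shelf=-5.0, t_product=-32.0))
```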

  3. Measuring and explaining house price developments

    NARCIS (Netherlands)

    De Vries, P.

    2010-01-01

    This study discusses ways of measuring and explaining the development of house prices. The goal of the research underpinning this dissertation was to develop a methodological framework for studying these developments. This framework relates, first, to correcting for changes in the composition of

  4. Application of non-destructive liner thickness measurement technique for manufacturing and inspection process of zirconium lined cladding tube

    International Nuclear Information System (INIS)

    Nakazawa, Norio; Fukuda, Akihiro; Fujii, Noritsugu; Inoue, Koichi

    1986-01-01

    Recently, in order to cope with differences in electric power demand arising from the electric power situation, large-scale load-following operation has become necessary. Therefore, the development of cladding tubes which withstand power variation has been carried out and, as a result, zirconium-lined Zircaloy-2 cladding tubes have been developed. In order to reduce the sensitivity to stress corrosion cracking, these zirconium-lined cladding tubes require a uniform liner thickness over the whole surface and whole length. Kobe Steel Ltd. developed a non-destructive liner thickness measuring technique based on ultrasonic flaw detection and eddy current flaw detection techniques. This equipment was applied to the manufacturing and inspection processes of the zirconium-lined cladding tubes, and has demonstrated its superiority in the control and assurance of the liner thickness of products. Zirconium-lined cladding tubes, the development of the measuring technique for guaranteeing uniform liner thickness, and liner thickness control in the manufacturing and inspection processes are described. (Kako, I.)

  5. Contribution to the development of a multi-mode measurement system for dynamic neutronic measurements and processing of the related uncertainties

    International Nuclear Information System (INIS)

    Geslot, B.

    2006-11-01

    It is difficult to estimate integral reactor parameters, especially reactivity, in deeply subcritical cores. Indeed, the standard neutronic methods have been designed for near-critical reactivity levels and often need a critical reference. This thesis contributes to the research on ADS (Accelerator Driven Systems), for which the multiplication coefficient would be about 0.95. The first part of the thesis deals with the development of the X-MODE system, a flexible measurement system dedicated to experiments in neutronics. X-MODE is capable of acquiring logical signals, particularly in time-stamping mode, as well as analogue signals. The second part of the thesis presents a statistical study of the methods used to analyse flux transients. Many methods exist to analyse flux transients and some are little known. Means to estimate the characteristics of reactivity estimators are provided, methods are compared and recommendations are made. Finally, the dynamic measurements of the TRADE program are analysed and discussed. During this program, three subcritical configurations were explored. It appears that pulsed neutron source experiments give reactivity estimations that are much more precise than those obtained from flux transients. (author)

  6. Development of Flexible Software Process Lines with Variability Operations

    DEFF Research Database (Denmark)

    Schramm, Joachim; Dohrmann, Patrick; Kuhrmann, Marco

    2015-01-01

    Context: Software processes evolve over time and several approaches were proposed to support the required flexibility. Yet, little is known whether these approaches sufficiently support the development of large software processes. A software process line helps to systematically develop and manage families of processes and, as part of this, variability operations provide means to modify and reuse pre-defined process assets. Objective: Our goal is to evaluate the feasibility of variability operations to support the development of flexible software process lines. Method: We conducted a longitudinal ...

  7. Developing a Workflow Composite Score to Measure Clinical Information Logistics. A Top-down Approach.

    Science.gov (United States)

    Liebe, J D; Hübner, U; Straede, M C; Thye, J

    2015-01-01

    Availability and usage of individual IT applications have been studied intensively in the past years. Recently, IT support of clinical processes is attaining increasing attention. The underlying construct that describes the IT support of clinical workflows is clinical information logistics. This construct needs to be better understood, operationalised and measured. It is therefore the aim of this study to propose and develop a workflow composite score (WCS) for measuring clinical information logistics and to examine its quality based on reliability and validity analyses. We largely followed the procedural model of MacKenzie and colleagues (2011) for defining and conceptualising the construct domain, for developing the measurement instrument, assessing the content validity, pretesting the instrument, specifying the model, capturing the data and computing the WCS and testing the reliability and validity. Clinical information logistics was decomposed into the descriptors data and information, function, integration and distribution, which embraced the framework validated by an analysis of the international literature. This framework was refined selecting representative clinical processes. We chose ward rounds, pre- and post-surgery processes and discharge as sample processes that served as concrete instances for the measurements. They are sufficiently complex, represent core clinical processes and involve different professions, departments and settings. The score was computed on the basis of data from 183 hospitals of different size, ownership, location and teaching status. Testing the reliability and validity yielded encouraging results: the reliability was high with r(split-half) = 0.89, the WCS discriminated between groups; the WCS correlated significantly and moderately with two EHR models and the WCS received good evaluation results by a sample of chief information officers (n = 67). These findings suggest the further utilisation of the WCS. As the WCS does not

  8. Understanding and Managing Process Interaction in IS Development Projects

    DEFF Research Database (Denmark)

    Bygstad, Bendik; Nielsen, Peter Axel

    2005-01-01

    Increasingly, information systems must be developed and implemented as a part of business change. This is a challenge for the IS project manager, since business change and information systems development usually are performed as separate processes. Thus, there is a need to understand and manage the relationship between these two kinds of processes. To understand the interaction between information systems development and planned organisational change we introduce the concept of process interaction. We draw on a longitudinal case study of an IS development project that used an iterative and incremental approach. We also argue that iterative software engineering frameworks are well structured to support process interaction. Finally, we advocate that the IS project manager needs to manage socio-technical innovation in a situation where the organisational change process and the IS development process are parallel but incongruent.

  9. Process-Based Quality (PBQ) Tools Development

    Energy Technology Data Exchange (ETDEWEB)

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts.

  10. DEVELOPMENT OF AN INSTRUMENT TO MEASURE UNIVERSITY STUDENTS' ATTITUDE TOWARDS E-LEARNING

    Directory of Open Access Journals (Sweden)

    Vandana MEHRA

    2012-01-01

    Full Text Available The study of students’ attitudes towards e-learning can in many ways help managers better prepare for the future of e-learning. This article describes the process of developing an instrument to measure university students’ attitudes towards e-learning. The scale was administered to 200 university students from two countries (India and Iran). The 83-item attitude-towards-e-learning scale was developed on six domains: perceived usefulness; intention to adopt e-learning; ease of e-learning use; technical and pedagogical support; e-learning stressors; and pressure to use e-learning.

  11. Interviewing to develop Patient-Reported Outcome (PRO) measures for clinical research: eliciting patients’ experience

    Science.gov (United States)

    2014-01-01

    Patient-reported outcome (PRO) measures must provide evidence that their development followed a rigorous process for ensuring their content validity. To this end, the collection of data is performed through qualitative interviews that allow for the elicitation of in-depth spontaneous reports of the patients’ experiences with their condition and/or its treatment. This paper provides a review of qualitative research applied to PRO measure development. A clear definition of what is a qualitative research interview is given as well as information about the form and content of qualitative interviews required for developing PRO measures. Particular attention is paid to the description of interviewing approaches (e.g., semi-structured and in-depth interviews, individual vs. focus group interviews). Information about how to get prepared for a qualitative interview is provided with the description of how to develop discussion guides for exploratory or cognitive interviews. Interviewing patients to obtain knowledge regarding their illness experience requires interpersonal and communication skills to facilitate patients’ expression. Those skills are described in details, as well as the skills needed to facilitate focus groups and to interview children, adolescents and the elderly. Special attention is also given to quality assurance and interview training. The paper ends on ethical considerations since interviewing for the development of PROs is performed in a context of illness and vulnerability. Therefore, it is all the more important that, in addition to soliciting informed consent, respectful interactions be ensured throughout the interview process. PMID:24499454

  12. Development of flow velocity measurement techniques in visible images. Improvement of particle image velocimetry techniques on image process

    International Nuclear Information System (INIS)

    Kimura, Nobuyuki; Nishimura, Motohiko; Kamide, Hideki; Hishida, Koichi

    1999-10-01

    A noise reduction system was developed to improve the applicability of Particle Image Velocimetry (PIV) to flows bounded by complicated geometries. For fast reactor safety and thermal hydraulic studies, experiments are performed in scale models which usually have rather complicated geometry and structures such as fuel subassemblies, heat exchangers, etc. The structures and dust stuck on the view window of the models obscure the particle images. Thus, everything in the image except the moving particles can be regarded as noise. In the present study, two noise reduction techniques are proposed. The first is the Time-averaged Light Intensity Subtraction method (TIS), which subtracts the time-averaged light intensity of each pixel in the sequential images from each corresponding pixel. The second is the Minimum Light Intensity Subtraction method (MIS), which subtracts the minimum light intensity of each pixel in the sequential images from each corresponding pixel. Both methods are examined for their noise reduction capabilities. An image generated from a Large Eddy Simulation was used as the original 'benchmark' image, and noise was added to the benchmark image to produce the sample images. Both methods reduce the rate of vectors with an error of more than one pixel from 90% to less than 5%. Also, more than 50% of the vectors have an error of less than 0.2 pixel. The uncertainty analysis shows that these methods enhance the accuracy of the vector measurement by a factor of 3 to 12 when images with noise are processed, and that the MIS method is 1.1 to 2.1 times more accurate than the TIS method. Thus, the present noise reduction methods are quite effective in enhancing the accuracy of flow velocity fields measured from particle images that include structures and deposits on the view window. (author)
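    The TIS and MIS methods described above reduce to per-pixel background subtraction over the image sequence; a compact NumPy sketch of both operations follows (the array shapes and random test data are assumptions).

```python
import numpy as np

def tis_filter(frames):
    """Time-averaged Light Intensity Subtraction (TIS): subtract the per-pixel mean
    over the sequence so that stationary structures and deposits cancel out."""
    background = frames.mean(axis=0)
    return np.clip(frames - background, 0, None)

def mis_filter(frames):
    """Minimum Light Intensity Subtraction (MIS): subtract the per-pixel minimum
    over the sequence."""
    background = frames.min(axis=0)
    return frames - background

# frames: (n_frames, height, width) array of grey-level particle images.
rng = np.random.default_rng(1)
frames = rng.integers(0, 255, size=(50, 64, 64)).astype(float)
print(tis_filter(frames).shape, mis_filter(frames).shape)
```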

  13. Software process improvement: controlling developers, managers or users?

    DEFF Research Database (Denmark)

    Nørbjerg, Jacob

    1999-01-01

    The paper discusses how the latest trend in the management of software development, software process improvement (SPI), may affect user-developer relations. At the outset, SPI concerns the "internal workings" of software organisations, but it may also be interpreted as one way to give the developer organisation more control over the development process and the relations with the user organisation.

  14. Measurements of scattering processes in negative ion-atom collisions

    International Nuclear Information System (INIS)

    Kvale, T.J.

    1991-01-01

    This research project is designed to provide measurements of various scattering processes which occur in H - collisions with atomic targets at intermediate energies. The immediate goal is to study elastic scattering, single electron detachment, and target excitation/ionization in H - scattering from noble gas targets. For the target inelastic processes, these cross sections are unknown both experimentally and theoretically. The present measurements will provide either experimentally-determined cross sections or set upper limits to those cross sections. In either case, these measurements will be stringent tests of our understanding in energetic negative ion collisions. This series of experiments required the construction of a new facility, and significant progress toward its operation has been realized during this period. The proposed research is described in this report. The progress on and the status of the apparatus is also detailed in this report

  15. Development of 3D online contact measurement system for intelligent manufacturing based on stereo vision

    Science.gov (United States)

    Li, Peng; Chong, Wenyan; Ma, Yongjun

    2017-10-01

    In order to avoid the shortcomings of low efficiency and restricted measuring range that exist in the traditional 3D on-line contact measurement method for workpiece size, the development of a novel 3D contact measurement system is introduced, designed for intelligent manufacturing based on stereo vision. The developed contact measurement system is characterized by the integrated use of a handy probe, a binocular stereo vision system, and advanced measurement software. The handy probe consists of six track markers, a touch probe and the associated electronics. In the process of contact measurement, the handy probe is located by means of the stereo vision system and the track markers, and the 3D coordinates of a point on the workpiece are measured by calculating the tip position of the touch probe. Owing to the flexibility of the handy probe, the orientation, range and density of the 3D contact measurement can be adapted to different needs. Applications of the developed contact measurement system to high-precision measurement and rapid surface digitization are experimentally demonstrated.
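    Locating the probe markers with a calibrated binocular system amounts to triangulating each marker from its two image positions. The sketch below shows a generic linear (DLT) triangulation with a hypothetical calibrated camera pair; it is not the described system's proprietary algorithm.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from its pixel coordinates x1, x2
    in two calibrated cameras with 3x4 projection matrices P1, P2."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                      # homogeneous -> Euclidean

# Hypothetical calibrated pair: one camera at the origin, one shifted along x.
K = np.diag([800.0, 800.0, 1.0])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
marker = np.array([0.05, 0.02, 1.0, 1.0])    # a marker about 1 m in front of the cameras
x1 = (P1 @ marker)[:2] / (P1 @ marker)[2]
x2 = (P2 @ marker)[:2] / (P2 @ marker)[2]
print(triangulate(P1, P2, x1, x2))           # ~ [0.05, 0.02, 1.0]
```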

  16. Development of a Survey to Assess Local Health Department Organizational Processes and Infrastructure for Supporting Obesity Prevention.

    Science.gov (United States)

    Xiao, Ting; Stamatakis, Katherine A; McVay, Allese B

    Local health departments (LHDs) have an important function in controlling the growing epidemic of obesity in the United States. Data are needed to gain insight into the existence of routine functions and structures of LHDs that support and sustain obesity prevention efforts. The purpose of this study was to develop and examine the reliability of measures to assess foundational LHD organizational processes and functions specific to obesity prevention. Survey measures were developed using a stratified, random sample of US LHDs to assess supportive organizational processes and infrastructure for obesity prevention representing different domains. Data were analyzed using weighted κ and intraclass correlation coefficient for assessing test-retest reliability. Most items and summary indices in the majority of survey domains had moderate/substantial or almost perfect reliability. The overall findings support this survey instrument to be a reliable measurement tool for a large number of processes and functions that comprise obesity prevention-related capacity in LHDs.

  17. A case study on measuring process quality : lessons learned

    NARCIS (Netherlands)

    Dikici, A.; Türetken, O.; Demirors, O.; Cortellessa, V.; Muccini, H.; Demirors, O.

    2012-01-01

    Requiring solid engineering disciplines and best practices, rather than human talent alone, for developing complex software systems results in an increasing interest in software processes. The quality of software processes has considerable influence over the success of an organization. Process quality

  18. Development of the Fischer-Tropsch Process: From the Reaction Concept to the Process Book

    Directory of Open Access Journals (Sweden)

    Boyer C.

    2016-05-01

    Full Text Available The development by IFP Energies nouvelles (IFPEN), ENI and Axens of a Fischer-Tropsch process is described. This development is based on upstream process studies to choose the process scheme, reactor technology and operating conditions, and on downstream work to summarize all development results in a process guide. A large amount of work was devoted to the catalyst performance on the one hand and to the scale-up of the slurry bubble reactor with dedicated complementary tools on the other. Finally, an original approach was implemented to validate both the process and the catalyst on an industrial scale by combining a 20 bpd unit in ENI’s Sannazzaro refinery with cold mock-ups equivalent to 20 and 1 000 bpd at IFPEN and a special “Large Validation Tool” (LVT) which reproduces the combined effect of chemical reaction condition stress and mechanical stress equivalent to a 15 000 bpd industrial unit. Dedicated analytical techniques and a dedicated model were developed to simulate the whole process (reactor and separation train), integrating a high level of complexity and phenomena coupling, in order to scale up the process on a robust, reliable basis to an industrial scale.

  19. Development of Precise Point Positioning Method Using Global Positioning System Measurements

    Directory of Open Access Journals (Sweden)

    Byung-Kyu Choi

    2011-09-01

    Full Text Available Precise point positioning (PPP) is increasingly used in several areas, such as monitoring of crustal movement and maintaining an international terrestrial reference frame, using global positioning system (GPS) measurements. The accuracy of PPP data processing has increased due to the use of more precise satellite orbit/clock products. In this study we developed a PPP algorithm that utilizes data collected by a GPS receiver. Measurement error modelling, including the tropospheric error and the tidal model, was considered in the data processing to improve the positioning accuracy. An extended Kalman filter has been employed to estimate the state parameters such as the positioning information and float ambiguities. For verification, we compared our results to those of an International GNSS Service analysis center. As a result, the mean errors of the estimated position in the East-West, North-South and Up-Down directions over the five days were 0.9 cm, 0.32 cm, and 1.14 cm at the 95% confidence level.
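    The study estimates position and float ambiguities with an extended Kalman filter. A generic, minimal EKF cycle is sketched below to show the structure of such an estimator; the state layout, the models and the toy demonstration are assumptions, not the authors' observation model.

```python
import numpy as np

def ekf_step(x, P, z, h, H, F, Q, R):
    """One generic extended Kalman filter cycle (predict + update).
    x, P : state estimate and covariance (e.g. position, clock, ambiguities)
    z    : measurement vector (e.g. ionosphere-free code/phase observations)
    h, H : measurement model h(x) and its Jacobian at the predicted state
    F, Q : state transition matrix and process noise covariance
    R    : measurement noise covariance."""
    x_pred = F @ x                          # predict
    P_pred = F @ P @ F.T + Q
    y = z - h(x_pred)                       # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy demonstration: one random-walk state observed directly.
x, P = np.array([0.0]), np.eye(1)
F, Q, R, H = np.eye(1), np.eye(1) * 1e-3, np.eye(1) * 0.25, np.eye(1)
x, P = ekf_step(x, P, z=np.array([1.2]), h=lambda s: s, H=H, F=F, Q=Q, R=R)
print(x, P)
```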

  20. Computer processing of the Δlambda/lambda measured results

    International Nuclear Information System (INIS)

    Draguniene, V.J.; Makariuniene, E.K.

    1979-01-01

    For processing the experimental data on the influence of the chemical environment on radioactive decay constants, five programs have been written in Fortran, in the version for the DUBNA monitoring system on the BESM-6 computer. Each program corresponds to a definite stage of data processing and yields a definite answer. The first and second programs calculate the ratio of the pulse numbers measured with different sources and the mean value of the dispersions. The third program averages the ratios of the pulse numbers. The fourth and fifth determine the change of the radioactive decay constant. The created programs permit the processing of the experimental data beginning from the values of the pulse numbers obtained directly in the experiments. The programs allow a file of experimental results to be treated and various errors to be calculated at all stages of the calculations. The printout of the obtained results is convenient for usage.
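    The first processing stage is the ratio of pulse numbers measured with different sources. A minimal sketch of such a ratio with a Poisson counting-statistics error estimate follows; the Poisson assumption and the count totals are illustrative, not taken from the original Fortran programs.

```python
import math

def count_ratio(n1, n2):
    """Ratio of the pulse numbers from two sources and its statistical uncertainty,
    assuming independent Poisson counting errors sqrt(N) on each total."""
    r = n1 / n2
    sigma = r * math.sqrt(1.0 / n1 + 1.0 / n2)
    return r, sigma

r, s = count_ratio(1_204_500, 1_198_700)   # illustrative count totals
print(f"ratio = {r:.5f} +/- {s:.5f}")
```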

  1. Development of an on-line analyzer for organic phase uranium concentration in extraction process

    International Nuclear Information System (INIS)

    Dong Yanwu; Song Yufen; Zhu Yaokun; Cong Peiyuan; Cui Songru

    1998-10-01

    The working principle, constitution and performance of an on-line analyzer, together with the development characteristics of the immersion sonde, the data processing system and the examination standard, are reported. The performance of this instrument is reliable. For an identical sample, the signal fluctuation in continuous monitoring over four months is less than ±1%. Within the required measurement range, by choosing an appropriate length of the sample cell, the precision of measurement is better than 1% at a uranium concentration of 100 g/L. The detection limit is (50 ± 10) mg/L. The uranium concentration in the process stream can be automatically displayed and printed out in real time, and a 4-20 mA current signal proportional to the uranium concentration can be output, so continuous control and computer management of the extraction process can be achieved
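    The analyzer reports concentration both on a display and as a 4-20 mA loop current proportional to the uranium concentration. The sketch below shows only the generic linear scaling of such a current loop; the 200 g/L span is a hypothetical value, since the real span depends on the chosen sample-cell length.

```python
def current_to_concentration(i_ma, span_g_per_l=200.0):
    """Map a 4-20 mA loop current to a uranium concentration [g/L], assuming a
    linear scale from 0 g/L at 4 mA to `span_g_per_l` at 20 mA."""
    if not 4.0 <= i_ma <= 20.0:
        raise ValueError("loop current outside the 4-20 mA range")
    return (i_ma - 4.0) / 16.0 * span_g_per_l

print(current_to_concentration(12.0))   # mid-scale -> 100.0 g/L with a 200 g/L span
```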

  2. The Optimization of the Local Public Policies’ Development Process Through Modeling And Simulation

    Directory of Open Access Journals (Sweden)

    Minodora URSĂCESCU

    2012-06-01

    Full Text Available The development of local public policies in Romania is carried out empirically; strategic management practices in this domain are not based on a scientific instrument capable of anticipating and evaluating the results of implementing a local public policy in a needs-policies-effects logic. Starting from this motivation, the purpose of the paper is to reconceptualize the public policy process on the functioning principles of dynamic systems with feedback, by means of mathematical modeling and simulation techniques. Therefore, the research is oriented towards developing an optimization method for the local public policy development process, using mathematical modeling and simulation techniques as instruments. The research's main results are, on the one hand, the generation of a new concept of the local public policy process and, on the other hand, the proposal of a conceptual model for a complex software product which will permit the parameterized modeling of the policy development process in a virtual environment. The purpose of the software product is to model and simulate each type of local public policy, taking into account the respective policy's characteristics as well as the values of the parameters of its application environment at a certain moment.

  3. Development of material measures for performance verifying surface topography measuring instruments

    International Nuclear Information System (INIS)

    Leach, Richard; Giusca, Claudiu; Rickens, Kai; Riemer, Oltmann; Rubert, Paul

    2014-01-01

    The development of two irregular-geometry material measures for performance verifying surface topography measuring instruments is described. The material measures are designed to be used to performance verify tactile and optical areal surface topography measuring instruments. The manufacture of the material measures using diamond turning followed by nickel electroforming is described in detail. Measurement results are then obtained using a traceable stylus instrument and a commercial coherence scanning interferometer, and the results are shown to agree to within the measurement uncertainties. The material measures are now commercially available as part of a suite of material measures aimed at the calibration and performance verification of areal surface topography measuring instruments

  4. An improved algorithm of image processing technique for film thickness measurement in a horizontal stratified gas-liquid two-phase flow

    Energy Technology Data Exchange (ETDEWEB)

    Kuntoro, Hadiyan Yusuf, E-mail: hadiyan.y.kuntoro@mail.ugm.ac.id; Majid, Akmal Irfan; Deendarlianto, E-mail: deendarlianto@ugm.ac.id [Center for Energy Studies, Gadjah Mada University, Sekip K-1A Kampus UGM, Yogyakarta 55281 (Indonesia); Department of Mechanical and Industrial Engineering, Faculty of Engineering, Gadjah Mada University, Jalan Grafika 2, Yogyakarta 55281 (Indonesia); Hudaya, Akhmad Zidni; Dinaryanto, Okto [Department of Mechanical and Industrial Engineering, Faculty of Engineering, Gadjah Mada University, Jalan Grafika 2, Yogyakarta 55281 (Indonesia)

    2016-06-03

    Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed various methods and techniques to study two-phase flow phenomena in industrial cases, such as in the chemical, petroleum and nuclear industries. One of the developing methods and techniques is the image processing technique. This technique is widely used in two-phase flow research due to its non-intrusive capability to process a large amount of visualization data containing many complexities. Moreover, this technique allows direct visual information about the flow to be captured, which is difficult to obtain by other methods and techniques. The main objective of this paper is to present an improved algorithm of the image processing technique, derived from a preceding algorithm, for stratified flow cases. The present algorithm can measure the film thickness (h_L) of stratified flow as well as the geometrical properties of the interfacial waves with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also aimed at developing a high-quality database of stratified flow, which is scanty. In the present work, the measurement results had a satisfactory agreement with previous works.
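    The core of such an algorithm is turning each grey-level frame into a liquid film thickness h_L per image column. The sketch below is a deliberately simplified stand-in (plain thresholding, liquid assumed at the bottom of the frame, synthetic data); the published algorithm is more elaborate.

```python
import numpy as np

def film_thickness_profile(gray_frame, threshold, mm_per_pixel):
    """Estimate the liquid film thickness h_L along the pipe axis from one grey-level
    frame of a horizontal stratified flow. Pixels darker than `threshold` count as
    liquid, and the liquid layer is assumed to sit at the bottom of the image."""
    liquid = gray_frame < threshold                 # boolean liquid mask
    # Count contiguous liquid pixels from the bottom up in every column.
    thickness_px = liquid[::-1].cumprod(axis=0).sum(axis=0)
    return thickness_px * mm_per_pixel              # h_L per column [mm]

rng = np.random.default_rng(2)
frame = rng.integers(0, 255, size=(240, 320)).astype(float)
frame[-60:, :] = 20.0                               # synthetic dark liquid layer
print(film_thickness_profile(frame, threshold=60, mm_per_pixel=0.1).mean())
```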

  5. External quality measurements reveal internal processes

    NARCIS (Netherlands)

    Schouten, R.E.; Costa, J.M.; Kooten, van O.

    2003-01-01

    With the present developments in CA technology it becomes possible to fine tune the storage conditions to the specific needs of the product. This generates the need to know the exact quality conditions of the product before storage starts. By measuring the initial quality we can determine these

  6. Development of process maps for plasma spray: case study for molybdenum

    International Nuclear Information System (INIS)

    Sampath, S.; Jiang, X.; Kulkarni, A.; Matejicek, J.; Gilmore, D.L.; Neiser, R.A.

    2003-01-01

A schematic representation referred to as 'process maps' examines the role of process variables on the properties of plasma-sprayed coatings. Process maps have been developed for air plasma spraying of molybdenum. Experimental work was done to investigate the importance of such spray parameters as gun current, primary gas flow, auxiliary gas flow, and powder carrier gas flow. In-flight particle temperatures and velocities were measured and diameters estimated in various areas of the spray plume. Empirical models were developed relating the input parameters to the in-flight particle characteristics. Molybdenum splats and coatings were produced at three distinct process conditions identified from the first-order process map experiments. In addition, substrate surface temperature during deposition was treated as a variable. Within the tested range, modulus, hardness and thermal conductivity increase with particle velocity, while oxygen content and porosity decrease. Increasing substrate deposition temperature resulted in a dramatic improvement in coating thermal conductivity and modulus, while simultaneously increasing coating oxide content. Indentation reveals improved fracture resistance for the coatings prepared at higher substrate temperature. Residual stress was significantly affected by substrate temperature, although not to a great extent by particle conditions within the investigated parameter range. Coatings prepared at high substrate temperature with high-energy particles suffered considerably less damage in a wear test. The mechanisms behind these changes are discussed within the context of relational maps, which have been proposed

  7. Kinematic analysis of in situ measurement during chemical mechanical planarization process

    Energy Technology Data Exchange (ETDEWEB)

    Li, Hongkai; Wang, Tongqing; Zhao, Qian; Meng, Yonggang; Lu, Xinchun, E-mail: xclu@tsinghua.edu.cn [State Key Laboratory of Tribology, Tsinghua University, Beijing 100084 (China)

    2015-10-15

Chemical mechanical planarization (CMP) is currently the most widely used planarization technique in semiconductor manufacturing. With the aid of in situ measurement technology, CMP tools can achieve good performance and stable productivity. However, in situ measurement has remained unexplored from a kinematic standpoint, and the resources available for such a kinematic analysis are very limited owing to its complexity and to trade secrecy. In this paper, a comprehensive kinematic analysis of in situ measurement is provided, including the analysis model, the measurement trajectory, and the measurement time of each zone of the wafer surface during the practical CMP process. In addition, extensive numerical calculations are performed to study the influence of the main parameters on the measurement trajectory and on the variation of the probe's measurement velocity during the measurement process. These efforts are expected to improve the in situ measurement system and promote the advancement of CMP control systems.
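
    The authors' kinematic model is not reproduced in the abstract; the sketch below only illustrates the generic geometry usually assumed in such an analysis, with a probe embedded in the rotating platen traced in the frame of a wafer spinning about an offset centre (all symbols and numerical values are hypothetical).

    ```python
    # Sketch of an in situ probe trajectory in the wafer frame during CMP.
    # Assumed geometry (not from the paper): the probe sits in the platen at
    # radius r_probe from the platen centre; the wafer centre is offset by d
    # from the platen centre; platen and wafer rotate at w_p and w_w (rad/s).
    import numpy as np

    def probe_trajectory(r_probe, d, w_p, w_w, t):
        # Probe position in the lab frame (platen rotation about the origin).
        x_lab = r_probe * np.cos(w_p * t)
        y_lab = r_probe * np.sin(w_p * t)
        # Position relative to the wafer centre, assumed fixed at (d, 0).
        x_rel, y_rel = x_lab - d, y_lab
        # Rotate into the wafer frame (wafer spins at w_w about its own centre).
        c, s = np.cos(-w_w * t), np.sin(-w_w * t)
        return c * x_rel - s * y_rel, s * x_rel + c * y_rel

    t = np.linspace(0.0, 60.0, 6000)              # one minute, 10 ms steps
    x_w, y_w = probe_trajectory(0.15, 0.15, 2 * np.pi, 1.8 * np.pi, t)
    in_wafer = np.hypot(x_w, y_w) <= 0.15         # samples falling on a 300 mm wafer
    ```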

  8. Initial development of the Psychopathic Processing and Personality Assessment (PAPA) across populations.

    Science.gov (United States)

    Lewis, Michael; Ireland, Jane L; Abbott, Janice; Ireland, Carol A

Three studies describe development of the Psychopathic Processing and Personality Assessment (PAPA). Study one outlines a literature review and Expert Delphi (n=32) to develop the initial PAPA. Study two validates the PAPA with 431 participants (121 male prisoners and 310 university students: 154 men, 156 women), also using the Levenson Self Report Psychopathy scale and a measure of cognitive schema and affect. Study three refined the PAPA, employing it with 50 male students and 40 male forensic psychiatric patients using clinical (interview) assessments of psychopathy: the Psychopathy Checklist - Screening Version and the Affect, Cognitive and Lifestyle assessment. The PAPA comprised four factors: dissocial tendencies, emotional detachment, disregard for others, and lack of sensitivity to emotion. It positively correlated with existing psychopathy measures. Variations across PAPA subscales were noted across samples when associated with clinical measures of psychopathy. Support for the validity of the PAPA was indicated across samples. Directions for research and application are outlined. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Development of an NPP control room operators' mental workload measurement system using bioelectric signals

    Energy Technology Data Exchange (ETDEWEB)

    Shim, Bong Sik; Oh, In Seok; Lee, Hyun Cheol; Cha, Kyung Ho [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Lee, Dong Ha [Suwon Univ., Suwon (Korea, Republic of)

    1996-09-01

This study developed a mental workload measurement system based on the relations between mental workload and the physiological responses of human operators. The measurement system was composed of a telemetry system for the EEG, EOG, ECG and respiration pattern of the subjects, an A/D converter, and physiological signal processing programs (implemented in LabVIEW). The signal processing programs transformed the physiological signals into scores indicating the mental workload status of the subjects and recorded the workload scores as tables in a database. The AcqKnowledge and LabVIEW programs additionally transformed the workload score database and the operator behavior database so that both databases were consolidated into one. 94 figs., 57 refs. (Author).

  10. Electrocortical measures of information processing biases in social anxiety disorder: A review.

    Science.gov (United States)

    Harrewijn, Anita; Schmidt, Louis A; Westenberg, P Michiel; Tang, Alva; van der Molen, Melle J W

    2017-10-01

Social anxiety disorder (SAD) is characterized by information processing biases; however, their underlying neural mechanisms remain poorly understood. The goal of this review was to give a comprehensive overview of the most frequently studied EEG spectral and event-related potential (ERP) measures in social anxiety during rest, anticipation, stimulus processing, and recovery. A Web of Science search yielded 35 studies reporting on electrocortical measures in individuals with social anxiety or related constructs. Social anxiety was related to increased delta-beta cross-frequency correlation during anticipation and recovery, and information processing biases during early processing of faces (P1) and errors (error-related negativity). These electrocortical measures are discussed in relation to the persistent cycle of information processing biases maintaining SAD. Future research should further investigate the mechanisms of this persistent cycle and study the utility of electrocortical measures in early detection, prevention, treatment and endophenotype research. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Developments of quad channel pulse height analyzer for radon/thoron measurement

    International Nuclear Information System (INIS)

    Ashokkumar, P.; Raman, Anand; Babu, D.A.R.; Sharma, D.N.; Topkar, Anita; Mayya, Y.S.

    2011-01-01

Radon and thoron are naturally occurring radioactive noble gases, exposure to which has a linear relationship to lung cancer risk. This paper describes the development of an automated radon/thoron measurement system using an indigenously developed silicon PIN diode. The system employs an 8051-core Silicon Labs microcontroller (C8051F340) integrated with an LCD display, hex keypad and non-volatile flash memory, besides I/O ports interfaced with humidity-temperature sensors and an air sampling pump. Air is sampled through a dehumidifier using a software-controlled DC pump. The positively charged progeny atoms are electrostatically collected on the detector surface and the deposited radioactivity is assessed by an alpha pulse-height discrimination technique. The ionization charges produced by the interaction of alpha particles in the depletion region of the diode, which is reverse biased at 40 V, are collected and measured. The measurement circuit uses a charge-sensitive preamplifier built around a low-noise op-amp IC. The pulses are further processed through a spectroscopy amplifier to obtain distinct pulse-height levels for four of the alpha-emitting progenies of Rn and Tn, namely 210Po, 214Po, 216Po and 212Po. These signals are input to the quad channel analyzer, which provides four individual TTL pulses corresponding to the four nuclides mentioned above. The analyzer outputs are processed by the microcontroller module to obtain the Rn/Tn concentration in Bq/m3. This portable system stores one week of hourly individual channel data along with the corresponding Rn/Tn concentrations, temperature and humidity, which can be transferred to a PC. Preliminary studies have indicated that a sensitivity as low as 0.50 cph per Bq/m3 can be achieved with this system. (author)
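
    As a rough illustration only (not the instrument's firmware), pulse-height discrimination into four nuclide windows and conversion of an hourly count to a concentration could look like the sketch below; the window bounds are invented placeholders, while the 0.50 cph per Bq/m3 sensitivity is the figure quoted in the abstract.

    ```python
    # Sketch of quad-window pulse-height discrimination and concentration readout.
    # WINDOWS bounds (ADC channels) are illustrative placeholders only.
    import numpy as np

    WINDOWS = {                 # nuclide -> (low, high) ADC channel window
        "Po-210": (300, 360),
        "Po-214": (420, 480),
        "Po-216": (500, 560),
        "Po-212": (620, 700),
    }

    def counts_per_window(pulse_heights):
        """pulse_heights: 1-D array of digitized pulse amplitudes for one hour."""
        return {n: int(np.sum((pulse_heights >= lo) & (pulse_heights < hi)))
                for n, (lo, hi) in WINDOWS.items()}

    def concentration_bq_m3(counts_per_hour, sensitivity_cph_per_bq_m3=0.50):
        # Sensitivity taken from the abstract (0.50 cph per Bq/m3).
        return counts_per_hour / sensitivity_cph_per_bq_m3
    ```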

  12. The Development of a Flexible Measuring System for Muscle Volume Using Ultrasonography

    Science.gov (United States)

    Fukumoto, Kiyotaka; Fukuda, Osamu; Tsubai, Masayoshi; Muraki, Satoshi

Quantification of muscle volume can be used to estimate muscle strength. The measuring process does not require muscular contractions from the subject, so it is completely safe and particularly suited to elderly people. We have therefore developed a flexible measuring system for muscle volume using ultrasonography. In this system, an ultrasound probe is installed on a link mechanism and continuously scans fragmental images along the surface of the body. These images are then measured and composed into a wide-area cross-sectional image based on the spatial compounding method. The flexibility of the link mechanism enables the operator to acquire images in any body posture and at any body site. The spatial compounding method significantly reduces speckle and artifact noise in the composed cross-sectional image, so that the operator can observe individual muscles, such as the rectus femoris and vastus intermedius, in detail. We conducted experiments to examine the advantages of the developed system. The experimental results showed high accuracy of the measuring position calculated using the link mechanism and demonstrated the noise reduction effect of the spatial compounding method. Finally, we confirmed high correlations between MRI images and those of the developed system, verifying its validity.
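
    As a minimal sketch of the spatial-compounding step only (not the authors' implementation), frames that have already been registered onto a common grid, for example using the link-mechanism pose, are simply averaged; uncorrelated speckle is suppressed while anatomy that is consistent across frames is preserved.

    ```python
    # Minimal sketch of spatial compounding: average co-registered ultrasound
    # frames on a common grid. Registration itself (assumed to come from the
    # link-mechanism pose) is outside the scope of this sketch.
    import numpy as np

    def spatial_compound(registered_frames):
        """registered_frames: iterable of 2-D arrays on a common grid,
        with np.nan where a frame does not cover the grid."""
        stack = np.stack(list(registered_frames), axis=0)
        return np.nanmean(stack, axis=0)   # average wherever data exist
    ```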

  13. Assessing Therapist Competence: Development of a Performance-Based Measure and Its Comparison With a Web-Based Measure.

    Science.gov (United States)

    Cooper, Zafra; Doll, Helen; Bailey-Straebler, Suzanne; Bohn, Kristin; de Vries, Dian; Murphy, Rebecca; O'Connor, Marianne E; Fairburn, Christopher G

    2017-10-31

Recent research interest in how best to train therapists to deliver psychological treatments has highlighted the need for rigorous, but scalable, means of measuring therapist competence. There are at least two components involved in assessing therapist competence: the assessment of their knowledge of the treatment concerned, including how and when to use its strategies and procedures, and an evaluation of their ability to apply such knowledge skillfully in practice. While the assessment of therapists' knowledge has the potential to be completed efficiently on the Web, the assessment of skill has generally involved a labor-intensive process carried out by clinicians, and as such, may not be suitable for assessing training outcome in certain circumstances. The aims of this study were to develop and evaluate a role-play-based measure of skill suitable for assessing training outcome and to compare its performance with a highly scalable Web-based measure of applied knowledge. Using enhanced cognitive behavioral therapy (CBT-E) for eating disorders as an exemplar, clinical scenarios for role-play assessment were developed and piloted together with a rating scheme for assessing trainee therapists' performance. These scenarios were evaluated by examining the performance of 93 therapists from different professional backgrounds and at different levels of training in implementing CBT-E. These therapists also completed a previously developed Web-based measure of applied knowledge, and the ability of the Web-based measure to efficiently predict competence on the role-play measure was investigated. The role-play measure assessed performance at implementing a range of CBT-E procedures. The majority of the therapists rated their performance as moderately or closely resembling their usual clinical performance. Trained raters were able to achieve good-to-excellent reliability for averaged competence, with intraclass correlation coefficients ranging from .653 to .909. The measure was

  14. A unified inversion scheme to process multifrequency measurements of various dispersive electromagnetic properties

    Science.gov (United States)

    Han, Y.; Misra, S.

    2018-04-01

Multi-frequency measurements of a dispersive electromagnetic (EM) property, such as electrical conductivity, dielectric permittivity, or magnetic permeability, are commonly analyzed for purposes of material characterization. Such an analysis requires inversion of the multi-frequency measurement based on a specific relaxation model, such as the Cole-Cole model or Pelton's model. We develop a unified inversion scheme that can be coupled to various types of relaxation models to independently process multi-frequency measurements of varied EM properties for improved EM-based geomaterial characterization. The proposed inversion scheme is first tested on a few synthetic cases in which different relaxation models are coupled into the scheme, and is then applied to multi-frequency complex conductivity, complex resistivity, complex permittivity, and complex impedance measurements. The method estimates up to seven relaxation-model parameters, exhibiting convergence and accuracy for random initializations of the relaxation-model parameters within up to three orders of magnitude of variation around the true parameter values. The proposed inversion method implements a bounded Levenberg algorithm with tuned initial values of the damping parameter and its iterative adjustment factor, which are fixed in all the cases shown in this paper irrespective of the type of measured EM property and the type of relaxation model. Notably, a jump-out step and a jump-back-in step are implemented as automated methods in the inversion scheme to prevent the inversion from getting trapped around local minima and to honor the physical bounds of the model parameters. The proposed inversion scheme can easily be used to process various types of EM measurements without major changes.
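
    The authors' bounded Levenberg scheme with jump-out/jump-back-in steps is not reproduced here; purely as a sketch of the underlying inversion idea, the following fits a Cole-Cole relaxation model to multi-frequency complex-permittivity data using SciPy's bounded least-squares solver (starting values and bounds are illustrative only).

    ```python
    # Illustrative bounded fit of a Cole-Cole relaxation model to multi-frequency
    # complex permittivity data (a sketch, not the paper's algorithm).
    import numpy as np
    from scipy.optimize import least_squares

    def cole_cole(omega, eps_inf, d_eps, tau, alpha):
        return eps_inf + d_eps / (1.0 + (1j * omega * tau) ** (1.0 - alpha))

    def residuals(p, omega, data):
        model = cole_cole(omega, *p)
        # Stack real and imaginary parts so the solver sees real residuals.
        return np.concatenate([(model - data).real, (model - data).imag])

    def invert(omega, data, p0=(5.0, 20.0, 1e-6, 0.1)):
        # Physical bounds on (eps_inf, delta_eps, tau, alpha); placeholders.
        bounds = ([1.0, 0.0, 1e-12, 0.0], [100.0, 1e4, 1.0, 1.0])
        fit = least_squares(residuals, p0, args=(omega, data), bounds=bounds)
        return fit.x   # estimated eps_inf, delta_eps, tau, alpha
    ```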

  15. A Software Development Simulation Model of a Spiral Process

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.
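
    The paper's simulation model itself is not reproduced in the abstract; as a toy illustration of how a Monte Carlo view of a spiral process can yield cost and schedule distributions, the sketch below draws phase durations from triangular distributions (the phase set and every number are invented for illustration).

    ```python
    # Toy Monte Carlo sketch of a spiral-style schedule/cost estimate: each spiral
    # cycle runs four phases whose durations are drawn from triangular
    # distributions. All phases and parameters are illustrative placeholders.
    import random

    PHASES = {  # phase -> (min, most likely, max) duration in weeks
        "planning":      (1, 2, 4),
        "risk_analysis": (1, 3, 6),
        "engineering":   (4, 8, 16),
        "evaluation":    (1, 2, 3),
    }

    def simulate_project(n_cycles=4, cost_per_week=10_000.0):
        weeks = 0.0
        for _ in range(n_cycles):
            for lo, mode, hi in PHASES.values():
                weeks += random.triangular(lo, hi, mode)
        return weeks, weeks * cost_per_week

    runs = [simulate_project() for _ in range(10_000)]
    mean_weeks = sum(w for w, _ in runs) / len(runs)
    ```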

  16. Development of digital photogrammetry for measurements of displacements in underground excavation

    International Nuclear Information System (INIS)

    Ohnishi, Yuzo; Ohtsu, Hiroyasu; Nishiyama, Satoshi; Ono, Tetsu; Matsui, Hiroya

    2002-03-01

Because deformations are important indicators of the degree of stability during the construction of rock structures, monitoring of deformation is a key element in the construction of tunnels and of structures for the underground research laboratory. Especially in the construction and maintenance of underground excavations, deformation monitoring is needed to obtain useful information for controlling their stability. We have been developing the application of digital photogrammetry to monitoring techniques for rock structures. The photogrammetric process has undergone a remarkable evolution with its transformation into digital photogrammetry. Photogrammetry has the advantage of measuring the deformation of an object from a small number of photographs, with simple measurement procedures and excellent cost performance. In this paper, we show that digital photogrammetry can accurately monitor the displacements of an underground excavation and offers a capability for real-time measurement. (author)

  17. Modified SPC for short run test and measurement process in multi-stations

    Science.gov (United States)

    Koh, C. K.; Chin, J. F.; Kamaruddin, S.

    2018-03-01

Due to short production runs and the measurement error inherent in electronic test and measurement (T&M) processes, continuous quality monitoring through real-time statistical process control (SPC) is challenging. Industry practice allows the installation of a guard band based on measurement uncertainty to reduce the width of the acceptance limits, as an indirect way to compensate for measurement errors. This paper presents a new SPC model combining a modified guard band with control charts (Z-bar chart and W chart) for short runs in T&M processes at multiple stations. The proposed model standardizes the observed value with the measurement target (T) and rationed measurement uncertainty (U). An S-factor (S_f) is introduced into the control limits to improve the sensitivity in detecting small shifts. The model was embedded in an automated quality control system and verified with a case study in real industry.
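
    The paper's exact formulas are not reproduced in the abstract; the sketch below shows only one plausible reading of the standardization step, in which each observation is reduced to a dimensionless value using its target T and an uncertainty-based scale U, with limits tightened by an S-factor. Every formula and value here is an assumption for illustration.

    ```python
    # Hedged sketch of a short-run standardization step and tightened limits.
    # The actual model in the paper may differ; this is illustrative only.
    import numpy as np

    def standardize(x, target, uncertainty):
        """Reduce raw observations to dimensionless values using target T
        and an uncertainty-based scale U (assumed formulation)."""
        return (np.asarray(x, dtype=float) - target) / uncertainty

    def control_limits(s_factor=0.8, k=3.0):
        # Tighter-than-usual limits: +/- k * s_factor around a centreline of 0.
        return -k * s_factor, +k * s_factor

    z = standardize([10.03, 9.98, 10.07], target=10.0, uncertainty=0.05)
    lcl, ucl = control_limits()
    out_of_control = (z < lcl) | (z > ucl)
    ```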

  18. Development of graphene process control by industrial optical spectroscopy setup

    Science.gov (United States)

    Fursenko, O.; Lukosius, M.; Lupina, G.; Bauer, J.; Villringer, C.; Mai, A.

    2017-06-01

The successful integration of graphene into microelectronic devices depends strongly on the availability of fast and nondestructive methods for characterizing graphene grown by CVD on large-diameter production wafers [1-3], which are of interest to the semiconductor industry. Here, a high-throughput optical metrology method for measuring the thickness and uniformity of large-area graphene sheets is demonstrated. The method is based on the combination of spectroscopic ellipsometry and normal-incidence reflectometry in the UV-Vis wavelength range (200-800 nm) with small light spots (30 μm²), realized in a wafer optical metrology tool. In the first step, graphene layers were transferred onto a SiO2/Si substrate in order to determine the optical constants of graphene by the combination of multi-angle ellipsometry and reflectometry. These data were then used to develop a process control recipe for CVD graphene on 200 mm Ge(100)/Si(100) wafers. The graphene layer quality was additionally monitored by Raman spectroscopy, and atomic force microscopy measurements were performed for micro-topography evaluation. As a result, a robust recipe was developed for unambiguous thickness monitoring of all components of the multilayer film stack, including graphene, surface residuals or the interface layer underneath the graphene, and surface roughness. Optical monitoring of graphene thickness uniformity over a wafer has shown excellent long-term stability (s=0.004 nm) regardless of the growth of interfacial GeO2 and surface roughness. The sensitivity of the optical identification of graphene during microelectronic processing was evaluated. This optical metrology technique with combined data collection is a fast and highly precise method allowing unambiguous detection of graphene after transfer as well as after the CVD deposition process on a Ge(100)/Si(100) wafer. The approach is well suited to industrial applications due to its repeatability and flexibility.

  19. System for measurements and data processing in neutron physics researches

    International Nuclear Information System (INIS)

    Kadashevich, V.I.; Kondurov, I.A.; Nikolaev, S.N.; Ryabov, Yu.F.

    1976-01-01

A system of measuring and computing facilities created for the automation of studies in the field of neutron physics is discussed. Within this system, each experiment is provided with its own measuring station, which consists of a set of analog and digital modules implemented in accordance with the CAMAC standard. At the higher level of the system there are measuring-computing centres (MCC) which simultaneously serve a number of physical installations. These MCCs are based on Minsk-22 computers, whose computational facilities are used for preliminary processing and for the creation of temporary data archives. In turn, all the MCCs are users of a time-sharing system based on Minsk-32 computers. This system extends the users' possibilities for fast data processing and archive creation, and provides transfer of the required information to the main computing system based on the BESM-6 computer. Transfer of information and preliminary processing are performed from remote terminals with the help of a special directive language

  20. 75 FR 29513 - Developing a Supplemental Poverty Measure

    Science.gov (United States)

    2010-05-26

... Supplemental Poverty Measure AGENCY: Bureau of the Census, Department of Commerce. ACTION: Notice and... comments on the approach to developing a Supplemental Poverty Measure (SPM) presented in a report entitled "Observations from the Interagency Technical Working Group on Developing a Supplemental Poverty Measure," which...

  1. Working memory training improves reading processes in typically developing children.

    Science.gov (United States)

    Loosli, Sandra V; Buschkuehl, Martin; Perrig, Walter J; Jaeggi, Susanne M

    2012-01-01

    The goal of this study was to investigate whether a brief cognitive training intervention results in a specific performance increase in the trained task, and whether there are transfer effects to other nontrained measures. A computerized, adaptive working memory intervention was conducted with 9- to 11-year-old typically developing children. The children considerably improved their performance in the trained working memory task. Additionally, compared to a matched control group, the experimental group significantly enhanced their reading performance after training, providing further evidence for shared processes between working memory and reading.

  2. Processes of Personality Development in Adulthood: The TESSERA Framework.

    Science.gov (United States)

    Wrzus, Cornelia; Roberts, Brent W

    2017-08-01

    The current article presents a theoretical framework of the short- and long-term processes underlying personality development throughout adulthood. The newly developed TESSERA framework posits that long-term personality development occurs due to repeated short-term, situational processes. These short-term processes can be generalized as recursive sequence of Triggering situations, Expectancy, States/State expressions, and Reactions (TESSERA). Reflective and associative processes on TESSERA sequences can lead to personality development (i.e., continuity and lasting changes in explicit and implicit personality characteristics and behavioral patterns). We illustrate how the TESSERA framework facilitates a more comprehensive understanding of normative and differential personality development at various ages during the life span. The TESSERA framework extends previous theories by explicitly linking short- and long-term processes of personality development, by addressing different manifestations of personality, and by being applicable to different personality characteristics, for example, behavioral traits, motivational orientations, or life narratives.

  3. Development of a 2D temperature measurement technique for combustion diagnostics using 2-line atomic fluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Engstroem, Johan

    2001-01-01

The present thesis is concerned with the development and application of a novel planar laser-induced fluorescence (PLIF) technique for temperature measurements in a variety of combusting flows. Accurate measurement of temperature is an essential task in combustion diagnostics, since temperature is one of the most fundamental quantities for the characterization of combustion processes. The technique is based on two-line atomic fluorescence (TLAF) from small quantities of atomic indium (In) seeded into the fuel. It has been developed from small-scale experiments in laboratory flames to the point where practical combustion systems can be studied. The technique is conceptually simple and reveals temperature information in the post-flame regions. The viability of the technique has been tested in three extreme measurement situations: in spark ignition engine combustion, in ultra-lean combustion situations such as lean-burning aero-engine concepts and, finally, in fuel-rich combustion. TLAF was successfully applied in an optical SI engine using isooctane as fuel. The wide temperature sensitivity, 700 - 3000 K, of the technique using indium atoms allowed measurements over the entire combustion cycle in the engine to be performed. In applications in lean combustion, a potential problem caused by the strong oxidation processes of indium atoms was encountered. This limits measurement times due to deposits of absorbing indium oxide on measurement windows. The seeding requirement is a disadvantage of the technique and can be a limitation in some applications. The results from experiments performed in sooting flames are very promising for thermometry measurements in such environments. Absorption by hydrocarbons and other native species was found to be negligible. Since low laser energies and low seeding concentrations could be used, the technique did not, unlike most other incoherent optical thermometry techniques, suffer interferences from LII of soot particles or LIF from PAH
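
    The abstract does not reproduce the working relation; schematically, TLAF thermometry rests on the Boltzmann population ratio of two low-lying indium levels separated by an energy gap ΔE, so that a temperature can be inferred from the two fluorescence signals S1 and S2 obtained when each level is excited in turn:

    ```latex
    T \;=\; \frac{\Delta E / k_{\mathrm{B}}}{\ln\!\left( C \, \dfrac{S_{1}}{S_{2}} \right)}
    ```

    where C is a calibration constant lumping line strengths, level degeneracies, laser irradiances and detection efficiencies. This is the generic two-line Boltzmann form, not necessarily the exact expression used in the thesis.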

  4. Development of an atmospheric 214Bi measuring instrument

    International Nuclear Information System (INIS)

    1975-01-01

Part of the radiation environment encountered during airborne gamma ray surveys is produced by 214Bi existing in the atmosphere. The 214Bi atmospheric concentration changes with time and location, and should be measured to process the acquired data correctly. Three methods of atmospheric 214Bi measurement are evaluated in this work. These are: (1) an 11.5 in. diameter x 4 in. thick NaI(Tl) crystal shielded from ground radiation, (2) a negatively charged wire to collect radioactive ions, and (3) a high volume air sampler collecting particulate matter on filter paper. The shielded detector and filter paper methods yield good results, with the shielded detector producing a factor of about 10 times higher counting rate. The charged wire method gave very low counting rates, with the shielded detector counting rates about a factor of 100 times higher, and the results did not correlate with the 214Bi atmospheric concentration as determined by the other two methods. The theory necessary to understand the collection and decay of the airborne radioactivity using the charged wire and filter paper methods is developed

  5. Some Comments on Quasi-Birth-and-Death Processes and Matrix Measures

    Directory of Open Access Journals (Sweden)

    Holger Dette

    2010-01-01

Full Text Available We explore the relation between matrix measures and quasi-birth-and-death processes. We derive an integral representation of the transition function in terms of a matrix-valued spectral measure and corresponding orthogonal matrix polynomials. We characterize several stochastic properties of quasi-birth-and-death processes by means of this matrix measure and illustrate the theoretical results by several examples.
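
    Schematically, the kind of Karlin-McGregor-type representation involved reads as follows (notation simplified; the precise statement and conditions are in the paper):

    ```latex
    \left(P^{\,n}\right)_{ij}
    \;=\;
    \left( \int x^{n}\, Q_i(x)\, d\Sigma(x)\, Q_j(x)^{\mathsf T} \right)
    \left( \int Q_j(x)\, d\Sigma(x)\, Q_j(x)^{\mathsf T} \right)^{-1}
    ```

    where P is the block-tridiagonal transition matrix of the quasi-birth-and-death process, the Q_i are the orthogonal matrix polynomials and Σ is the matrix-valued spectral measure.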

  6. Development of a method for measuring femoral torsion using real-time ultrasound

    International Nuclear Information System (INIS)

    Hafiz, Eliza; Hiller, Claire E; Nightingale, E Jean; Eisenhuth, John P; Refshauge, Kathryn M; Nicholson, Leslie L; Clarke, Jillian L; Grimaldi, Alison

    2014-01-01

Excessive femoral torsion has been associated with various musculoskeletal and neurological problems. To explore this relationship, it is essential to be able to measure femoral torsion in the clinic accurately. Computerized tomography (CT) and magnetic resonance imaging (MRI) are thought to provide the most accurate measurements, but CT involves significant radiation exposure and MRI is expensive. The aim of this study was to design a method for measuring femoral torsion in the clinic, and to determine the reliability of this method. Details of the design process, including the construction of a jig, the protocol developed and the reliability of the method, are presented. The protocol developed used ultrasound to image a ridge on the greater trochanter, and a customized jig placed on the femoral condyles as reference points. An inclinometer attached to the customized jig allowed quantification of the degree of femoral torsion. Measurements taken with this protocol had excellent intra- and inter-rater reliability (ICC(2,1) = 0.98 and 0.97, respectively). The method also permitted measurement of femoral torsion with a high degree of accuracy. It is applicable to the research setting and, with minor adjustments, will be applicable to the clinical setting. (paper)

  7. Exploring Approaches How to Measure a Lean Process

    Directory of Open Access Journals (Sweden)

    Österman Christer

    2014-08-01

Full Text Available Purpose: The purpose of the research is to explore a practical method of measuring the implementation of lean in a process. The method is based on examining the abilities of a group. At this scale, the ability to work in a standardized way and to solve problems is important. These two abilities are dependent on each other and are fundamental to the group's ability to create a stable result. In this context, standardized work (SW) is defined as the methods used in a process to generate stable results, and problem solving (PS) is defined as the methods used to return a process to a condition where SW is possible.

  8. Development of nondestructive measurement system for quantifying radioactivity from crud, liquids and gases in a contaminated pipe

    Energy Technology Data Exchange (ETDEWEB)

    Katagiri, Masaki; Ito, Hirokuni; Wakayama, Naoaki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1992-11-01

A nondestructive measuring method was developed to separately quantify the radioisotope concentrations of crud, liquids and gases in a contaminated pipe. To apply this method to practical in-situ measurement, a nondestructive measurement system was developed. The system consists of in-situ equipment for gamma-ray scanning measurements and data-processing equipment for the analysis of radioactivity. Communication between the two units is performed by a wireless telemetry device. To construct the measurement system, a practical gas-cooled Ge detector, small-sized electronic circuits, a fast and reliable telemetry device and automatic measurement techniques using a computer were developed. Performance tests confirmed that the measurement system is effective for in-situ measurements of radioactivity in a contaminated pipe. The measurement accuracy of the system is 10 - 20 %, which was determined by comparison with solid and liquid radioisotope concentrations in a mock-up contaminated pipe that had been quantified in advance. (author).

  9. Instrumental development and data processing

    International Nuclear Information System (INIS)

    Franzen, J.

    1978-01-01

    A review of recent developments in mass spectrometry instrumentation is presented under the following headings: introduction (scope of mass spectrometry compared with neighbouring fields); ion sources and ionization techniques; spectrometers (instrumental developments); measuring procedures; coupling techniques; data systems; conclusions (that mass spectrometry should have a broader basis and that there would be mutual profit from a better penetration of mass spectrometry into fields of routine application). (U.K.)

  10. Processing horizontal networks measured by integrated terrestrial and GPS technologies

    Directory of Open Access Journals (Sweden)

    Vincent Jakub

    2003-09-01

Full Text Available Local horizontal networks in which both GPS and terrestrial (TER) measurements are made are often established at present. In other networks, previous terrestrial measurements can be supplemented with quantities from contemporary GPS observations (tunnel nets, mining nets with surface and underground parts, and other elongated nets). This paper presents the processing of such heterogeneous (GPS, TER) networks whose terrestrial measurements are performed as point coordinate measurements (∆X, ∆Y) using a geodetic total station. In such network structures the following are available: the values ∆X, ∆Y from TER observations, which are transformed into the plane of S-JTSK for adjustment; and the values ∆X, ∆Y in the S-JTSK plane, which can be obtained by 3D transformation of WGS84 network point coordinates from GPS observations to the corresponding S-JTSK coordinates. For the common adjustment of all the ∆X, ∆Y values, some elements of the network geometry (e.g. distances) should be measured by both methods (GPS, TER). This approach makes an effective homogenisation of both network parts possible, i.e. the influence of local realizations of the S-JTSK frame on the whole network can be substantially reduced. Results of network processing obtained in the proposed manner are acceptable in general and are equivalent (in accuracy and reliability) to the results of other processing methods.
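
    As an illustration of the 3D transformation step mentioned above (not the paper's actual computation), a 7-parameter Helmert similarity transformation in its common small-angle form can map GPS-derived geocentric coordinates towards a local frame; the parameter values shown are placeholders that would in practice be estimated from points measured by both methods.

    ```python
    # Sketch of a 7-parameter (Helmert) similarity transformation, using one
    # common sign convention (coordinate-frame rotation); conventions differ.
    import numpy as np

    def helmert_3d(xyz, tx, ty, tz, rx, ry, rz, scale_ppm):
        """xyz: (N, 3) array of geocentric coordinates; rotations in radians
        (small-angle form); scale in parts per million."""
        R = np.array([[1.0,  rz, -ry],
                      [-rz, 1.0,  rx],
                      [ ry, -rx, 1.0]])        # small-angle rotation matrix
        m = 1.0 + scale_ppm * 1e-6
        return np.array([tx, ty, tz]) + m * (np.asarray(xyz) @ R.T)

    # Placeholder parameters, e.g. estimated from common (GPS, TER) points.
    local = helmert_3d(np.array([[4001234.0, 1234567.0, 4789012.0]]),
                       tx=485.0, ty=169.0, tz=483.0,
                       rx=2.4e-5, ry=7.3e-6, rz=2.5e-5, scale_ppm=5.0)
    ```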

  11. Development of a National M and O Contractor Work Prioritisation Process and its Use as a Progress Measure for Nuclear Clean Up in the United Kingdom

    International Nuclear Information System (INIS)

    Waite, R.; Hudson, I.D.; Wareing, M.I.

    2006-01-01

    In July 2004, Her Majesty's Government established a Nuclear Decommissioning Authority (NDA) to assume responsibility for the discharge of the vast majority of the United Kingdom's public sector civil nuclear liabilities. The Energy Act of 2004 outlines in greater detail how the NDA functions, what its responsibilities are, and how these fit into the overall structure of the UK programme for managing and disposing of the liabilities created by a significant element of the UK's early commercial and nuclear weapons activities. The amount of Government funding provided to the NDA will be a key factor in determining what can be achieved. In agreeing how the funds are distributed to the licensed sites, the NDA will need to keep in mind the 'guiding principles' stated in 'Managing the Nuclear Legacy - A Strategy for Action': - Focus on getting the job done to high safety, security and environmental standards; - Best value for money consistent with safety, security and environmental performance; - Openness and transparency. To satisfy these requirements there is a need for a transparent process for justifying and prioritising work that aids decisions about what should be done and when, is straightforward to understand and can be applied by a wide range of stakeholders. To develop such a process, a multi-stakeholder group (the 'Prioritisation Working Group') produced a report published in April 2005 that examined how the process would align with the NDA's overall management processes. It also identified six criteria or 'attributes' that should be taken into account, and a variety of measures, or 'metrics' that could be used to assess each attribute. The report formed the basis of preliminary guidance from NDA to the site licensees that was used to guide their submissions on plans and programmes of work in 2005. Since this report the NDA has been working, with stakeholder input, to develop a prioritisation process to be used during the production of future Life Cycle

  12. Measurements of oil spill spreading in a wave tank using digital image processing

    International Nuclear Information System (INIS)

    Flores, H.; Saavedra, I.; Andreatta, A.; Llona, G.

    1998-01-01

In this work, an experimental study of the spreading of crude oil is carried out in a wave tank. The tests are performed by spilling different volumes and types of crude oil on the water surface. An experimental measurement technique was developed based on digital processing of video images. The acquisition and processing of such images is carried out using a video camera and inexpensive microcomputer hardware and software. Processing is carried out by first applying a digital image filter and then performing edge detection on the filtered image data. The final result is a file that contains the coordinates of a polygon that encloses the observed slick for each time step. Different types of filters are used in order to adequately separate the color intensities corresponding to each of the elements in the image. Postprocessing of the vectorized images provides accurate measurements of the slick edge, thus obtaining a complete geometric representation, which is significantly different from simplified considerations of radially symmetric spreading. The spreading of the oil slick was recorded for each of the tests. Results of the experimental study are presented for each spreading regime, and analyzed in terms of wave parameters such as period and wave height. (author)
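
    As a rough modern equivalent of the filtering and edge-detection chain described above (the study used its own filters and hardware), the sketch below thresholds a colour band, filters the mask and keeps the largest contour as the slick polygon; the colour bounds and pixel scale are placeholders.

    ```python
    # Sketch of slick-edge extraction from a video frame (OpenCV >= 4).
    # Colour bounds are illustrative placeholders, not the study's values.
    import cv2
    import numpy as np

    def slick_polygon(frame_bgr, lower_hsv=(5, 80, 40), upper_hsv=(25, 255, 255)):
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
        mask = cv2.medianBlur(mask, 5)                        # digital filtering step
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None, 0.0
        slick = max(contours, key=cv2.contourArea)            # largest blob = slick
        return slick.reshape(-1, 2), cv2.contourArea(slick)   # polygon, area in px^2
    ```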

  13. A Hybrid Fuzzy Model for Lean Product Development Performance Measurement

    Science.gov (United States)

    Osezua Aikhuele, Daniel; Mohd Turan, Faiz

    2016-02-01

In the effort by manufacturing companies to meet emerging consumer demand for mass-customized products, many are turning to the application of lean in their product development process, and this is gradually moving from being a competitive advantage to a necessity. However, due to a lack of clear understanding of lean performance measurements, many of these companies are unable to implement and fully integrate the lean principles into their product development process. The literature shows that only a few studies have focused systematically on lean product development performance (LPDP) evaluation. In order to fill this gap, the study proposes a novel hybrid model based on the Fuzzy Reasoning Approach (FRA) and extensions of the Fuzzy-AHP and Fuzzy-TOPSIS methods for the assessment of LPDP. Unlike existing methods, the model considers the importance weight of each of the decision makers (experts), since the performance criteria/attributes are required to be rated and these experts have different levels of expertise. The rating is done using a new fuzzy Likert rating scale (membership scale), which is designed to address problems resulting from information loss/distortion due to closed-form scaling and the ordinal nature of the existing Likert scale.

  14. Research and development of safeguards measures for the large scale reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Masahiro; Sato, Yuji; Yokota, Yasuhiro; Masuda, Shoichiro; Kobayashi, Isao; Uchikoshi, Seiji; Tsutaki, Yasuhiro; Nidaira, Kazuo [Nuclear Material Control Center, Tokyo (Japan)

    1994-12-31

The Government of Japan agreed on the safeguards concepts for a commercial-size reprocessing plant under the bilateral agreement for cooperation between Japan and the United States. In addition, LASCAR, the forum on large scale reprocessing plant safeguards, obtained fruitful results in the spring of 1992. The research and development of safeguards measures for the Rokkasho Reprocessing Plant should progress with full regard to the concepts described in both documents. Basically, the material accountancy and monitoring system should be established based on NRTA and other measures in order to meet the timeliness goal for plutonium, together with an unattended-mode inspection approach based on an integrated containment/surveillance system coupled with radiation monitoring in order to reduce inspection effort. NMCC has been studying the following measures for large scale reprocessing plant safeguards: (1) a radiation gate monitor and integrated surveillance system; (2) near-real-time shipper/receiver difference monitoring; (3) a near-real-time material accountancy system operated for the bulk handling area; (4) a volume measurement technique for a large scale input accountancy vessel; (5) an in-process inventory estimation technique applied to process equipment such as the pulse column and evaporator; (6) a solution transfer monitoring approach applied to buffer tanks in the chemical process; and (7) a timely analysis technique such as a hybrid K-edge densitometer operated in the on-site laboratory. (J.P.N.)

  15. Development of control and data processing system for CO2 laser interferometer

    International Nuclear Information System (INIS)

    Chiba, Shinichi; Kawano, Yasunori; Tsuchiya, Katsuhiko; Inoue, Akira

    2001-11-01

A CO2 laser interferometer diagnostic has been operated to measure the central electron density in JT-60U plasmas. We have developed a control and data processing system for the CO2 laser interferometer with flexible functions for data acquisition, data processing and data transfer in accordance with the sequence of JT-60U discharges. The system is mainly composed of two UNIX workstations and CAMAC clusters, and high reliability was obtained by distributing the data processing functions between the workstations. Consequently, the control and data processing system can routinely provide electron density data immediately after a JT-60U discharge. Real-time feedback control of the electron density in JT-60U has also become available by using a reference density signal from the CO2 laser interferometer. (author)

  16. Velocity Mapping Toolbox (VMT): a processing and visualization suite for moving-vessel ADCP measurements

    Science.gov (United States)

    Parsons, D.R.; Jackson, P.R.; Czuba, J.A.; Engel, F.L.; Rhoads, B.L.; Oberg, K.A.; Best, J.L.; Mueller, D.S.; Johnson, K.K.; Riley, J.D.

    2013-01-01

    The use of acoustic Doppler current profilers (ADCP) for discharge measurements and three-dimensional flow mapping has increased rapidly in recent years and has been primarily driven by advances in acoustic technology and signal processing. Recent research has developed a variety of methods for processing data obtained from a range of ADCP deployments and this paper builds on this progress by describing new software for processing and visualizing ADCP data collected along transects in rivers or other bodies of water. The new utility, the Velocity Mapping Toolbox (VMT), allows rapid processing (vector rotation, projection, averaging and smoothing), visualization (planform and cross-section vector and contouring), and analysis of a range of ADCP-derived datasets. The paper documents the data processing routines in the toolbox and presents a set of diverse examples that demonstrate its capabilities. The toolbox is applicable to the analysis of ADCP data collected in a wide range of aquatic environments and is made available as open-source code along with this publication.
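
    The toolbox itself is distributed as open-source code; purely as an illustration of the vector rotation/projection step that such processing performs (not VMT's actual code), the following sketch projects east/north ADCP velocity components onto an assumed cross-section azimuth.

    ```python
    # Illustration of projecting ADCP velocities onto a cross-section line.
    import numpy as np

    def project_velocities(east_vel, north_vel, section_azimuth_deg):
        """east_vel, north_vel: arrays (depth cells x ensembles) in m/s.
        section_azimuth_deg: azimuth of the cross-section line, degrees from north."""
        az = np.deg2rad(section_azimuth_deg)
        # Unit vectors along the section (transverse) and normal to it (streamwise),
        # expressed as (east, north) components.
        along = np.array([np.sin(az), np.cos(az)])
        normal = np.array([np.cos(az), -np.sin(az)])
        u_transverse = east_vel * along[0] + north_vel * along[1]
        u_streamwise = east_vel * normal[0] + north_vel * normal[1]
        return u_streamwise, u_transverse
    ```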

  17. Determination of two-liquid mixture composition by assessing its dielectric parameters 2. modified measuring system for monitoring the dehydration process of bioethanol production

    Directory of Open Access Journals (Sweden)

    Vilitis O.

    2014-02-01

Full Text Available In Part 2 of the work we describe a modified measuring system for precise monitoring of the dehydration process in bioethanol production. It is based on the earlier proposed system for measuring the concentration of solutions and two-liquid mixtures using devices with capacitive sensors (1-300 pF), which provides a stable measuring resolution of ± 0.005 pF when measuring the capacitance of a sensor. In this part of the work we determine the additional requirements to be imposed on the measuring system for monitoring the ethanol dehydration process and controlling bioethanol production. The most important parameters of the developed measuring system are identified. An exemplary calculation is given for the thermocompensated calibration of the measuring devices. The results of tests have shown good performance of the developed measuring system.

  18. Complex permittivity measurement at millimetre-wave frequencies during the fermentation process of Japanese sake

    International Nuclear Information System (INIS)

    Kouzai, Masaki; Nishikata, Atsuhiro; Fukunaga, Kaori; Miyaoka, Shunsuke

    2007-01-01

Various chemical reactions occur simultaneously in barrels during the fermentation processes of alcoholic beverages. Chemical analyses are employed to monitor the change in chemical components, such as glucose and ethyl alcohol. Such tests are carried out on extracted specimens, and they are costly and time-consuming. We have developed a permittivity measurement system for liquid specimens in the frequency range from 2.6 to 50 GHz, and applied the system to fermentation monitoring. Experimental results proved that the observed change in complex permittivity suggests a decrease in the amount of glucose and an increase in alcohol content, which are the key chemical components during the fermentation process

  19. Patterns of Software Development Process

    Directory of Open Access Journals (Sweden)

    Sandro Javier Bolaños Castro

    2011-12-01

    Full Text Available "Times New Roman","serif";mso-fareast-font-family:"Times New Roman";mso-ansi-language:EN-US;mso-fareast-language:EN-US;mso-bidi-language:AR-SA">This article presents a set of patterns that can be found to perform best practices in software processes that are directly related to the problem of implementing the activities of the process, the roles involved, the knowledge generated and the inputs and outputs belonging to the process. In this work, a definition of the architecture is encouraged by using different recurrent configurations that strengthen the process and yield efficient results for the development of a software project. The patterns presented constitute a catalog, which serves as a vocabulary for communication among project participants [1], [2], and also can be implemented through software tools, thus facilitating patterns implementation [3]. Additionally, a tool that can be obtained under GPL (General Public license is provided for this purpose

  20. Development of a measure of asthma-specific quality of life among adults.

    Science.gov (United States)

    Eberhart, Nicole K; Sherbourne, Cathy D; Edelen, Maria Orlando; Stucky, Brian D; Sin, Nancy L; Lara, Marielena

    2014-04-01

    A key goal in asthma treatment is improvement in quality of life (QoL), but existing measures often confound QoL with symptoms and functional impairment. The current study addresses these limitations and the need for valid patient-reported outcome measures by using state-of-the-art methods to develop an item bank assessing QoL in adults with asthma. This article describes the process for developing an initial item pool for field testing. Five focus group interviews were conducted with a total of 50 asthmatic adults. We used "pile sorting/binning" and "winnowing" methods to identify key QoL dimensions and develop a pool of items based on statements made in the focus group interviews. We then conducted a literature review and consulted with an expert panel to ensure that no key concepts were omitted. Finally, we conducted individual cognitive interviews to ensure that items were well understood and inform final item refinement. Six hundred and sixty-one QoL statements were identified from focus group interview transcripts and subsequently used to generate a pool of 112 items in 16 different content areas. Items covering a broad range of content were developed that can serve as a valid gauge of individuals' perceptions of the effects of asthma and its treatment on their lives. These items do not directly measure symptoms or functional impairment, yet they include a broader range of content than most existent measures of asthma-specific QoL.

  1. Environmental opportunities questionnaire: development of a measure of the environment supporting early motor development in the first year of life.

    Science.gov (United States)

    Doralp, Samantha; Bartlett, Doreen J

    2013-09-01

    The development and testing of a measure evaluating the quality and variability in the home environment as it relates to the motor development of infants during the first year of life. A sample of 112 boys and 95 girls with a mean age of 7.1 months (SD 1.8) and GA of 39.6 weeks (SD 1.5) participated in the study. The measurement development process was divided into three phases: measurement development (item generation or selection of items from existing measurement tools), pilot testing to determine acceptability and feasibility to parents, and exploratory factor analysis to organize items into meaningful concepts. Test-retest reliability and internal consistency were also determined. The environmental opportunities questionnaire (EOQ) is a feasible 21-item measure comprised of three factors including opportunities in the play space, sensory variety and parental encouragement. Overall, test-retest reliability was 0.92 (CI 0.84-0.96) and the internal consistency is 0.79. The EOQ emphasizes quality of the environment and access to equipment and toys that have the potential to facilitate early motor development. The preliminary analyses reported here suggest more work could be done on the EOQ to strengthen its use for research or clinical purposes; however, it is adequate for use in its current form. Implications for Rehabilitation New and feasible 21-item questionnaire that enables identification of malleable environmental factors that serve as potential points of intervention for children that are not developing typically. Therapeutic tool for use by therapists to inform and guide discussions with caregivers about potential influences of environmental, social and attitudinal factors in their child's early development.

  2. [Research on the range of motion measurement system for spine based on LabVIEW image processing technology].

    Science.gov (United States)

    Li, Xiaofang; Deng, Linhong; Lu, Hu; He, Bin

    2014-08-01

A measurement system based on image processing technology and developed in LabVIEW was designed to quickly obtain the range of motion (ROM) of the spine. The NI-Vision module was used to pre-process the original images and calculate the angles of marked needles in order to obtain the ROM data. Six human cadaveric thoracic spine segments (T7-T10) were subjected to six kinds of loads: left/right lateral bending, flexion, extension, and clockwise/counterclockwise torsion. The system was used to measure the ROM of segment T8-T9 under loads from 1 Nm to 5 Nm. The experimental results showed that the system is able to measure the ROM of the spine accurately and quickly, providing a simple and reliable tool for spine biomechanics investigators.
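
    As an illustration of the underlying geometry only (the paper's processing is done with LabVIEW/NI-Vision), the ROM follows from the change in needle angle between a reference image and a loaded image once the needle endpoints are known; the endpoint coordinates below are invented.

    ```python
    # Sketch of the angle computation behind such image-based ROM measurements.
    # Endpoint detection itself is assumed to have been performed already.
    import math

    def needle_angle(p1, p2):
        """Angle of the line through image points p1, p2, in degrees."""
        return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

    def range_of_motion(ref_pts, loaded_pts):
        a0 = needle_angle(*ref_pts)
        a1 = needle_angle(*loaded_pts)
        d = (a1 - a0 + 180.0) % 360.0 - 180.0   # wrap difference to (-180, 180]
        return abs(d)

    # Invented endpoint coordinates (pixels) for the reference and loaded frames.
    rom = range_of_motion(((120, 40), (135, 210)), ((118, 42), (160, 205)))
    ```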

  3. Information paths within the new product development process

    DEFF Research Database (Denmark)

    Jespersen, Kristina Risom

    2007-01-01

collection platform to obtain measurements from within the NPD process. Forty-two large, international companies participated in the data-collecting simulation. Results revealed five different information paths that did not connect all stages of the NPD process. Moreover, the results show that the front-end does not drive information acquisition through the stages of the NPD process, and that environmental turbulence disconnects stages from the information paths. This implies that information is at the same time a key to success and a key to entrapment in the NPD process.

  4. Development of position measurement unit for flying inertial fusion energy target

    International Nuclear Information System (INIS)

    Tsuji, R; Endo, T; Yoshida, H; Norimatsu, T

    2016-01-01

We report the present status of the development of a position measurement unit (PMU) for a flying inertial fusion energy (IFE) target. The PMU, which uses the Arago spot phenomenon, is designed to have a measurement accuracy smaller than 1 μm. By employing divergent, pulsed orthogonal laser beam illumination, we can measure the time and the target position at the moment of the pulsed illumination. The two-dimensional Arago spot image is compressed into a one-dimensional image by a cylindrical lens for real-time processing. The PMUs are set along the injection path of the flying target. The local positions of the target in each PMU are transferred to the controller and analysed to calculate the target trajectory. Two methods are presented to calculate the arrival time and the arrival position of the target at the reactor centre. (paper)

  5. Development of position measurement unit for flying inertial fusion energy target

    Science.gov (United States)

    Tsuji, R.; Endo, T.; Yoshida, H.; Norimatsu, T.

    2016-03-01

We report the present status of the development of a position measurement unit (PMU) for a flying inertial fusion energy (IFE) target. The PMU, which uses the Arago spot phenomenon, is designed to have a measurement accuracy smaller than 1 μm. By employing divergent, pulsed orthogonal laser beam illumination, we can measure the time and the target position at the moment of the pulsed illumination. The two-dimensional Arago spot image is compressed into a one-dimensional image by a cylindrical lens for real-time processing. The PMUs are set along the injection path of the flying target. The local positions of the target in each PMU are transferred to the controller and analysed to calculate the target trajectory. Two methods are presented to calculate the arrival time and the arrival position of the target at the reactor centre.

  6. Tracer measurements compared to process data reconciliation in accordance with VDI 2048

    International Nuclear Information System (INIS)

    Hungerbuehler, Thomas; Langenstein, Magnus

    2007-01-01

The feed water mass flow is the key measured variable used to determine the thermal reactor output of a nuclear power plant. Usually this parameter is recorded via venturi nozzles or orifice plates. The problem with both measurement principles, however, is that an accuracy of better than 1% cannot be reached. For nuclear power plants, depending on the size of the plant, this corresponds to an electrical output of 4 MWel to 16 MWel. In order to make more accurate statements about the feed water amounts recirculated in the water-steam circuit, tracer measurements offering an accuracy of up to 0.2% are used. A drawback of this approach is that the measuring principle provides only an instantaneous picture and no continuous operating information about the feed water mass flow. Process data reconciliation based on VDI 2048 is a mathematical-statistical process that makes use of redundant process information. The uncertainty of the reconciled feed water flow rates, and of the thermal reactor output calculated on this basis, can be reduced to 0.4%. The overall process monitored continuously in this manner therefore provides hourly process information of a quality equal to that obtained with acceptance measurements. At the Beznau NPP both methods were used in parallel to determine the feed water flow rates in 2004 (unit 1) and 2005 (unit 2). Comparison shows a high level of agreement between the results of the reconciliation and those of the tracer measurements. For this reason it was decided that no further tracer measurements will be conducted. As a result of the findings of this comparison, a high level of acceptance of process data reconciliation based on VDI 2048 was achieved. (author)
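
    As a rough sketch of what such a reconciliation does (the authoritative formulation is VDI 2048 itself), the raw measurement vector x with covariance matrix S_x is corrected to a vector v that satisfies the process constraints f(v) = 0 while minimizing the weighted correction; in the linearized case the correction has a closed form:

    ```latex
    \min_{v}\;(v - x)^{\mathsf T} S_x^{-1} (v - x)
    \quad \text{subject to} \quad f(v) = 0,
    \qquad
    v \;=\; x - S_x F^{\mathsf T}\!\left(F S_x F^{\mathsf T}\right)^{-1} f(x),
    \qquad
    F = \left.\frac{\partial f}{\partial v}\right|_{x}
    ```

    The reduced uncertainty of the reconciled feed-water flow quoted above comes from this exploitation of redundancy; VDI 2048 additionally prescribes a chi-square-based consistency check on the weighted penalty term.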

  7. Splitting: The Development of a Measure.

    Science.gov (United States)

    Gerson, Mary-Joan

    1984-01-01

    Described the development of a scale that measures splitting as a psychological structure. The construct validity of the splitting scale is suggested by the positive relationship between splitting scores and a diagnostic measure of the narcissistic personality disorder, as well as a negative relationship between splitting scores and levels of…

  8. How does the scientific progress in developing countries affect bibliometric impact measures of developed countries? A counterfactual case study on China

    Energy Technology Data Exchange (ETDEWEB)

    Stahlschmidt, S.; Hinze, S.

    2016-07-01

    Many developed countries have found their bibliometric impact measures improving over the last decade. The BRICS states, the economically largest group of developing countries, observe a similar pattern. This uniform growth seems puzzling, as not every country can improve its relative performance against all others. A possible explanation might be found in the dynamic environment and especially in the exponential growth of Chinese publications. We analyze how this unprecedented growth of contributions from a single country, with its specific bibliometric characteristics, affects the whole bibliometric measurement process. We show that, because Chinese publications are comparatively lowly cited, the overall corpus of scientific publications grows especially in its lower tail, and we argue that this unequal increase in publications particularly benefits the bibliometric impact measures of developed countries. The actual magnitude of this effect is derived by contrasting the actual bibliometric world with a counterfactual one without China. (Author)
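
    The counterfactual argument can be illustrated by recomputing a relative impact indicator after removing one country's papers from the world reference set. The sketch below uses synthetic citation counts and a simple "mean citations relative to the world mean" indicator; it is not the authors' method or data.

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)

    # Synthetic corpus: country label and citation count per paper (assumed rates).
    papers = pd.DataFrame({
        "country": ["DE"] * 500 + ["CN"] * 2000,
        "citations": np.concatenate([rng.poisson(12, 500), rng.poisson(4, 2000)]),
    })

    def relative_impact(df: pd.DataFrame, country: str) -> float:
        """Mean citations of `country` divided by the mean of the whole corpus."""
        return df.loc[df.country == country, "citations"].mean() / df["citations"].mean()

    actual = relative_impact(papers, "DE")
    counterfactual = relative_impact(papers[papers.country != "CN"], "DE")
    print(f"impact with the lowly cited corpus included: {actual:.2f}")
    print(f"impact in the counterfactual world:          {counterfactual:.2f}")
    ```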

  9. Measured spatial variability of beach erosion due to aeolian processes.

    NARCIS (Netherlands)

    de Vries, S.; Verheijen, A.H.; Hoonhout, B.M.; Vos, S.E.; Cohn, Nicholas; Ruggiero, P; Aagaard, T.; Deigaard, R.; Fuhrman, D.

    2017-01-01

    This paper shows the first results of measured spatial variability of beach erosion due to aeolian processes during the recently conducted SEDEX2 field experiment at Long Beach, Washington, U.S.A. Beach erosion and sedimentation were derived using a series of detailed terrestrial LIDAR measurements

  10. A neuroconstructivist model of past tense development and processing.

    Science.gov (United States)

    Westermann, Gert; Ruh, Nicolas

    2012-07-01

    We present a neural network model of learning and processing the English past tense that is based on the notion that experience-dependent cortical development is a core aspect of cognitive development. During learning the model adds and removes units and connections to develop a task-specific final architecture. The model provides an integrated account of characteristic errors during learning the past tense, adult generalization to pseudoverbs, and dissociations between verbs observed after brain damage in aphasic patients. We put forward a theory of verb inflection in which a functional processing architecture develops through interactions between experience-dependent brain development and the structure of the environment, in this case, the statistical properties of verbs in the language. The outcome of this process is a structured processing system giving rise to graded dissociations between verbs that are easy and verbs that are hard to learn and process. In contrast to dual-mechanism accounts of inflection, we argue that describing dissociations as a dichotomy between regular and irregular verbs is a post hoc abstraction and is not linked to underlying processing mechanisms. We extend current single-mechanism accounts of inflection by highlighting the role of structural adaptation in development and in the formation of the adult processing system. In contrast to some single-mechanism accounts, we argue that the link between irregular inflection and verb semantics is not causal and that existing data can be explained on the basis of phonological representations alone. This work highlights the benefit of taking brain development seriously in theories of cognitive development. Copyright 2012 APA, all rights reserved.

  11. Cognitive Process of Development in Children

    Science.gov (United States)

    Boddington, Eulalee N.

    2009-01-01

    In this article we explored the theories of Arnold Gesell, Erik Erickson and Jean Piaget about how human beings develop. In this component we will analyze the cognitive processes of how children perceive and develop, in particular children from a cross-cultural background. How learning takes place, and how the influences of culture, and…

  12. MCO closure welding process parameter development and qualification

    International Nuclear Information System (INIS)

    CANNELL, G.R.

    2003-01-01

    One of the key elements in the SNF process is final closure of the MCO by welding. Fuel is loaded into the MCO (approximately 2 ft. in diameter and 13 ft. long) and a heavy shield plug is inserted into the top, creating a mechanical seal. The plug contains several process ports for various operations, including vacuum drying and inert-gas backfilling of the packaged fuel. When fully processed, the Canister Cover Assembly (CCA) is placed over the shield plug and final closure is made by welding. The following reports on the joint effort of Amer Industrial Technology (AIT) and Fluor Hanford (FH) to develop and qualify the welding process for making the final closure, with primary emphasis on developing a set of robust parameters for deposition of the root pass. Work was carried out in three phases: (1) initial welding process and equipment selection with subsequent field demonstration testing; (2) development and qualification of a specific process technique and parameters; and (3) validation of the process and parameters at the CSB under mock production conditions. This work establishes the process technique and parameters that provide a high level of confidence that acceptable MCO closure welds will be made on a consistent and repeatable basis.

  13. Sustaining Innovation: Developing an Instructional Technology Assessment Process

    Science.gov (United States)

    Carmo, Monica Cristina

    2013-01-01

    This case study developed an instructional technology assessment process for the Gevirtz Graduate School of Education (GGSE). The theoretical framework of Adelman and Taylor (2001) guided the development of this instructional technology assessment process and the tools to aid in its facilitation. GGSE faculty, staff, and graduate students…

  14. Development of measures to evaluate youth advocacy for obesity prevention.

    Science.gov (United States)

    Millstein, Rachel A; Woodruff, Susan I; Linton, Leslie S; Edwards, Christine C; Sallis, James F

    2016-07-26

    Youth advocacy has been successfully used in substance use prevention but is a novel strategy in obesity prevention. As a precondition for building an evidence base for youth advocacy for obesity prevention, the present study aimed to develop and evaluate measures of youth advocacy mediator, process, and outcome variables. The Youth Engagement and Action for Health (YEAH!) program (San Diego County, CA) engaged youth and adult group leaders in advocacy for school and neighborhood improvements to nutrition and physical activity environments. Based on a model of youth advocacy, scales were developed to assess mediators, intervention processes, and proximal outcomes of youth advocacy for obesity prevention. Youth (baseline n = 136) and adult group leaders (baseline n = 47) completed surveys before and after advocacy projects. With baseline data, we created youth advocacy and adult leadership subscales using confirmatory factor analysis (CFA) and described their psychometric properties. Youth came from 21 groups, were ages 9-22, and most were female. Most youth were non-White, and the largest ethnic group was Hispanic/Latino (35.6%). The proposed factor structure held for most (14/20 youth and 1/2 adult) subscales. Modifications were necessary for 6 of the originally proposed 20 youth and 1 of the 2 adult multi-item subscales, which involved splitting larger subscales into two components and dropping low-performing items. Internally consistent scales to assess mediators, intervention processes, and proximal outcomes of youth advocacy for obesity prevention were developed. The resulting scales can be used in future studies to evaluate youth advocacy programs.

  15. Gravity measurement, processing and evaluation: Test cases de Peel and South Limburg

    Science.gov (United States)

    Nohlmans, Ron

    1990-05-01

    A general overview is given of the process of measuring and adjusting a gravity network and of computing some output parameters of gravimetry: gravity values, gravity anomalies and mean block anomalies. An overview of developments in gravimetry to date, globally and in the Netherlands, is given. The basic theory of relative gravity measurements is studied and the most commonly used instrument, the LaCoste and Romberg gravimeter, is described. The surveys done in the scope of this study are described. A more detailed impression of the adjustment procedure and of the adjustment results is given. A closer look is taken at the more geophysical side of gravimetry: gravity reduction, the computation of anomalies and the correlation with elevation. The interpolation of gravity and the covariance of gravity anomalies are also addressed.
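
    The network adjustment mentioned above is, at its core, a weighted least-squares fit of observed gravity differences to station gravity values with one station held fixed. A minimal sketch with invented observations follows (no instrument drift or scale parameters, unlike a full LaCoste and Romberg adjustment).

    ```python
    import numpy as np

    g_ref = 981250.000          # gravity of fixed reference station 0 (mGal), assumed

    # Observed differences (from station, to station, value in mGal, std dev in mGal).
    obs = [
        (0, 1, 12.345, 0.015),
        (1, 2, -3.210, 0.015),
        (0, 2,  9.150, 0.020),
        (2, 3, 20.005, 0.015),
        (1, 3, 16.770, 0.020),
    ]

    n_unknown = 3               # gravity of stations 1..3 relative to station 0
    A = np.zeros((len(obs), n_unknown))
    l = np.zeros(len(obs))
    w = np.zeros(len(obs))
    for i, (frm, to, diff, sd) in enumerate(obs):
        if frm != 0:
            A[i, frm - 1] = -1.0
        if to != 0:
            A[i, to - 1] = 1.0
        l[i] = diff
        w[i] = 1.0 / sd**2

    # Weighted least-squares solution of the observation equations A x = l.
    N = A.T @ np.diag(w) @ A
    x = np.linalg.solve(N, A.T @ np.diag(w) @ l)
    print("adjusted station gravity values (mGal):", np.round(g_ref + x, 3))
    ```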

  16. Development of liquid film thickness measurement technique by high-density multipoint electrodes method

    International Nuclear Information System (INIS)

    Arai, Takahiro; Furuya, Masahiro; Kanai, Taizo

    2010-01-01

    A high-density multipoint electrode method was developed to measure liquid film thickness transients on a curved surface. The devised method measures the spatial distribution of the liquid film through the conductance between electrode pairs. The sensor was designed and fabricated as a multilayer printed circuit board, where electrode pairs were distributed in a reticular pattern with narrow intervals. In order to measure many electrode pairs at a high sampling rate, the signal-processing method used by the wire-mesh sensor measurement system was applied. Electrochemical impedance spectrometry showed that a sampling rate of 1000 slices/s is feasible without signal distortion by the electric double layer. The method was validated with two experimental campaigns: (1) a droplet impingement on a flat film and (2) a jet impingement on a rod-shaped sensor surface. In the former experiment, a water droplet 4 mm in diameter impinged onto a 1 mm thick film layer. A visual observation study with a high-speed video camera showed that, after the liquid impingement, the thinning of the water layer was clearly captured by the sensor. For the latter experiment, the flexible circuit board was bent into a cylindrical shape to measure the water film on a simulated fuel rod in bundle geometry. A water jet 3 mm in diameter impinged onto the rod-shaped sensor surface. The process of wetting-area enlargement on the rod surface was demonstrated in the same manner as shown by the video frames. (author)
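
    In a conductance-based film sensor of this kind, each electrode pair is converted to a local film thickness through a calibration curve obtained with known liquid layers. The sketch below shows only that per-point conversion for one frame of data; the calibration points, frame size and values are all invented.

    ```python
    import numpy as np

    # Hypothetical calibration: normalized conductance vs. film thickness (mm),
    # measured beforehand with known liquid layers on the sensor surface.
    cal_conductance = np.array([0.00, 0.15, 0.35, 0.60, 0.80, 0.95, 1.00])
    cal_thickness   = np.array([0.00, 0.10, 0.25, 0.50, 0.80, 1.20, 1.50])

    def frame_to_thickness(conductance_frame: np.ndarray) -> np.ndarray:
        """Map one frame of normalized conductances to a film thickness map (mm)."""
        return np.interp(conductance_frame, cal_conductance, cal_thickness)

    # One synthetic 8 x 8 frame from the electrode matrix (e.g. at 1000 frames/s).
    rng = np.random.default_rng(1)
    frame = np.clip(rng.normal(0.4, 0.1, size=(8, 8)), 0.0, 1.0)
    thickness_map = frame_to_thickness(frame)
    print(f"mean film thickness: {thickness_map.mean():.3f} mm")
    ```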

  17. A perspective on PSE in pharmaceutical process development and innovation

    DEFF Research Database (Denmark)

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John

    2012-01-01

    The pharmaceutical industry is under growing pressure to increase efficiency, both in production and in process development. This paper discusses the central role of Process Systems Engineering (PSE) methods and tools in pharmaceutical process development and innovation, and searches for answers to questions such as: Which PSE methods can be applied readily? Where is more method development needed? The paper covers key subjects for the development of economically and environmentally sustainable pharmaceutical processes, including Process Analytical Technology in its broadest sense, continuous pharmaceutical manufacturing and green processes, and is illustrated with a series of short examples taken from the literature and ongoing research projects.

  18. Developing a practice guideline for the occupational health services by using a community of practice approach: a process evaluation of the development process

    Directory of Open Access Journals (Sweden)

    Lydia Kwak

    2017-01-01

    Background: One way to facilitate the translation of research into occupational health service practice is through clinical practice guidelines. To increase the implementability of guidelines it is important to include the end-users in the development, for example through a community of practice approach. This paper describes the development of an occupational health practice guideline aimed at the management of non-specific low back pain (LBP) using a community of practice approach. The paper also includes a process evaluation of the development, providing insight into the feasibility of the process. Methods: A multidisciplinary community of practice group (n = 16) consisting of occupational nurses, occupational physicians, ergonomists/physical therapists, health and safety engineers, health educators, psychologists and researchers from different types of occupational health services and geographical regions within Sweden met eleven times (June 2012–December 2013) to develop the practice guideline following the recommendations of guideline development handbooks. The process outcomes recruitment, reach, context, satisfaction, feasibility and fidelity were assessed by questionnaire, observations and administrative data. Results: Group members attended on average 7.5 out of 11 meetings. Half experienced support from their workplace for their involvement. Feasibility was rated as good, except for time-scheduling. Most group members were satisfied with the structure of the process (e.g. presentations, multidisciplinary group). Fidelity was rated as fairly high. Conclusions: The described development process is a feasible process for guideline development. For future guideline development, expectations of the work involved should be more clearly communicated, as well as the purpose and tasks of the CoP group. Moreover, possibilities to improve support from managers and colleagues should be explored. This paper has important implications for future

  19. Development and Use of a Goal Setting/Attainment Process Designed To Measure a Teacher's Ability To Engage in Professional Growth and Leadership Initiatives.

    Science.gov (United States)

    Minix, Nancy; And Others

    The process used to evaluate progress in identifying the goals to be used in evaluating teacher performance under the Kentucky Career Ladder Program is described. The process pertains to two areas of teacher development: (1) professional growth and development, and (2) professional leadership and initiative. A total of 1,650 individuals were asked…

  20. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Appendices; Volume 2

    Science.gov (United States)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g. missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  1. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Volume 1; Appendices

    Science.gov (United States)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  2. Rheological Properties of Extreme Pressure Greases Measured Using a Process Control Rheometer

    DEFF Research Database (Denmark)

    Glasscock, Julie; Smith, Robin S.

    2012-01-01

    A new process control rheometer (PCR) designed for use in industrial process flows has been used to measure the rheological properties of three extreme-pressure greases. The rheometer is a robust yet sensitive instrument designed to operate in an industrial processing environment in either in-line or on-line configurations. The PCR was able to measure rheological properties including the elastic modulus, viscous modulus, and complex viscosity of the greases, which in an industrial flow application could be used as variables in a feedback system to control the process and the quality

  3. Development of Measures to Assess Personal Recovery in Young People Treated in Specialist Mental Health Services.

    Science.gov (United States)

    John, Mary; Jeffries, Fiona W; Acuna-Rivera, Marcela; Warren, Fiona; Simonds, Laura M

    2015-01-01

    Recovery has become a central concept in mental health service delivery, and several recovery-focused measures exist for adults. The concept's applicability to young people's mental health experience has been neglected, and no measures yet exist. The aim of this work is to develop measures of recovery for use in specialist child and adolescent mental health services. On the basis of 21 semi-structured interviews, three recovery measures were devised, one for completion by the young person and two for completion by the parent/carer. Two parent/carer measures were devised in order to assess both their perspective on their child's recovery and their own recovery process. The questionnaires were administered to a UK sample of 47 young people (10-18 years old) with anxiety and depression and their parents, along with a measure used to routinely assess treatment progress and outcome and a measure of self-esteem. All three measures had high internal consistency (alpha ≥ 0.89). Young people's recovery scores were correlated negatively with scores on a measure used to routinely assess treatment progress and outcome (r = -0.75) and positively with self-esteem (r = 0.84). Parent and young persons' reports of the young person's recovery were positively correlated (r = 0.61). Parent report of the young person's recovery and of their own recovery process were positively correlated (r = 0.75). The three measures have the potential to be used in mental health services to assess recovery processes in young people with mental health difficulties and correspondence with symptomatic improvement. The measures provide a novel way of capturing the parental/caregiver perspective on recovery and caregivers' own wellbeing. No tools exist to evaluate recovery-relevant processes in young people treated in specialist mental health services. This study reports on the development and psychometric evaluation of three self-report recovery-relevant assessments for young
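
    The internal-consistency figure quoted above (alpha ≥ 0.89) is Cronbach's alpha. As a brief illustration of how such a coefficient is obtained from item-level questionnaire responses, the sketch below computes it on synthetic Likert data, not the study's data.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for a (respondents x items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

    # Synthetic responses: 47 respondents answering 10 Likert items (1-5),
    # driven by a common latent factor so the items cohere.
    rng = np.random.default_rng(2)
    latent = rng.normal(size=(47, 1))
    responses = np.clip(np.round(3 + latent + rng.normal(scale=0.7, size=(47, 10))), 1, 5)
    print(f"alpha = {cronbach_alpha(responses):.2f}")
    ```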

  4. Mass measurement on the rp-process waiting point ⁷²Kr

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, D. [Gesellschaft fuer Schwerionenforschung mbH, Darmstadt (Germany); Kolhinen, V.S. [Jyvaeskylae Univ. (Finland); Audi, G. [CSNSM-IN2P3-Centre National de la Recherche Scientifique (CNRS), 91 - Orsay (FR)] [and others

    2004-06-01

    The mass of one of the three major waiting points in the astrophysical rp-process, ⁷²Kr, was measured for the first time with the Penning trap mass spectrometer ISOLTRAP. The measurement yielded a relative mass uncertainty of δm/m = 1.2 x 10⁻⁷ (δm = 8 keV). Other Kr isotopes, also needed for astrophysical calculations, were measured with more than one order of magnitude improved accuracy. We use the ISOLTRAP masses of ⁷²⁻⁷⁴Kr to reanalyze the role of the ⁷²Kr waiting point in the rp-process during X-ray bursts. (orig.)

  5. Development of a wireless blood pressure measuring device with smart mobile device.

    Science.gov (United States)

    İlhan, İlhan; Yıldız, İbrahim; Kayrak, Mehmet

    2016-03-01

    Today, smart mobile devices (telephones and tablets) are very commonly used due to their powerful hardware and useful features. According to an eMarketer report, in 2014 there were 1.76 billion smartphone users (excluding users of tablets) in the world; it is predicted that this number will rise by 15.9% to 2.04 billion in 2015. It is thought that these devices can be used successfully in biomedical applications. A wireless blood pressure measuring device used together with a smart mobile device was developed in this study. By means of an interface developed for smart mobile devices with Android and iOS operating systems, a smart mobile device was used both as an indicator and as a control device. The cuff communicating with this device through Bluetooth was designed to measure blood pressure from the arm. A digital filter was used on the cuff instead of the traditional analog signal processing and filtering circuit. The newly developed blood pressure measuring device was tested on 18 patients and 20 healthy individuals of different ages under a physician's supervision. When the test results were compared with measurements made using a sphygmomanometer, an average accuracy of 93.52% in sick individuals and 94.53% in healthy individuals was achieved with the new device. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
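
    The digital filter mentioned above replaces the analog signal-conditioning circuit of classical oscillometric devices. One common approach, sketched below with synthetic data, is to band-pass the cuff pressure signal so that the small cardiac oscillations are separated from the slow deflation ramp; this is an assumed, generic scheme, not the device's actual firmware.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 100.0                                  # cuff sampling rate (Hz), assumed
    t = np.arange(0, 30, 1 / fs)

    # Synthetic cuff signal: slow deflation ramp plus small cardiac oscillations.
    deflation = 180 - 3.0 * t                   # mmHg
    pulses = 1.5 * np.sin(2 * np.pi * 1.2 * t)  # ~72 bpm oscillometric component
    cuff = deflation + pulses + np.random.default_rng(3).normal(0, 0.2, t.size)

    # Band-pass 0.5-5 Hz to isolate the oscillometric pulses from the ramp.
    b, a = butter(2, [0.5 / (fs / 2), 5.0 / (fs / 2)], btype="band")
    oscillations = filtfilt(b, a, cuff)

    # The envelope of these oscillations versus cuff pressure is what oscillometric
    # algorithms analyse to estimate systolic and diastolic pressure.
    print(f"peak oscillation amplitude: {np.abs(oscillations).max():.2f} mmHg")
    ```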

  6. Process Techno – Innovation Using TQM in Developing Countries

    Directory of Open Access Journals (Sweden)

    Fasil Taddese

    2010-08-01

    Techno-innovation has been a competitive edge for most manufacturing companies. Rapid advancement in technology innovation, geared up with global mega-competition, has resulted in unprecedented economic growth in which TQM has played a major role. Despite the slow economic growth in developing countries, caused by the incapability to develop their own technology, failure to make wise decisions in adopting competent technology, and inability to properly utilize adopted technologies, tremendous developments are seen in some of them. Examples include Indian companies that won the prestigious Deming Prize and Japan Quality Medal after adopting the necessary technologies from Japan under TQM. We address process techno-innovation with a 4M (Man, Machine, Method, Material) and 1E (working condition/corporate culture) approach. Results indicate that TQM affects process techno-innovation primarily through its effect on human resources and on working conditions/corporate culture. Three stage gates, namely process understanding, process improvement and technology learning, and process techno-innovation, are the mechanisms through which TQM promotes process techno-innovation in developing countries.

  7. Design of Temperature Measurement System on the Drying Process of Madura Tobacco Leaves

    OpenAIRE

    Wardana, Humadillah Kurniadi; Endarko, Endarko

    2015-01-01

    The quality of dried chopped tobacco leaves is an important factor. The present work developed a drying oven to measure and evaluate the drying shrinkage characteristics of chopped Madura tobacco leaves. The oven has three racks for analyzing and monitoring the rate of drying shrinkage of Madura tobacco. Each rack holds a different amount of chopped leaves: 120 g on the top rack, 100 g on the middle rack and 80 g on the bottom rack. The rate of drying shrinkage was analyzed for 20 min...

  8. Measurement plans for process flow improvement in services and health care

    NARCIS (Netherlands)

    Kemper, B.; de Mast, J.

    2013-01-01

    The discussion of performance measurement is often on a conceptual, not operational, level; advice on the operational and practical matters of obtaining data for process flow improvement is scarce. We define a measurement plan and study four measurement study designs and corresponding methods and

  9. Improving new product development (NPD process by analyzing failure cases

    Directory of Open Access Journals (Sweden)

    Yeon-Hak Kim

    2017-01-01

    Purpose - The purpose of this study is to develop an appropriate new product development (NPD) process for Company "T", a medium-sized firm, by analyzing the existing NPD process and failure cases of the Company. Design/methodology/approach - The proposed research framework is as follows: first, prospective studies of the NPD process are performed using the existing literature and preliminary references; second, a comparative analysis between the current processes and an NPD process is performed; third, phase-based evaluations of failed product cases are conducted with an NPD process so as to identify the abridged steps and root causes of failures; finally, renewed priorities are set by utilizing analytic hierarchy process (AHP) analysis and questionnaire analysis of the identified causes of failures. Findings - The resulting accomplishments include the establishment of NPD processes that resonate with the current state of Company "T", which, in turn, ensures increased efficiency, decreased development duration and a strategy of capacity concentration and priority selection. Originality/value - As Company "T"'s development process is outdated and products are developed without adequate market information research and feasibility analysis, the percentage of failed development projects is as high as 87 per cent. Thus, this study aims to develop an appropriate NPD process for Company "T" by analyzing the existing NPD process and failure cases of the Company.
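
    Since the renewed priorities are obtained with the analytic hierarchy process, a minimal sketch of the standard AHP computation (priority weights as the principal eigenvector of a pairwise comparison matrix, plus the consistency ratio) may be useful; the 3 x 3 comparison values below are invented, not the study's data.

    ```python
    import numpy as np

    # Hypothetical pairwise comparison of three failure causes on Saaty's 1-9 scale.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # Priority weights = normalized principal eigenvector.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Consistency ratio (random index RI = 0.58 for a 3 x 3 matrix).
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)
    cr = ci / 0.58
    print("priority weights:", np.round(weights, 3), f"consistency ratio: {cr:.3f}")
    ```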

  10. Virtual instrumentation technique used in the nuclear digital signal processing system design: Energy and time measurement tests

    International Nuclear Information System (INIS)

    Pechousek, J.; Prochazka, R.; Prochazka, V.; Frydrych, J.

    2011-01-01

    In this report, a computer-based digital signal processing system with a 200 MS/s sampling digitizer is presented. The virtual instrumentation technique is used to easily develop a system which provides spectroscopy measurements such as amplitude and time signal analysis, with a time-of-flight facility. Several test measurements were performed to determine the characteristics of the system. The presented system may find application in coincidence measurements, since it is usable with different types of detectors and is sensitive to decay lifetimes from tens of nanoseconds to seconds.

  11. The Process of Trust Development

    DEFF Research Database (Denmark)

    Jagd, Søren; Højland, Jeppe

    ... by high trust and co-operation? In this paper we explore the process of trust development during an organisational change project in a Danish SME by looking at two kinds of trust relations: employee trust in management and trust relations among employees. We find substantial differences in trust in management among employees. Trust is found to be higher among employees interacting regularly with managers, as in the project coordination group. It is found that personal relations are very important for the development of trust. The success of the project may be explained by the involvement of an ‘elite’ ... and discuss with colleagues from other departments and develop personal knowledge of each other ...

  12. Intrapreneurial competencies: development and validation of a measurement scale

    Directory of Open Access Journals (Sweden)

    Tomás Vargas-Halabí

    2017-07-01

    Purpose - Few models have attempted to explain intrapreneurial behavior from the perspective of competencies. Therefore, the purpose of this paper is to contribute along this line by developing and validating a scale to measure intrapreneurial competencies for a Costa Rican organizational context. Design/methodology/approach - A three-stage process was followed. The first stage comprised literature review, expert judgment, cognitive interviews, and back-translation. In the second stage, the questionnaire was administered to a sample of 543 university professionals who worked mainly in private organizations in Costa Rica. The third stage evaluated the proposed scale's psychometric properties, including an exploratory factor analysis performed with SPSS 19, confirmatory factor analysis by means of structural equation modeling using EQS 6.2, and finally a linear regression model, performed with SPSS 19, to obtain evidence of external criterion-related validity. Findings - This study provides evidence of five sub-dimensions of employee attributes, i.e., "opportunity promoter", "proactivity", "flexibility", "drive", and "risk taking", which together constitute a higher-level construct called intrapreneurial competencies. The scale provided evidence of convergent, discriminant, and criterion-related validity; the latter was assessed using an employee innovative behavior scale. Originality/value - The model offers a first step for continuing studies that aim at developing a robust model of intrapreneurial competencies. The potential predictive capacity of an instrument of this nature would be useful for the business sector, particularly as a diagnostic instrument to strengthen processes of staff development in areas that promote the development of innovation and the creation of new businesses for the company.

  13. High-power ultrasonic processing: Recent developments and prospective advances

    Science.gov (United States)

    Gallego-Juarez, Juan A.

    2010-01-01

    Although the application of ultrasonic energy to produce or enhance a wide variety of processes has been explored since about the middle of the 20th century, only a reduced number of ultrasonic processes have been established at the industrial level. However, during the last ten years interest in ultrasonic processing has revived, particularly in industrial sectors where ultrasonic technology may represent a clean and efficient tool to improve classical existing processes or an innovative alternative for the development of new processes. Such seems to be the case in relevant sectors such as the food industry, environment, pharmaceuticals and chemicals manufacture, machinery, mining, etc., where power ultrasound is becoming an emerging technology for process development. Possibly the major problem in the application of high-intensity ultrasound to industrial processing is the design and development of efficient power ultrasonic systems (generators and reactors) capable of successful large-scale operation specifically adapted to each individual process. In the area of ultrasonic processing in fluid media, and more specifically in gases, the development of stepped-plate transducers and other power generators with extensive radiating surfaces has strongly contributed to the implementation, at the semi-industrial and industrial stage, of several commercial applications in sectors such as the food and beverage industry (defoaming, drying, extraction, etc.), environment (air cleaning, sludge filtration, etc.), and machinery and processes for manufacturing (textile washing, paint manufacture, etc.). The development of different cavitational reactors for liquid treatment in continuous flow is helping to introduce into industry the wide potential of the area of sonochemistry. Processes such as water and effluent treatment, crystallization, soil remediation, etc., have already been implemented at the semi-industrial and/or industrial stage. Other single advances in sectors like mining or energy have

  14. EMPRESS: A European Project to Enhance Process Control Through Improved Temperature Measurement

    Science.gov (United States)

    Pearce, J. V.; Edler, F.; Elliott, C. J.; Rosso, L.; Sutton, G.; Andreu, A.; Machin, G.

    2017-08-01

    A new European project called EMPRESS, funded by the EURAMET program `European Metrology Program for Innovation and Research,' is described. The 3 year project, which started in the summer of 2015, is intended to substantially augment the efficiency of high-value manufacturing processes by improving temperature measurement techniques at the point of use. The project consortium has 18 partners and 5 external collaborators, from the metrology sector, high-value manufacturing, sensor manufacturing, and academia. Accurate control of temperature is key to ensuring process efficiency and product consistency and is often not achieved to the level required for modern processes. Enhanced efficiency of processes may take several forms including reduced product rejection/waste; improved energy efficiency; increased intervals between sensor recalibration/maintenance; and increased sensor reliability, i.e., reduced amount of operator intervention. Traceability of temperature measurements to the International Temperature Scale of 1990 (ITS-90) is a critical factor in establishing low measurement uncertainty and reproducible, consistent process control. Introducing such traceability in situ (i.e., within the industrial process) is a theme running through this project.

  15. Inline Monitors for Measuring Cs-137 in the SRS Caustic Side Solvent Extraction Process

    Energy Technology Data Exchange (ETDEWEB)

    Casella, V

    2006-04-24

    The Department of Energy (DOE) selected Caustic-Side Solvent Extraction (CSSX) as the preferred technology for the removal of radioactive cesium from High-Level Waste (HLW) at the Savannah River Site (SRS). Before the full-scale Salt Waste Processing Facility (SWPF) becomes operational, a portion of dissolved saltcake waste will be processed through a Modular CSSX Unit (MCU). The MCU employs the CSSX process, a continuous process that uses a novel solvent to extract cesium from waste and concentrate it in dilute nitric acid. Of primary concern is Cs-137 which makes the solution highly radioactive. Since the MCU does not have the capacity to wait for sample results while continuing to operate, the Waste Acceptance Strategy is to perform inline analyses. Gamma-ray monitors are used to: measure the Cs-137 concentration in the decontaminated salt solution (DSS) before entering the DSS Hold Tank; measure the Cs-137 concentration in the strip effluent (SE) before entering the SE Hold Tank; and verify proper operation of the solvent extraction system by verifying material balance within the process. Since this gamma ray monitoring system application is unique, specially designed shielding was developed and software was written and acceptance tested by Savannah River National Laboratory (SRNL) personnel. The software is a LabView-based application that serves as a unified interface for controlling the monitor hardware and communicating with the host Distributed Control System. This paper presents the design, fabrication and implementation of this monitoring system.

  16. Decoherence assisting a measurement-driven quantum evolution process

    International Nuclear Information System (INIS)

    Roa, Luis; Olivares-Renteria, G. A.

    2006-01-01

    We study the problem of driving an unknown initial mixed quantum state onto a known pure state without using unitary transformations. This can be achieved, in an efficient manner, with the help of sequential measurements on at least two unbiased bases. However here we found that, when the system is affected by a decoherence mechanism, only one observable is required in order to achieve the same goal. In this way the decoherence can assist the process. We show that, depending on the sort of decoherence, the process can converge faster or slower than the method implemented by means of two complementary observables

  17. Data-processing system for bubble-chamber photographs based on PUOS-4 measuring projectors and an ES-1045 computer

    International Nuclear Information System (INIS)

    Ermolov, P.F.; Kozlov, V.V.; Rukovichkin, V.P.

    1988-01-01

    A system is described that was developed at the Scientific-Research Institute of Nuclear Physics for processing of the data recorded on stereoscopic photographs from large bubble chambers and hybrid spectrometers using PUOS-4 measuring projectors, an Elektronika-60 microcomputer, and an ES-1045 computer. The system structure, the main programmable interfaces, and the intercomputer communications are examined. The mean-square error of the measuring channels of the system, determined from calibration measurements, is within 1.3-3.5 μm; the standard deviation of the coordinates of the measured points with respect to the track in the plane of the photograph is 6 μm. The system is widely used at the institute for analysis of data from experiments in high-energy physics performed with the European Hybrid Spectrometer and the Mirabel large bubble chamber. Approximately 80,000 stereoscopic photographs have been processed and the system is being prepared to process data from the Skat bubble chamber and a spectrometer with a vertex detector that is under construction

  18. Development of the in vivo measurement system of bone mineral content using monoenergetic gamma rays

    International Nuclear Information System (INIS)

    Nardocci, A.C.

    1990-08-01

    A system, developed for in vivo measurement of bone mineral content (BMC) using monoenergetic gamma-rays of 241 Am, is described. It presents a discussion of the theoretical and practical aspects of the technique, with details of acquisition and data processing and also discusses the calibration procedure used. The results obtained with in vivo measurements are presented and BMC values of clinically normal subjects and chronic renal patients are compared. (author)

  19. Evaluation of fracturing process of soft rocks at great depth by AE measurement and DEM simulation

    International Nuclear Information System (INIS)

    Aoki, Kenji; Mito, Yoshitada; Kurokawa, Susumu; Matsui, Hiroya; Niunoya, Sumio; Minami, Masayuki

    2007-01-01

    The authors developed a stress-based evaluation system for the EDZ by AE monitoring and Distinct Element Method (DEM) simulation. In order to apply this system to a soft rock site, the authors try to grasp the relationship between AE parameters, stress change and the rock fracturing process by performing high-stiffness tri-axial compression tests, including AE measurements, on soft rock samples, and by simulating them with DEM using a bonded particle model. As a result, it is found that the change in the predominant AE frequency is effective for evaluating the fracturing process in sedimentary soft rocks, and the relationship between stress change and the fracturing process is also clarified. (author)

  20. A Software Development Simulation Model of a Spiral Process

    OpenAIRE

    Carolyn Mizell; Linda Malone

    2009-01-01

    This paper will present a discrete event simulation model of a spiral development lifecycle that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process. There is a need for simulation models of software development processes other than the waterfall due to new processes becoming more widely used in order to overcome the limitations of the traditional waterfall lifecycle. The use of a spiral process can make the inherently difficult job of...

  1. Development, validation and routine control of a radiation process

    International Nuclear Information System (INIS)

    Kishor Mehta

    2010-01-01

    Today, radiation is used in industrial processing for a variety of applications, from low doses for blood irradiation to very high doses for materials modification and even higher doses for gemstone colour enhancement. At present, radiation is mainly provided by either radionuclides or machine sources; cobalt-60 is the most predominant radionuclide in use. Currently, there are several hundred irradiation facilities worldwide. Similar to other industries, quality management systems can assist radiation processing facilities in enhancing customer satisfaction and maintaining and improving product quality. To help fulfill quality management requirements, several national and international organizations have developed various standards related to radiation processing. They all have requirements and guidelines for development, validation and routine control of the radiation process. For radiation processing, these three phases involve the following activities. The development phase includes selecting the type of radiation source, the irradiation facility and the dose required for the process. The validation phase includes conducting activities that give assurance that the process will be successful. Routine control then involves activities that provide evidence that the process has been successfully realized. These standards require documentary evidence that process validation and process control have been followed. Dosimetry information gathered during these processes provides this evidence. (authors)

  2. Development of Processed Products from Guapple

    Directory of Open Access Journals (Sweden)

    Teresita Acevedo

    1995-12-01

    The study aimed to develop processed products from guapple. Characterization of the guapple fruit was initially conducted before proceeding to formulation studies. The following characteristics of the guapple fruit were observed: color of outer skin - yellow green with Munsell notation of 10 Y7/6; color of inner flesh - off white with Munsell notation of 7.5Y 8/2; texture, 20.4-37.1 mm; average weight per piece, from 219 to 420 g; pH, 3.7; titratable acidity (citric acid), 0.34%; and soluble solids, 2.6° Brix. The identified processed products from guapple were puree, pickles, and preserves. Standardized processes and formulations for each of these products were developed at semi-pilot scale. Removal of the skin for the guapple preserves and pickles was facilitated using 5% brine-1% CaCl2. Suitable packaging materials were also identified. Flexible films were used for guapple puree, while glass jars and flexible films were found to be satisfactory for both guapple pickles and preserves. Physico-chemical, microbiological, and sensory evaluations were done after two months of storage. Based on these tests, the pasteurization process of 180°F for 20 minutes for puree and 10 minutes for preserves and pickles was found to make the products commercially sterile.

  3. Biocatalytic process development using microfluidic miniaturized systems

    DEFF Research Database (Denmark)

    Krühne, Ulrich; Heintz, Søren; Ringborg, Rolf Hoffmeyer

    2014-01-01

    The increasing interest in biocatalytic processes means there is a clear need for a new systematic development paradigm which encompasses both protein engineering and process engineering. This paper argues that through the use of a new microfluidic platform, data can be collected more rapidly...

  4. Application of passive sonar technology to mineral processing and oil sands applications : if you can measure it, you can manage it

    Energy Technology Data Exchange (ETDEWEB)

    O' Keefe, C.; Viega, J.; Fernald, M. [CiDRA Corp., Wallingford, CT (United States)

    2007-07-01

    SONAR-based flow and entrained air measurement instruments were described. This new class of industrial flow and compositional analyzers was developed by CiDRA to provide new measurement insight and quantifiable value to industrial process operators. Passive sonar array-based processing units have been installed worldwide in several industrial applications and are particularly suited for a wide range of mineral processing applications, including slurry flow rate measurement and fluid characterization. This paper also described the SONAR-based, clamp-on SONARtrac technology, a scalable platform that provides several other value added measurements and information such as speed of sound, entrained air/gas, gas hold-up, and velocity profile. Oil sands, tailings and bitumen slurries present considerable measurement challenges for in-line flow measurement devices in terms of measurement accuracy, reliability and maintenance. The sonar-based technology platform has been used in a variety of oil sands processes, hydrotransport, and minerals beneficiation applications. This paper described these applications with particular reference to difficult slurry flow measurement and control in the areas of comminution and flotation such as mill discharge, hydrocyclone feed/overflow, final concentrate, thickener discharge, and tailings. 5 refs., 4 tabs., 23 figs.

  5. Measurement and evaluation of sustainable development

    International Nuclear Information System (INIS)

    Kondyli, Julia

    2010-01-01

    This paper develops a methodology to analyse, measure and evaluate sustainable development (SD). A holistic approach (systems analysis) is applied to operationalise the SD concept and an integrated approach (composite indicator construction) is adopted for the measurement of SD. The operationalisation of the SD concept is based on an in-depth systems analysis of issues associated with economic, social and environmental problems in a policy context. The composite indicator (overall sustainability index) is developed based on the three composite sub-indicators of the SD dimensions. The valuation of the SD is based both on the aggregated sub-indicators and the overall composite indicator. The methodology is used to evaluate the SD of the North Aegean islands between different temporal points. The assessment of the change in the islands' SD is based on a quartile grading scale of the overall SD composite scores.

  6. On the development of a magnetoresistive sensor for blade tip timing and blade tip clearance measurement systems

    Science.gov (United States)

    Tomassini, R.; Rossi, G.; Brouckaert, J.-F.

    2016-10-01

    A simultaneous blade tip timing (BTT) and blade tip clearance (BTC) measurement system enables the determination of turbomachinery blade vibrations and ensures the monitoring of the running gaps between the blade tips and the casing. This contactless instrumentation presents several advantages compared to the well-known telemetry system with strain gauges, at the cost of a more complex data processing procedure. The probes used can be optical, capacitive, eddy current or microwave, each with its dedicated electronics, and many different signal processing algorithms exist. Every company working in this field has developed its own processing method and sensor technology. Hence, when the same test is repeated with different instrumentation, the answer is often different. Moreover, it is rarely possible to achieve reliability for in-service measurements. Developments are focused on innovative instrumentation and a common standard. This paper focuses on the results achieved using a novel magnetoresistive sensor for simultaneous tip timing and tip clearance measurements. The sensor measurement principle is described. The sensitivity to gap variation is investigated. In terms of measurement of vibrations, experimental investigations were performed at the Air Force Institute of Technology (ITWL, Warsaw, Poland) in a real aeroengine and in the von Karman Institute (VKI) R2 compressor rig. The advantages and limitations of the magnetoresistive probe for turbomachinery testing are highlighted.
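
    In blade tip timing, the quantity of interest is the deviation of each blade's measured arrival time from its expected arrival time, which, multiplied by the tip tangential speed, gives the tip deflection. The sketch below shows only this basic conversion with invented rotor parameters and synthetic arrival times, not the authors' processing chain.

    ```python
    import numpy as np

    # Rotor and probe parameters (assumed values).
    radius = 0.25                           # blade tip radius (m)
    rpm = 6000.0
    omega = 2 * np.pi * rpm / 60.0          # angular speed (rad/s)
    n_blades = 24

    # Expected arrival times of each blade at the probe for one revolution.
    t_expected = np.arange(n_blades) * (2 * np.pi / n_blades) / omega

    # Measured arrival times (synthetic): expected times plus vibration-induced shifts.
    rng = np.random.default_rng(4)
    t_measured = t_expected + rng.normal(0.0, 0.4e-6, n_blades)

    # Tip deflection of each blade: arrival-time deviation times tip tangential speed.
    deflection = (t_measured - t_expected) * omega * radius
    print("tip deflections (um):", np.round(deflection * 1e6, 2))
    ```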

  7. Sustainable Chemical Process Development through an Integrated Framework

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; Kumar Tula, Anjan; Anantpinijwatna, Amata

    2016-01-01

    This paper describes the development and the application of a general integrated framework based on systematic model-based methods and computer-aided tools with the objective of achieving more sustainable process designs and improving process understanding. The developed framework can be applied ...; the studies involve multiphase reaction systems for the synthesis of active pharmaceutical ingredients.

  8. Gaussian process based intelligent sampling for measuring nano-structure surfaces

    Science.gov (United States)

    Sun, L. J.; Ren, M. J.; Yin, Y. H.

    2016-09-01

    Nanotechnology is the science and engineering of manipulating matter at the nano scale, which can be used to create many new materials and devices with a vast range of applications. As nanotech products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopes, are limited to relatively small areas of hundreds of micrometers with very low efficiency. Therefore intelligent sampling strategies are required to improve the scanning efficiency for measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method makes use of Gaussian process based Bayesian regression as a mathematical foundation to represent the surface geometry, and the posterior estimate of the Gaussian process is computed by combining the prior probability distribution with the likelihood function. Each sampling point is then adaptively selected by determining the candidate position most likely to lie outside the required tolerance zone, and is inserted to update the model iteratively. Simulations on both the nominal surface and a manufactured surface have been conducted on nano-structure surfaces to verify the validity of the proposed method. The results imply that the proposed method significantly improves the measurement efficiency in measuring large-area structured surfaces.
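
    The selection rule described above (pick the candidate most likely to violate the tolerance, measure it, update the model) can be sketched in one dimension with an off-the-shelf Gaussian process regressor. Everything below is an assumed toy setup: a synthetic surface, an assumed reconstruction tolerance, and one plausible reading of the selection criterion, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel as C

    def surface(x):
        """Synthetic 1-D profile standing in for the measured surface height (nm)."""
        return 5.0 * np.sin(3.0 * x) + 0.5 * x

    tol = 0.5                                  # allowed reconstruction error (nm), assumed
    candidates = np.linspace(0.0, 10.0, 400).reshape(-1, 1)

    # Start from a coarse uniform sample, then add points adaptively.
    X = np.linspace(0.0, 10.0, 5).reshape(-1, 1)
    y = surface(X).ravel()

    gp = GaussianProcessRegressor(kernel=C(10.0) * RBF(1.0), alpha=1e-4)
    for _ in range(15):
        gp.fit(X, y)
        _, std = gp.predict(candidates, return_std=True)
        # Probability that the model error at each candidate exceeds the tolerance.
        p_out = 2.0 * (1.0 - norm.cdf(tol, loc=0.0, scale=std))
        x_next = candidates[np.argmax(p_out)]
        X = np.vstack([X, [x_next]])
        y = np.append(y, surface(x_next))

    print(f"{len(X)} points sampled; worst exceedance probability {p_out.max():.3f}")
    ```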

  9. Development of advanced spent fuel management process. System analysis of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Ro, S.G.; Kang, D.S.; Seo, C.S.; Lee, H.H.; Shin, Y.J.; Park, S.W.

    1999-03-01

    The system analysis of an advanced spent fuel management process to establish a non-proliferation model for long-term spent fuel management is performed by comparing several dry processes, such as a salt transport process, a lithium process, the IFR process developed in America, and DDP developed in Russia. In our system analysis, the non-proliferation concept is focused on the separation factor between uranium and plutonium and the decontamination factors of the products in each process, and the non-proliferation model for long-term spent fuel management has finally been introduced. (Author). 29 refs., 17 tabs., 12 figs

  10. Plasma assisted measurements of alkali metal concentrations in pressurized combustion processes

    International Nuclear Information System (INIS)

    Hernberg, R.; Haeyrinen, V.

    1995-01-01

    The plasma assisted method for continuous measurement of alkali metal concentrations in product gas flows of pressurized energy processes will be tested and applied at the 1.6 MW PFBC/G facility at Delft University of Technology in the Netherlands. Measurements will be performed during 1995 and 1996 at different stages of the research programme. The results are expected to give information about the influence of different process conditions on the generation of alkali metal vapours, the comparison of different methods for alkali measurement and the specific performance of our system. The project belongs to the Joule II extension program under contract JOU2-CT93-0431. (author)

  11. Process Development for Nanostructured Photovoltaics

    Energy Technology Data Exchange (ETDEWEB)

    Elam, Jeffrey W.

    2015-01-01

    Photovoltaic manufacturing is an emerging industry that promises a carbon-free, nearly limitless source of energy for our nation. However, the high-temperature manufacturing processes used for conventional silicon-based photovoltaics are extremely energy-intensive and expensive. This high cost imposes a critical barrier to the widespread implementation of photovoltaic technology. Argonne National Laboratory and its partners recently invented new methods for manufacturing nanostructured photovoltaic devices that allow dramatic savings in materials, process energy, and cost. These methods are based on atomic layer deposition, a thin film synthesis technique that has been commercialized for the mass production of semiconductor microelectronics. The goal of this project was to develop these low-cost fabrication methods for the high efficiency production of nanostructured photovoltaics, and to demonstrate these methods in solar cell manufacturing. We achieved this goal in two ways: 1) we demonstrated the benefits of these coatings in the laboratory by scaling-up the fabrication of low-cost dye sensitized solar cells; 2) we used our coating technology to reduce the manufacturing cost of solar cells under development by our industrial partners.

  12. Soft sensor design by multivariate fusion of image features and process measurements

    DEFF Research Database (Denmark)

    Lin, Bao; Jørgensen, Sten Bay

    2011-01-01

    This paper presents a multivariate data fusion procedure for design of dynamic soft sensors where suitably selected image features are combined with traditional process measurements to enhance the performance of data-driven soft sensors. A key issue of fusing multiple sensor data, i.e. to determine...... with a multivariate analysis technique from RGB pictures. The color information is also transformed to hue, saturation and intensity components. Both sets of image features are combined with traditional process measurements to obtain an inferential model by partial least squares (PLS) regression. A dynamic PLS model...... oxides (NOx) emission of cement kilns. On-site tests demonstrate improved performance over soft sensors based on conventional process measurements only....
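
    The fusion step itself, concatenating image-derived features with conventional process measurements and regressing the target by PLS, can be sketched with scikit-learn as below. The feature names and data are synthetic stand-ins, not the cement-kiln dataset, and the real design also involves dynamic (lagged) terms that are omitted here.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(5)
    n = 200

    # Conventional process measurements (e.g. temperatures, draft, fuel rate) - synthetic.
    process = rng.normal(size=(n, 4))
    # Image features extracted from flame pictures (e.g. mean hue, saturation, intensity).
    image = rng.normal(size=(n, 3))

    # Target to be inferred (e.g. NOx emission), generated here from both feature sets.
    y = (1.5 * process[:, 0] - 0.8 * process[:, 2]
         + 2.0 * image[:, 0] + 0.5 * image[:, 2]
         + rng.normal(scale=0.3, size=n))

    # Fuse the two measurement sources by concatenation, then fit a PLS soft sensor.
    X = np.hstack([process, image])
    pls = PLSRegression(n_components=4).fit(X[:150], y[:150])
    pls_proc = PLSRegression(n_components=3).fit(process[:150], y[:150])

    print("held-out R^2, fused features :", round(pls.score(X[150:], y[150:]), 3))
    print("held-out R^2, process only   :", round(pls_proc.score(process[150:], y[150:]), 3))
    ```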

  13. Testing a model of componential processing of multi-symbol numbers-evidence from measurement units.

    Science.gov (United States)

    Huber, Stefan; Bahnmueller, Julia; Klein, Elise; Moeller, Korbinian

    2015-10-01

    Research on numerical cognition has addressed the processing of nonsymbolic quantities and symbolic digits extensively. However, magnitude processing of measurement units is still a neglected topic in numerical cognition research. Hence, we investigated the processing of measurement units to evaluate whether typical effects of multi-digit number processing such as the compatibility effect, the string length congruity effect, and the distance effect are also present for measurement units. In three experiments, participants had to single out the larger one of two physical quantities (e.g., lengths). In Experiment 1, the compatibility of number and measurement unit (compatible: 3 mm_6 cm with 3 mm) as well as string length congruity (congruent: 1 m_2 km with m 2 characters) were manipulated. We observed reliable compatibility effects with prolonged reaction times (RT) for incompatible trials. Moreover, a string length congruity effect was present in RT with longer RT for incongruent trials. Experiments 2 and 3 served as control experiments showing that compatibility effects persist when controlling for holistic distance and that a distance effect for measurement units exists. Our findings indicate that numbers and measurement units are processed in a componential manner and thus highlight that processing characteristics of multi-digit numbers generalize to measurement units. Thereby, our data lend further support to the recently proposed generalized model of componential multi-symbol number processing.

  14. 3D MEMS in Standard Processes: Fabrication, Quality Assurance, and Novel Measurement Microstructures

    Science.gov (United States)

    Lin, Gisela; Lawton, Russell A.

    2000-01-01

    Three-dimensional MEMS microsystems that are commercially fabricated require minimal post-processing and are easily integrated with CMOS signal processing electronics. Measurements to evaluate the fabrication process (such as cross-sectional imaging and device performance characterization) provide much needed feedback in terms of reliability and quality assurance. MEMS technology is bringing a new class of microscale measurements to fruition. The relatively small size of MEMS microsystems offers the potential for higher fidelity recordings compared to macrosize counterparts, as illustrated in the measurement of muscle cell forces.

  15. THE RELIABILITY AND ACCURACY OF THE TRIPLE MEASUREMENTS OF ANALOG PROCESS VARIABLES

    Directory of Open Access Journals (Sweden)

    V. A. Anishchenko

    2017-01-01

    Full Text Available The increase in unit capacity of electrical equipment, together with the growing complexity of technological processes and of the devices that control and manage them in power plants and substations, demonstrates the need to improve the reliability and accuracy of the measurement information characterizing the state of the objects being managed. This objective is particularly important for nuclear power plants, where the price of inaccurate measurement of critical process variables is especially high and an error might lead to irreparable consequences. Alongside improvements in the underlying hardware, the reliability and accuracy of measurements are improved by methods of operational validation. These methods are based on the use of information redundancy (structural, topological, temporal). In particular, information redundancy can be achieved by the simultaneous measurement of one analog variable by two devices (duplication) or three devices (triplication), i.e., triple redundancy. The problem of operational control of a triple-redundant system for measuring electrical analog variables (currents, voltages, active and reactive power and energy) is considered as a special case of signal processing by ordered sampling on the basis of majority transformation and near-majority transformation. Difficulties in monitoring the reliability of measurements are associated with two tasks. First, one needs to justify the degree of truncation of the distributions of random measurement errors and the allowable residuals of the pairwise differences of the measurement results. The second task consists in forming an algorithm for joint processing of the set of separate measurements determined to be valid. The quality of control is characterized by the reliability (validity) and the accuracy of the measuring system. Taken separately, these indicators might lead to opposite results. A compromise solution is therefore proposed
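
    A toy sketch of the triplication idea (not the authors' algorithm; the residual threshold and the simple averaging of the readings declared valid are illustrative assumptions):

        def fuse_triple(x1, x2, x3, max_residual):
            """Screen three redundant readings of one analog variable and fuse the valid ones."""
            readings = [x1, x2, x3]
            # a reading is declared valid if it agrees with at least one other reading
            valid = [x for x in readings
                     if sum(abs(x - y) <= max_residual for y in readings) >= 2]
            if not valid:
                return None                      # no majority agreement: flag the channel
            return sum(valid) / len(valid)       # joint estimate from the valid readings

        print(fuse_triple(10.02, 10.05, 12.90, max_residual=0.1))  # -> 10.035, outlier rejected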

  16. Relational description of the measurement process in quantum field theory

    International Nuclear Information System (INIS)

    Gambini, Rodolfo; Porto, Rafael A.

    2002-01-01

    We have recently introduced a realistic, covariant, interpretation for the reduction process in relativistic quantum mechanics. The basic problem for a covariant description is the dependence of the states on the frame within which collapse takes place. A suitable use of the causal structure of the devices involved in the measurement process allowed us to introduce a covariant notion for the collapse of quantum states. However, a fully consistent description in the relativistic domain requires the extension of the interpretation to quantum fields. The extension is far from straightforward. Besides the obvious difficulty of dealing with the infinite degrees of freedom of the field theory, one has to analyse the restrictions imposed by causality concerning the allowed operations in a measurement process. In this paper we address these issues. We shall show that, in the case of partial causally connected measurements, our description allows us to include a wider class of causal operations than the one resulting from the standard way of computing conditional probabilities. This alternative description could be experimentally tested. A verification of this proposal would give stronger support to the realistic interpretations of the states in quantum mechanics. (author)

  17. THE PROGRAM-TARGET PLANNING AND MANAGEMENT OF DEVELOPMENT OF MEASURING EQUIPMENT PARK

    Directory of Open Access Journals (Sweden)

    Marichev Pavel Aleksandrovich

    2018-02-01

    Full Text Available Subject: study of the Park of Measuring Equipment (PME), which includes hundreds of thousands of standard samples, measuring instruments, control and measuring devices and other measuring mechanisms with different areas of application, levels of reliability, service life, levels of technical perfection and levels of technical condition. Research objectives: 1. Development of a set of mathematical models to simulate the development of the PME, to control indicators of PME performance as a whole, and to purposefully control the life-cycle stages of measuring equipment samples. 2. Development of a method which, with a sufficient degree of validity and objectivity, solves the tasks of managing procurement and repairs, both in preparing proposals for preliminary long-term plan documents (LTPD) and in ensuring control over the implementation of adopted plans. The method being developed should be fairly simple to use, easily adjustable to problems of different dimensions, suitable for solving the optimal control problem for the PME as a whole or for a part of it, and also suitable for solving a generalized problem for certain “aggregated objects” such as the Metrology Centers. Materials and methods: mathematical simulation, comparative analysis, the simplex method for linear programming, and program-target planning were used. Results: an approach to program-target planning based on solving a series of linear programming problems has been developed, and its use is presented both for formulating proposals for the preliminary LTPD and for introducing revisions (amendments) to annual plans implemented in the framework of the state defense order. Conclusions: the described method and algorithms constitute an effective tool for solving practical problems of target-oriented management of PME performance
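
    A hedged, toy-scale illustration of the linear-programming core of such planning (costs, effectiveness weights, budget and bounds are invented; the real models are far larger):

        from scipy.optimize import linprog

        # decision variables: x = [instruments procured, instruments repaired]
        c = [-3.0, -1.5]               # maximize effectiveness -> minimize negative weights
        A_ub = [[120.0, 40.0]]         # unit costs of procurement and repair
        b_ub = [10000.0]               # planning-period budget
        bounds = [(0, 60), (0, 150)]   # limits from fleet size and repair capacity

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        print(res.x, -res.fun)         # optimal procurement/repair plan and its effectiveness score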

  18. Development of nonintrusive, scatter-independent techniques for measurement of liquid density inside dense sprays

    Science.gov (United States)

    Hartfield, Roy

    1994-01-01

    A nonintrusive optical technique for measuring the liquid density in sprays used to simulate LOX injector flows is under development. This manuscript is a report on work toward that development which is currently in progress. The technique is a scatter-independent, absorption-based approach which depends on the numerical inversion of a collection of absorption profiles. For the case in which visible radiation passes through liquid-gas interfaces so numerous in sprays, substantial reductions and alterations in the signal result from scattering even in the absence of absorption. To avoid these problems, X-Rays will be used as the absorbed radiation. The experimental process is simulated by integrating the absorption spectrum for a known distribution, adding instrument noise to this 'measurement', creating a projection from the 'measurement', filtering the projection, inverting the projection, and comparing the results with the original prescribed distribution.
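
    The simulation loop described above can be sketched numerically; the following assumes an axisymmetric density and an onion-peeling inversion, neither of which is specified in the report, so treat it only as an illustration:

        import numpy as np

        n, R = 50, 1.0
        r = np.linspace(0.0, R, n + 1)                           # annulus edges of the radial grid
        f_true = np.exp(-((0.5 * (r[:-1] + r[1:])) / 0.3) ** 2)  # prescribed liquid density

        # forward model: chord length of the ray at offset y[i] through annulus j
        y = r[:-1]
        W = np.zeros((n, n))
        for i in range(n):
            for j in range(i, n):
                W[i, j] = 2.0 * (np.sqrt(r[j + 1] ** 2 - y[i] ** 2)
                                 - np.sqrt(max(r[j] ** 2 - y[i] ** 2, 0.0)))

        projection = W @ f_true                                  # simulated line-of-sight absorption
        projection += 0.01 * np.random.default_rng(1).normal(size=n)   # instrument noise
        f_recovered = np.linalg.solve(W, projection)             # invert and compare with f_true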

  19. Development of the four group partitioning process at JAERI

    International Nuclear Information System (INIS)

    Kubota, Masumitsu; Morita, Yasuji; Yamaguchi, Isoo; Yamagishi, Isao; Fujiwara, T.; Watanabe, Masayuki; Mizoguchi, Kenichi; Tatsugae, Ryozo

    1999-01-01

    At JAERI, development of a partitioning method started about 24 years ago. From 1973 to 1984, a partitioning process was developed for separating elements in HLLW into 3 groups: TRU, Sr-Cs and others. The partitioning process consisted of three steps: solvent extraction of U and Pu with TBP, solvent extraction of Am and Cm with DIDPA, and adsorption of Sr and Cs with inorganic ion exchangers. The process was demonstrated with real HLLW. Since 1985, a four group partitioning process has been developed, in which a step for separating the Tc-PGM group was developed in addition to the three group separation. Effective methods for separating TRU, especially Np, and Tc have been developed. In this paper, the flow sheet of the four group partitioning and the results of tests with simulated and real HLLW in the NUCEF hot cell are shown. (J.P.N.)

  20. In situ resistance measurements of bronze process Nb-Sn-Cu-Ta multifilamentary composite conductors during reactive diffusion

    International Nuclear Information System (INIS)

    Tan, K S; Hopkins, S C; Glowacki, B A; Majoros, M; Astill, D

    2004-01-01

    The conditions under which the Nb3Sn intermetallic layer is formed by solid-state reactive diffusion processes in bronze process multifilamentary conductors greatly influence the performance of the conductors. By convention, isothermal heat treatment is used and often causes non-uniformity of A15 layers formed across the wire. Therefore, characterization and optimization of the conductor during the reactive diffusion processes is crucial in order to improve the overall performance of the conductor. In this paper, a different characterization approach and perhaps an optimization technique is presented, namely in situ resistance measurement by an alternating current (AC) method. By treating the components of such multifilamentary wires as a set of parallel resistors, the resistances of the components may be combined using the usual rules for resistors in parallel. The results show that the resistivity of the entire wire changes significantly during the reactive diffusion processes. The development of the Nb3Sn layer in bronze process Nb-Sn-Cu-Ta multifilamentary wires at different stages of the reactive diffusion processes has been monitored using measured resistivity changes, and correlated with results from DTA, ACS, SEM and EDS
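
    The parallel-resistor model mentioned above amounts to the usual reciprocal sum; a minimal sketch with invented component values:

        def parallel(resistances):
            """Effective resistance of components carrying current in parallel."""
            return 1.0 / sum(1.0 / r for r in resistances)

        # e.g. bronze matrix, unreacted Nb filaments and a growing Nb3Sn layer (values invented)
        print(parallel([0.8, 2.5, 15.0]))   # effective resistance of the composite wire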

  1. New limit theorems for regular diffusion processes with finite speed measure

    NARCIS (Netherlands)

    J.H. van Zanten (Harry)

    2000-01-01

    We derive limit theorems for diffusion processes that have a finite speed measure. First we prove a number of asymptotic properties of the density $\rho_t = d\mu_t/d\mu$ of the empirical measure $\mu_t$ with respect to the normalized speed measure $\mu$. These results are then used to derive

  2. Analysis and Automation of Calibration Process for Measurement Coils for Particle Accelerator Magnets

    CERN Document Server

    AUTHOR|(CDS)2226624; Azzam, Jamal; Chanthery, Elodie

    Various techniques are used to measure magnetic fields within the Magnetic Measurement (MM) section, all of which require the use of specific sensors. These sensors need to be calibrated in order to map the resulting signals to the characteristics of the magnetic field. Thus, the calibration procedure would benefit from being improved and made more stable, efficient, and technologically up-to-date. Consequently, to improve the efficiency of the calibration procedure and the work processes, it was first necessary to analyze them and identify potential issues and weaknesses to be further addressed. The calibration procedure mainly suffered from too many manual operations, which were potential sources of error, and from outdated readout and data acquisition software and hardware. Firstly, a new software program had to be developed using a C++ framework called Flexible Framework for Magnetic Measurements (FFMM), which would automate the data acquisition and analysis. Next, a feasibility study had to be done ...

  3. Microscopic image processing system for measuring nonuniform film thickness profiles: Image scanning ellipsometry

    International Nuclear Information System (INIS)

    Liu, A.H.; Plawsky, J.L.; Wayner, P.C. Jr.

    1993-01-01

    The long-term objective of this research program is to determine the stability and heat transfer characteristics of evaporating thin films. The current objective is to develop and use a microscopic image-processing system (IPS) which has two parts: an image analyzing interferometer (IAI) and an image scanning ellipsometer (ISE). The primary purpose of this paper is to present the basic concept of ISE, which is a novel technique to measure the two dimensional thickness profile of a non-uniform, thin film, from several nm up to several μm, in a steady state as well as in a transient state. It is a full-field imaging technique which can study every point on the surface simultaneously with high spatial resolution and thickness sensitivity, i.e., it can measure and map the 2-D film thickness profile. The ISE was tested by measuring the thickness profile and the refractive index of a nonuniform solid film

  4. The Development and Current State of Translation Process Research

    DEFF Research Database (Denmark)

    Lykke Jakobsen, Arnt

    2014-01-01

    Interest in process-oriented translation studies has been intense for the past almost half a century. Translation process research (TPR) is the label we have used to refer to a spe… itself, into regions like cognitive psychology, psycho- and neurolinguistics, and neuroscience, where the interest in what goes on in our heads is also very strong. … which simultaneously tracks the translator's eye movements across a screen displaying both a source text and the translator's emerging translation. This research method was developed as a means of qualifying and strengthening translation process hypotheses based on verbal reports by providing additional…

  5. Measuring the Return on Information Technology: A Knowledge-Based Approach for Revenue Allocation at the Process and Firm Level

    National Research Council Canada - National Science Library

    Pavlou, Paul A; Housel, Thomas J; Rodgers, Waymond; Jansen, Erik

    2005-01-01

    ...., firm or process level). To address this issue, the study aims to develop a method for allocating the revenue and cost of IT initiatives at any level of analysis using a common unit of measurement...

  6. Softwareland Chronicles: A Software Development Meta-Process Proposal

    Directory of Open Access Journals (Sweden)

    Bolanos Sandro

    2016-05-01

    Full Text Available This paper presents the software development meta-process (SD-MP) as a proposal to set up software projects. Within this proposal we offer conceptual elements that help solve the war of methodologies and processes in favor of an integrating viewpoint, where the main flaws associated with conventional and agile approaches are removed. Our newly developed software platform to support the meta-process is also presented together with three case studies involving projects currently in progress, where the framework proposed in SD-MP has been applied.

  7. Risk constraint measures developed for the outcome-based strategy for tank waste management

    International Nuclear Information System (INIS)

    Harper, B.L.; Gajewski, S.J.; Glantz, C.L.

    1996-09-01

    This report is one of a series of supporting documents for the outcome-based characterization strategy developed by PNNL. This report presents a set of proposed risk measures with risk constraint (acceptance) levels for use in the Value of Information process used in the NCS. The characterization strategy has developed a risk-based Value of Information (VOI) approach for comparing the cost-effectiveness of characterizing versus mitigating particular waste tanks or tank clusters. The preference between characterizing or mitigating in order to prevent an accident depends on the cost of those activities relative to the cost of the consequences of the accident. The consequences are defined as adverse impacts measured across a broad set of risk categories such as worker dose, public cancers, ecological harm, and sociocultural impacts. Within each risk measure, various "constraint levels" have been identified that reflect regulatory standards or conventionally negotiated thresholds of harm to Hanford resources and values. The cost of consequences includes the "costs" of exceeding those constraint levels as well as a strictly linear costing per unit of impact within each of the risk measures. In actual application, VOI-based decision making is an iterative process, with a preliminary low-precision screen of potential technical options against the major risk constraints, followed by VOI analysis to determine the cost-effectiveness of gathering additional information and to select a preferred technical option, and finally a posterior screen to determine whether the preferred option meets all relevant risk constraints and acceptability criteria
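
    A hedged, single-tank illustration of the value-of-information comparison described above (all probabilities and costs are invented; the report's actual risk measures are multi-dimensional):

        p_bad = 0.3                 # prior probability the tank actually needs mitigation
        cost_mitigate = 5.0e6       # cost of mitigating the tank
        cost_accident = 40.0e6      # consequence cost if a bad tank is left unmitigated
        cost_characterize = 1.0e6   # cost of gathering the additional information

        # expected cost of deciding now, without further characterization
        act_now = min(cost_mitigate, p_bad * cost_accident)

        # expected cost if (assumed perfect) characterization reveals the tank state first
        act_informed = p_bad * cost_mitigate

        value_of_information = act_now - act_informed
        print(value_of_information > cost_characterize)   # characterize only if True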

  8. Reflection of successful anticancer drug development processes in the literature.

    Science.gov (United States)

    Heinemann, Fabian; Huber, Torsten; Meisel, Christian; Bundschus, Markus; Leser, Ulf

    2016-11-01

    The development of cancer drugs is time-consuming and expensive. In particular, failures in late-stage clinical trials are a major cost driver for pharmaceutical companies. This puts a high demand on methods that provide insights into the success chances of new potential medicines. In this study, we systematically analyze publication patterns emerging along the drug discovery process of targeted cancer therapies, starting from basic research to drug approval - or failure. We find clear differences in the patterns of approved drugs compared with those that failed in Phase II/III. Feeding these features into a machine learning classifier allows us to predict the approval or failure of a targeted cancer drug significantly better than educated guessing. We believe that these findings could lead to novel measures for supporting decision making in drug development. Copyright © 2016 Elsevier Ltd. All rights reserved.
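
    In the same spirit, a minimal sketch of such a classifier on synthetic publication-count features (the real feature set and model in the study may differ):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(42)
        n_drugs = 300
        X = rng.poisson(lam=[5.0, 8.0, 3.0, 2.0], size=(n_drugs, 4)).astype(float)  # counts per phase
        y = (X[:, 1] + rng.normal(scale=2.0, size=n_drugs) > 8).astype(int)          # approved vs failed

        clf = LogisticRegression(max_iter=1000)
        auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
        print(auc)   # should beat the 0.5 of educated guessing on this synthetic data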

  9. Rational and systematic protein purification process development: the next generation.

    Science.gov (United States)

    Nfor, Beckley K; Verhaert, Peter D E M; van der Wielen, Luuk A M; Hubbuch, Jürgen; Ottens, Marcel

    2009-12-01

    Current biopharmaceutical manufacturing strongly relies on using purification platform processes, offering harmonization of practices and speed-to-market. However, the ability of such processes to respond quickly to anticipated higher quality and capacity demands is under question. Here, we describe novel approaches for purification process development that incorporate biothermodynamics, modern high throughput experimentation and simulation tools. Such development leads to production platform-specific databases containing thermodynamic protein descriptors of major host cell proteins over a range of experimental conditions. This will pave the way for in silico purification process development, providing better process understanding and the potential to respond quickly to product quality and market demands. Future efforts will focus on further improving this field and enabling a more rational approach to process development.

  10. Process-Based Quality (PBQ) Tools Development

    International Nuclear Information System (INIS)

    Cummins, J.L.

    2001-01-01

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts

  11. Development and Processing Improvement of Aerospace Aluminum Alloys

    Science.gov (United States)

    Lisagor, W. Barry; Bales, Thomas T.

    2007-01-01

    This final report, in multiple presentation format, describes a comprehensive multi-tasked contract study to improve the overall property response of selected aerospace alloys, explore further a newly-developed and registered alloy, and correlate the processing, metallurgical structure, and subsequent properties achieved with particular emphasis on the crystallographic orientation texture developed. Modifications to plate processing, specifically hot rolling practices, were evaluated for Al-Li alloys 2195 and 2297, for the recently registered Al-Cu-Ag alloy, 2139, and for the Al-Zn-Mg-Cu alloy, 7050. For all of the alloys evaluated, the processing modifications resulted in significant improvements in mechanical properties. Analyses also resulted in an enhanced understanding of the correlation of processing, crystallographic texture, and mechanical properties.

  12. Development of Turbulence-Measuring Equipment

    Science.gov (United States)

    Kovasznay, Leslie S G

    1954-01-01

    Hot wire turbulence-measuring equipment has been developed to meet the more stringent requirements involved in the measurement of fluctuations in flow parameters at supersonic velocities. The higher mean speed necessitates the resolution of higher frequency components than at low speed, and the relatively low turbulence level present at supersonic speed makes necessary an improved noise level for the equipment. The equipment covers the frequency range from 2 to about 70,000 cycles per second. Constant-current operation is employed. Compensation for hot-wire lag is adjusted manually using square-wave testing to indicate proper setting. These and other features make the equipment adaptable to all-purpose turbulence work with improved utility and accuracy over that of older types of equipment. Sample measurements are given to demonstrate the performance.

  13. Developing performance measurement systems as enabling formalization : A longitudinal field study of a logistics department

    NARCIS (Netherlands)

    Wouters, Marc; Wilderom, Celeste P.M.

    2008-01-01

    This paper reports on a developmental approach to performance-measurement systems (PMS). In particular, we look at characteristics of a development process that result in the PMS being perceived by employees as enabling of their work, rather than as primarily a control device for use by senior

  14. Development of hydraulic analysis code for optimizing thermo-chemical IS process reactors

    International Nuclear Information System (INIS)

    Terada, Atsuhiko; Hino, Ryutaro; Hirayama, Toshio; Nakajima, Norihiro; Sugiyama, Hitoshi

    2007-01-01

    The Japan Atomic Energy Agency has been conducting studies on the thermochemical IS process for water-splitting hydrogen production. Based on the test results and know-how obtained through the bench-scale test, a pilot test plant with a hydrogen production performance of 30 Nm3/h is being designed conceptually as the next step of the IS process development. In the design of the IS pilot plant, it is important to make the chemical reactors compact with high performance from the viewpoint of plant cost reduction. A new hydraulic analysis code has been developed for optimizing the mixing performance of multi-phase flow involving chemical reactions, especially in the Bunsen reactor, which is characterized by a complex flow pattern with gas-liquid chemical interaction and flow instability. Preliminary analytical results obtained with the above-mentioned code, especially flow patterns induced by swirling flow, agreed well with those measured in water experiments, which showed a vortex breakdown pattern in a simplified Bunsen reactor. (author)

  15. The transformation factor: a measure for the productive behaviour of a manufacturing process

    NARCIS (Netherlands)

    Ron, de A.J.

    1993-01-01

    By using advanced manufacturing processes, production results should improve. Nevertheless, managers are hesitant to invest in such processes because of the financial risks and the absence of adequate technical and economic measures to support their decisions. Measures which contain

  16. Highlights from panel discussion on key issues for future developments in microwave processing

    International Nuclear Information System (INIS)

    Gac, F.D.; Iskander, M.F.

    1992-01-01

    This paper reports on highlights from a panel discussion on Key Issues for Future Development in Microwave Processing. Although the panelists represented a mix of individuals from government, academia, and industry, only one aspect of industry was represented, namely microwave system manufacturers. For further panel discussions, it is recommended that the materials manufacturing (i.e., microwave user) sector also be represented. Three important points emerged from the panel discussion. The first deals with the credibility and usability of information, be it dielectric property measurements, experimental procedures, or microwave processing results. Second, a considerable communication and education gap continues to exist between the materials community and microwave engineers. Finally, a more realistic approach should be taken in identifying where microwave processing makes sense

  17. Expected Influence of Ethics on Product Development Process

    Directory of Open Access Journals (Sweden)

    Stig Larsson

    2008-07-01

    Full Text Available Product development efficiency and effectiveness depend on a well-executed process. The actions of the individuals involved in the process are influenced by the ethical and moral orientations each individual has adopted, whether this choice is conscious or not. This paper describes different ethical choices and the expected effects they may have on the development process, exemplified by the product integration process for software products. The frameworks analyzed are utilitarianism, rights ethics, duty ethics, virtue ethics and ethical egoism. The expected effects on the goals of product integration may be debated; this is a result in itself, as it triggers discussions about ethical considerations and increases awareness of the influence of moral decisions. Our conclusion is that adherence to specific moral frameworks simplifies the alignment of actions with the practices described in product development models and standards, and thereby supports a more successful execution of product development projects. This conclusion is also confirmed through a comparison between the different directions and several codes of ethics for engineers issued by organizations such as IEEE, as these combine features from several of the discussed ethical directions.

  18. Process development and tooling design for intrinsic hybrid composites

    Science.gov (United States)

    Riemer, M.; Müller, R.; Drossel, W. G.; Landgrebe, D.

    2017-09-01

    Hybrid parts, which combine the advantages of different material classes, are moving into the focus of lightweight applications. This development is amplified by their high potential for usage in the field of crash relevant structures. By the current state of the art, hybrid parts are mainly made in separate, subsequent forming and joining processes. By using the concept of an intrinsic hybrid, the shaping of the part and the joining of the different materials are performed in a single process step for shortening the overall processing time and thereby the manufacturing costs. The investigated hybrid part is made from continuous fibre reinforced plastic (FRP), in which a metallic reinforcement structure is integrated. The connection between these layered components is realized by a combination of adhesive bonding and a geometrical form fit. The form fit elements are intrinsically generated during the forming process. This contribution regards the development of the forming process and the design of the forming tool for the single step production of a hybrid part. To this end a forming tool, which combines the thermo-forming and the metal forming process, is developed. The main challenge by designing the tool is the temperature management of the tool elements for the variothermal forming process. The process parameters are determined in basic tests and finite element (FE) simulation studies. On the basis of these investigations a control concept for the steering of the motion axes and the tool temperature is developed. Forming tests are carried out with the developed tool and the manufactured parts are analysed by computer assisted tomography (CT) scans.

  19. Development of Spectrometer Software for Electromagnetic Radiation Measurement and Analysis

    International Nuclear Information System (INIS)

    Mohd Idris Taib; Noor Ezati Shuib; Wan Saffiey Wan Abdullah

    2013-01-01

    This software was under development in LabVIEW for use with the StellarNet Spectrometer system. The StellarNet Spectrometer is supplied with the SpectraWiz operating software, which can measure spectral data for real-time spectroscopy. The LabVIEW software accesses real-time data through the SpectraWiz dynamic link library for hardware interfacing, acquiring the amplitude at every wavelength at periodic intervals. In addition to hardware interfacing, the user-interface capabilities of the software include plotting of spectral data in various modes, including scope, absorbance, transmission and irradiance modes. The software can be used for research and development in the application, utilization and safety of electromagnetic radiation, especially solar, laser and ultraviolet radiation. Off-line capabilities of this software are almost unlimited due to the availability of mathematical and signal processing functions in the LabVIEW add-on library. (author)
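
    The mode conversions mentioned above follow standard spectroscopy relations; a hedged Python sketch (not the LabVIEW/SpectraWiz code, values invented):

        import numpy as np

        def to_modes(raw_counts, dark, reference):
            """Convert raw counts to transmission and absorbance given dark and blank spectra."""
            signal = raw_counts - dark
            blank = reference - dark
            transmission = np.clip(signal / blank, 1e-6, None)   # T = I / I0
            absorbance = -np.log10(transmission)                 # A = -log10(T)
            return transmission, absorbance

        wavelengths = np.linspace(200, 1100, 2048)               # nm, a typical CCD axis
        dark = np.full_like(wavelengths, 100.0)
        reference = dark + 4000.0 * np.exp(-((wavelengths - 550.0) / 200.0) ** 2)
        sample = dark + 0.6 * (reference - dark)                 # sample transmits 60 %
        T, A = to_modes(sample, dark, reference)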

  20. Teaching Information Systems Development via Process Variants

    Science.gov (United States)

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  1. Pre- and post construction radon measurements in a new housing development

    International Nuclear Information System (INIS)

    Rydock, J.P.; Naess-Rolstad, A.; Brunsell, J.T.

    2001-01-01

    Results from pre- and post-construction radon measurements in a new housing development are presented. The houses were built in an area that had not been previously associated with elevated indoor radon concentrations. Exhalation measurements of gravel and stone from the site and soil gas measurements under several houses did not indicate an elevated radon potential. However, 4 of 21 finished houses (or 19%) exhibited annual average indoor radon concentrations over 200 Bq/m3 (5.4 pCi/l). The highest concentrations were observed in the first house built in 1 of the 6 houses built differently from the original designs, with the elements of a sub-floor ventilation system included for possible radon control if necessary. These results suggest that site investigations can be of limited value in determining where not to include radon protection measures in new housing. They also suggest that care must be taken to adequately inform everyone involved in the building process of the importance of maintaining a tight seal against the ground to prevent possible radon gas entry into a house. (author)

  2. Development of a Behavioral Performance Measure

    Directory of Open Access Journals (Sweden)

    Marcelo Cabus Klotzle

    2012-09-01

    Full Text Available Since the fifties, several measures have been developed to assess the performance of investments or choices involving uncertain outcomes. Many of these measures are based on Expected Utility Theory, but since the nineties a number of measures based on Non-Expected Utility Theory have been proposed. Among the non-expected utility theories, Prospect Theory, the foundation of Behavioral Finance, stands out. Based on this theory, this study proposes a new performance measure that embeds loss aversion along with the distortion of probabilities in the choice of alternatives. A hypothetical example is presented in which various performance measures, including the new one, are compared. The results showed that the ordering of the assets varied depending on the performance measure adopted. As expected, the new performance measure clearly captured the decision maker's probability distortion and loss aversion, i.e., the assets with the greatest negative deviations from the target were those with the worst performance.
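
    For illustration, a sketch of a prospect-theory-style score using the standard value and probability-weighting functions (Tversky-Kahneman parameter values; the paper's exact specification may differ, and the weighting here is applied per outcome rather than cumulatively):

        import numpy as np

        def value(x, alpha=0.88, beta=0.88, lam=2.25):
            """Concave for gains, convex and loss-averse (factor lam) for losses."""
            x = np.asarray(x, dtype=float)
            v = np.empty_like(x)
            gains = x >= 0
            v[gains] = x[gains] ** alpha
            v[~gains] = -lam * (-x[~gains]) ** beta
            return v

        def weight(p, gamma=0.61):
            """Inverse-S probability weighting: small probabilities are overweighted."""
            p = np.asarray(p, dtype=float)
            return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

        returns = np.array([0.10, 0.02, -0.05, -0.15])   # deviations from the target return
        probs = np.array([0.25, 0.25, 0.25, 0.25])
        score = float(np.sum(weight(probs) * value(returns)))   # behavioral performance score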

  3. Development of control and data processing system for CO2 laser interferometer

    Energy Technology Data Exchange (ETDEWEB)

    Chiba, Shinichi; Kawano, Yasunori; Tsuchiya, Katsuhiko; Inoue, Akira [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2001-11-01

    The CO2 laser interferometer diagnostic has been operating to measure the central electron density in JT-60U plasmas. We have developed a control and data processing system for the CO2 laser interferometer with flexible functions for data acquisition, data processing and data transfer in accordance with the sequence of JT-60U discharges. The system is mainly composed of two UNIX workstations and CAMAC clusters, and high reliability was obtained by distributing the data processing functions between the workstations. Consequently, the control and data processing system can routinely provide electron density data immediately after a JT-60U discharge. Real-time feedback control of the electron density in JT-60U also becomes available by using a reference density signal from the CO2 laser interferometer. (author)

  4. Development of a Draft Core Set of Domains for Measuring Shared Decision Making in Osteoarthritis

    DEFF Research Database (Denmark)

    Toupin-April, Karine; Barton, Jennifer; Fraenkel, Liana

    2015-01-01

    OBJECTIVE: Despite the importance of shared decision making for delivering patient-centered care in rheumatology, there is no consensus on how to measure its process and outcomes. The aim of this Outcome Measures in Rheumatology (OMERACT) working group is to determine the core set of domains for measuring shared decision making in intervention studies in adults with osteoarthritis (OA), from the perspectives of patients, health professionals, and researchers. METHODS: We followed the OMERACT Filter 2.0 method to develop a draft core domain set by (1) forming an OMERACT working group; (2) conducting a review of domains of shared decision making; and (3) obtaining opinions of all those involved using a modified nominal group process held at a session activity at the OMERACT 12 meeting. RESULTS: In all, 26 people from Europe, North America, and Australia, including 5 patient research partners…

  5. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    Science.gov (United States)

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. Process control and quality improvement. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeons, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that
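
    A minimal sketch of the kind of control chart involved, here a p-chart for the monthly proportion of epidural patients in severe pain (all counts invented):

        import numpy as np

        severe = np.array([9, 7, 11, 6, 8, 4, 5, 3])           # patients reporting severe pain
        audited = np.array([30, 28, 33, 31, 29, 30, 27, 32])   # patients reviewed each month

        p = severe / audited
        p_bar = severe.sum() / audited.sum()                   # centre line
        sigma = np.sqrt(p_bar * (1 - p_bar) / audited)         # depends on subgroup size
        ucl = p_bar + 3 * sigma                                # upper control limit
        lcl = np.clip(p_bar - 3 * sigma, 0, None)              # lower control limit
        special_cause = (p > ucl) | (p < lcl)                  # points suggesting special-cause variation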

  6. Students’ development in the learning process

    Directory of Open Access Journals (Sweden)

    Vladimir D. Shadrikov

    2012-01-01

    Full Text Available A system genetics approach has been employed to study students’ mental development. Ability development is considered in terms of mastering of intellectual operations. The study endeavors to identify the components of certain abilities consciously acquired by a student in the process of learning. The study was arranged in two directions: the teaching of students to master intellectual operations and use them in their work with training materials, and psychological testing of control and experimental student groups before and after training tests to diagnose the level of intellectual development. The study involved teachers and students of primary and secondary school.

  7. A ten-step process to develop case management plans.

    Science.gov (United States)

    Tahan, Hussein A

    2002-01-01

    The use of case management plans has contained cost and improved quality of care successfully. However, the process of developing these plans remains a great challenge for healthcare executives. In this article, the author presents an answer to this challenge by discussing a 10-step formal process that administrators of patient care services and case managers can adapt to their institutions. It also can be used by interdisciplinary team members as a practical guide to develop a specific case management plan. This process is applicable to any care setting (acute, ambulatory, long term, and home care), diagnosis, or procedure. It is particularly important for those organizations that currently do not have a deliberate and systematic process to develop case management plans and are struggling with how to improve the efficiency and productivity of interdisciplinary teams charged with developing case management plans.

  8. Temperature measurement: Development work on noise thermometry and improvement of conventional thermocouples for applications in nuclear process heat (PNP)

    International Nuclear Information System (INIS)

    Brixy, H.; Hecker, R.; Oehmen, J.; Barbonus, P.; Hans, R.

    1982-06-01

    The behaviour of NiCr-Ni sheathed thermocouples (sheath Inconel 600 or Incoloy 800, insulation MgO) was studied in a helium and carbon atmosphere at temperatures of 950-1150 deg. C. All the thermocouples used retained their functional performance. The insulation resistance tended towards a limit value which depends on the temperature and the quality of the thermocouple. Temperature measurements were subject to considerable uncertainty in the temperature range of 950-1150 deg. C; recalibrations at 950 deg. C showed errors of up to 6%. Measuring sensors were developed which consist of a sheathed double thermocouple with a noise resistor positioned between the two hot junctions. Using the noise thermometer it is possible to recalibrate the thermocouple at any time in situ. A helium system with a high-temperature experimental area was developed to test the thermocouples and the combined thermocouple-noise thermometer sensors under true experimental conditions
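
    The recalibration principle rests on the Johnson-Nyquist relation <V^2> = 4 k_B T R Δf; a minimal numerical sketch with invented resistor and bandwidth values:

        k_B = 1.380649e-23        # Boltzmann constant, J/K
        R = 100.0                 # noise resistor between the hot junctions, ohm (assumed)
        bandwidth = 50.0e3        # measurement bandwidth, Hz (assumed)

        # simulated mean-square noise voltage of the resistor at 950 deg. C (1223.15 K)
        mean_square_voltage = 4.0 * k_B * 1223.15 * R * bandwidth

        T = mean_square_voltage / (4.0 * k_B * R * bandwidth)   # temperature in kelvin
        print(T - 273.15)         # ~950 deg. C, independent of thermocouple drift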

  9. Process Consultation: Its Role in Organization Development.

    Science.gov (United States)

    Schein, Edgar H.

    This volume focuses on the process by which the consultant builds readiness for organizational development (OD) programs, actually conducts training, and works with the key individuals of an organization as part of an OD program. Part I describes in some detail the human processes in organizations--communication, functional roles of group members,…

  10. Development of a scintillator detector set with counter and data acquisition for flow measurements

    CERN Document Server

    Costa, F E D

    2002-01-01

    A portable counter with a data acquisition system for flow measurements was developed, using the pulse velocity technique. This technique consists in determining the transit time of a tracer mixed homogeneously into the liquid or gas flowing in the pipeline. The counter comprises: (a) two CsI(Tl) crystal solid-state detectors, coupled to Si PIN photodiodes, with sensitivity compatible with the injected radiotracer activities; (b) amplification units; (c) an analogue-to-digital interface, which processes and displays the counts of the two detectors separately and in real time, on the same temporal axis, via a computer screen; and (d) 30-m coaxial cables for signal transmission from each detector to the processing unit. Experiments were carried out to characterize the detectors and the associated electronics. The equipment proved suitable for flow measurements in an industrial plant under real conditions.
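
    The pulse-velocity evaluation can be illustrated by cross-correlating the two detector signals to find the transit time (sampling rate, detector spacing and pulse shape are invented here):

        import numpy as np

        fs = 100.0                                     # counter sampling rate, Hz
        t = np.arange(0.0, 30.0, 1.0 / fs)
        pulse = np.exp(-0.5 * ((t - 5.0) / 0.8) ** 2)  # tracer cloud passing detector 1
        delay_true = 4.2                               # true transit time, s
        rng = np.random.default_rng(3)
        upstream = pulse + 0.05 * rng.normal(size=t.size)
        downstream = np.interp(t - delay_true, t, pulse) + 0.05 * rng.normal(size=t.size)

        corr = np.correlate(downstream - downstream.mean(),
                            upstream - upstream.mean(), mode="full")
        transit = (np.argmax(corr) - (t.size - 1)) / fs   # estimated transit time, s
        velocity = 2.0 / transit                          # with 2 m between detectors, m/s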

  11. Automating Software Development Process using Fuzzy Logic

    NARCIS (Netherlands)

    Marcelloni, Francesco; Aksit, Mehmet; Damiani, Ernesto; Jain, Lakhmi C.; Madravio, Mauro

    2004-01-01

    In this chapter, we aim to highlight how fuzzy logic can be a valid expressive tool to manage the software development process. We characterize a software development method in terms of two major components: artifact types and methodological rules. Classes, attributes, operations, and inheritance

  12. Recent Developments in Abrasive Hybrid Manufacturing Processes

    Directory of Open Access Journals (Sweden)

    Ruszaj Adam

    2017-06-01

    Full Text Available The recent dynamic development of abrasive hybrid manufacturing processes results from the application of new, difficult-to-machine materials and from the improvement of the technological indicators of manufacturing processes already applied in practice. This tendency also occurs in abrasive machining processes, which are often supported by ultrasonic vibrations, electrochemical dissolution or electrical discharges. In the paper we present a review of new investigation results and new practical applications of Abrasive Electrodischarge (AEDM) and Abrasive Electrochemical (AECM) Machining.

  13. Development of anti-inflammatory drugs - the research and development process.

    Science.gov (United States)

    Knowles, Richard Graham

    2014-01-01

    The research and development process for novel drugs to treat inflammatory diseases is described, and several current issues and debates relevant to this are raised: the decline in productivity, attrition, challenges and trends in developing anti-inflammatory drugs, the poor clinical predictivity of experimental models of inflammatory diseases, heterogeneity within inflammatory diseases, 'improving on the Beatles' in treating inflammation, and the relationships between big pharma and biotechs. The pharmaceutical research and development community is responding to these challenges in multiple ways which it is hoped will lead to the discovery and development of a new generation of anti-inflammatory medicines. © 2013 Nordic Pharmacological Society. Published by John Wiley & Sons Ltd.

  14. Measurement of IgG-blocking antibodies: development and application of a radioimmunoassay

    International Nuclear Information System (INIS)

    Sobotka, A.K.; Valentine, M.D.; Ishizaka, K.; Lichtenstein, L.M.

    1976-01-01

    A radioimmunoassay for measuring blocking antibodies has been developed. We used the ragweed antigen E system to show that the same blocking antibodies (IgG) measured by inhibition of antigen-induced leukocyte histamine release were precipitated in the binding assay (r_s = 0.96, p < 0.001), thus validating a widely applicable technique for measuring blocking antibodies. Binding of phospholipase-A (Phos-A), the major allergen in honey bee venom, was also shown to correlate significantly with inhibition of histamine release. Hymenoptera (insect) hypersensitivity was used as a model to demonstrate application of the binding assay. Sera obtained from patients undergoing whole body extract therapy contained negligible amounts of specific blocking antibodies. Significantly higher blocking antibody titers to both whole honey bee venom and Phos-A were measured in sera drawn from patients immunized with whole venom. The use of the binding radioimmunoassay should facilitate management of allergic disease processes in which blocking antibodies are thought to be protective

  15. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    Science.gov (United States)

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposed a rapid process development method, which used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool for studying the chromatographic process of Ginkgo biloba L., as an example. The breakthrough curves were rapidly determined at-line by DART-MS. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly, showing a decreased adsorption capacity with increasing flow rate. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Automated acquisition and processing of data from measurements on aerodynamic models

    International Nuclear Information System (INIS)

    Mantlik, F.; Pilat, M.; Schmid, J.

    1981-01-01

    Hardware and software are described for processing data measured in model research of local hydrodynamic conditions of fluid flow through channels with a complex cross-sectional geometry, obtained using aerodynamic models of parts of fast reactor fuel assemblies of the HEM-1 and HEM-2 type. A system for automatic control of the experiments and acquisition of the measured data was proposed and is being implemented. Basic information is given on the programs for processing and storing the results using a GIER computer. A CAMAC system is primarily used as part of the hardware. (B.S.)

  17. Conception and development of an optical methodology applied to long-distance measurement of suspension bridges dynamic displacement

    International Nuclear Information System (INIS)

    Martins, L Lages; Ribeiro, A Silva; Rebordão, J M

    2013-01-01

    This paper describes the conception and development of an optical system applied to suspension bridge structural monitoring, aiming at real-time, long-distance measurement of dynamic three-dimensional displacement, namely in the central section of the main span. The main innovative issues related to this optical approach are described and a comparison with other optical and non-optical measurement systems is performed. Moreover, a computational simulator tool developed for the optical system design and for validation of the implemented image processing and calculation algorithms is also presented
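
    A hedged sketch of the basic scaling step such a system must perform (pinhole camera model; the distance, focal length and pixel pitch below are invented, and the authors' algorithms are certainly more elaborate):

        def pixel_to_metric(pixel_shift, distance_m, focal_length_mm, pixel_pitch_um):
            """Convert an image-plane shift in pixels to a target displacement in metres."""
            return pixel_shift * (pixel_pitch_um * 1e-6) * distance_m / (focal_length_mm * 1e-3)

        # e.g. a 3.2-pixel vertical shift of a target observed from ~500 m with a 400 mm lens
        print(pixel_to_metric(3.2, 500.0, 400.0, 5.5))   # bridge displacement in metres (~0.022)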

  18. Measuring personal recovery - psychometric properties of the Swedish Questionnaire about the Process of Recovery (QPR-Swe).

    Science.gov (United States)

    Argentzell, Elisabeth; Hultqvist, Jenny; Neil, Sandra; Eklund, Mona

    2017-10-01

    Personal recovery, defined as an individual process towards meaning, is an important target within mental health services. Measuring recovery hence requires reliable and valid measures. The Process of Recovery Questionnaire (QPR) was developed for that purpose. The aim was to develop a Swedish version of the QPR (QPR-Swe) and explore its psychometric properties in terms of factor structure, internal consistency, construct validity and sensitivity to change. A total of 226 participants entered the study. The factor structure was investigated by Principal Component Analysis and Scree plot. Construct validity was addressed in terms of convergent validity against indicators of self-mastery, self-esteem, quality of life and self-rated health. A one-factor solution of QPR-Swe received better support than a two-factor solution. Good internal consistency was indicated, α = 0.92, and construct validity was satisfactory. The QPR-Swe showed preliminary sensitivity to change. The QPR-Swe showed promising initial psychometric properties in terms of internal consistency, convergent validity and sensitivity to change. The QPR-Swe is recommended for use in research and clinical contexts to assess personal recovery among people with mental illness.
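
    For illustration, the two reported statistics can be computed as follows on synthetic item data (the number of items is assumed, and the data are simulated with a single underlying factor):

        import numpy as np

        rng = np.random.default_rng(7)
        latent = rng.normal(size=(226, 1))                    # one underlying recovery factor
        items = latent + 0.8 * rng.normal(size=(226, 22))     # 22 items x 226 respondents (assumed)

        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        alpha = k / (k - 1) * (1 - item_var / total_var)      # Cronbach's alpha

        eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]
        print(round(alpha, 2), eigvals[:3])                   # one dominant component expected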

  19. Infant Movement Motivation Questionnaire: development of a measure evaluating infant characteristics relating to motor development in the first year of life.

    Science.gov (United States)

    Doralp, Samantha; Bartlett, Doreen

    2014-08-01

    This paper highlights the development and testing of the Infant Movement Motivation Questionnaire (IMMQ), an instrument designed to evaluate qualities of infant characteristics that relate specifically to early motor development. The measurement development process included three phases: item generation, pilot testing and evaluation of acceptability and feasibility for parents and exploratory factor analysis. The resultant 27-item questionnaire is designed for completion by parents and contains four factors including Activity, Exploration, Motivation and Adaptability. Overall, the internal consistency of the IMMQ is 0.89 (Cronbach's alpha), with test-retest reliability measured at 0.92 (ICC, with 95% CI 0.83-0.96). Further work could be done to strengthen the individual factors; however it is adequate for use in its full form. The IMMQ can be used for clinical or research purposes, as well as an educational tool for parents. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. The European Society of Gastrointestinal Endoscopy Quality Improvement Initiative: developing performance measures.

    Science.gov (United States)

    Rutter, Matthew D; Senore, Carlo; Bisschops, Raf; Domagk, Dirk; Valori, Roland; Kaminski, Michal F; Spada, Cristiano; Bretthauer, Michael; Bennett, Cathy; Bellisario, Cristina; Minozzi, Silvia; Hassan, Cesare; Rees, Colin; Dinis-Ribeiro, Mário; Hucl, Tomas; Ponchon, Thierry; Aabakken, Lars; Fockens, Paul

    2016-02-01

    The European Society of Gastrointestinal Endoscopy (ESGE) and United European Gastroenterology (UEG) have a vision to create a thriving community of endoscopy services across Europe, collaborating with each other to provide high quality, safe, accurate, patient-centered and accessible endoscopic care. Whilst the boundaries of what can be achieved by advanced endoscopy are continually expanding, we believe that one of the most fundamental steps to achieving our goal is to raise the quality of everyday endoscopy. The development of robust, consensus- and evidence-based key performance measures is the first step in this vision. ESGE and UEG have identified quality of endoscopy as a major priority. This paper explains the rationale behind the ESGE Quality Improvement Initiative and describes the processes that were followed. We recommend that all units develop mechanisms for audit and feedback of endoscopist and service performance using the ESGE performance measures that will be published in future issues of this journal over the next year. We urge all endoscopists and endoscopy services to prioritize quality and to ensure that these performance measures are implemented and monitored at a local level, so that we can provide the highest possible care for our patients.