WorldWideScience

Sample records for conditions measurement methodology

  1. Methodology for measurement of diesel particle size distributions from a city bus working in real traffic conditions

    International Nuclear Information System (INIS)

    Armas, O; Gómez, A; Mata, C

    2011-01-01

The study of particulate matter (PM) and nitrogen oxides emissions of diesel engines is nowadays a necessary step towards pollutant emission reduction. For a complete evaluation of PM emissions and their size characterization, one of the most challenging goals is to adapt the available techniques and the data acquisition procedures to the measurement and to propose a methodology for the interpretation of instantaneous particle size distributions (PSD) of combustion-derived particles produced by a vehicle during real driving conditions. In this work, PSD from the exhaust gas of a city bus operated in real driving conditions with passengers have been measured. For the study, the bus was equipped with a rotating disk diluter coupled to an air supply thermal conditioner (with an evaporating tube), the latter being connected to a TSI Engine Exhaust Particle Sizer spectrometer. The main objective of this work has been to propose an alternative procedure for evaluating the influence of several transient sequences on PSD emitted by a city bus used in real driving conditions with passengers. The transitions studied were those derived from the combination of four possible sequences or categories during real driving conditions: idle, acceleration, deceleration with fuel consumption and deceleration without fuel consumption. The analysis methodology used in this work proved to be a useful tool for a better understanding of the phenomena related to the determination of PSD emitted by a city bus during real driving conditions with passengers.
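The four driving categories in the abstract can be illustrated with a minimal classifier. This is a hypothetical sketch, not the authors' procedure: the signal names, thresholds and sampling are all assumptions.

```python
# Hypothetical sketch: classify one time sample of a bus trace into the four
# driving categories named above: idle, acceleration, deceleration with fuel
# consumption, deceleration without fuel consumption (fuel cut-off).
def classify_sequence(speed_kmh, accel_ms2, fuel_rate, idle_speed=1.0):
    """Return a category label for one sample (thresholds are assumptions)."""
    if speed_kmh < idle_speed:
        return "idle"
    if accel_ms2 > 0:
        return "acceleration"
    # decelerating: split on whether fuel is still being injected
    return "deceleration_fuel" if fuel_rate > 0 else "deceleration_cutoff"

trace = [
    (0.5, 0.0, 0.2),    # stationary, engine idling
    (12.0, 1.1, 1.5),   # pulling away from a stop
    (30.0, -0.8, 0.9),  # braking gently, still injecting fuel
    (25.0, -1.5, 0.0),  # overrun fuel cut-off
]
labels = [classify_sequence(v, a, f) for v, a, f in trace]
print(labels)
```

Segmenting a measured speed/fuel trace this way yields the transient sequences whose PSD can then be compared.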

  2. Methodology for measurement of diesel particle size distributions from a city bus working in real traffic conditions

    Science.gov (United States)

    Armas, O.; Gómez, A.; Mata, C.

    2011-10-01

    The study of particulate matter (PM) and nitrogen oxides emissions of diesel engines is nowadays a necessary step towards pollutant emission reduction. For a complete evaluation of PM emissions and its size characterization, one of the most challenging goals is to adapt the available techniques and the data acquisition procedures to the measurement and to propose a methodology for the interpretation of instantaneous particle size distributions (PSD) of combustion-derived particles produced by a vehicle during real driving conditions. In this work, PSD from the exhaust gas of a city bus operated in real driving conditions with passengers have been measured. For the study, the bus was equipped with a rotating disk diluter coupled to an air supply thermal conditioner (with an evaporating tube), the latter being connected to a TSI Engine Exhaust Particle Sizer spectrometer. The main objective of this work has been to propose an alternative procedure for evaluating the influence of several transient sequences on PSD emitted by a city bus used in real driving conditions with passengers. The transitions studied were those derived from the combination of four possible sequences or categories during real driving conditions: idle, acceleration, deceleration with fuel consumption and deceleration without fuel consumption. The analysis methodology used in this work proved to be a useful tool for a better understanding of the phenomena related to the determination of PSD emitted by a city bus during real driving conditions with passengers.

  3. The impact of methodology in innovation measurement

    Energy Technology Data Exchange (ETDEWEB)

    Wilhelmsen, L.; Bugge, M.; Solberg, E.

    2016-07-01

Innovation surveys and rankings such as the Community Innovation Survey (CIS) and Innovation Union Scoreboard (IUS) have developed into influential diagnostic tools that are often used to categorize countries according to their innovation performance and to legitimise innovation policies. Although a number of ongoing processes are seeking to improve existing frameworks for measuring innovation, there are large methodological differences across countries in the way innovation is measured. This causes great uncertainty regarding a) the coherence between data from innovation surveys, b) the actual innovativeness of the economy, and c) the validity of research based on innovation data. Against this background we explore empirically how different survey methods for measuring innovation affect reported innovation performance. The analysis is based on a statistical exercise comparing the results from three different methodological versions of the same survey for measuring innovation in the business enterprise sector in Norway. We find striking differences in reported innovation performance depending on how the surveys are carried out methodologically. The paper concludes that reported innovation performance is highly sensitive to and strongly conditioned by methodological context. This calls for increased caution and awareness around data collection and research based on innovation data, not least in terms of aggregation of data and cross-country comparison. (Author)

  4. Site-conditions map for Portugal based on VS measurements: methodology and final model

    Science.gov (United States)

Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, Carlos

    2017-04-01

In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (VS) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies involved in characterizing the Vs structure at the sites in the database include seismic refraction, multichannel analysis of surface waves and refraction microtremor. Invasive measurements were performed in selected locations in order to compare the Vs profiles obtained from invasive and non-invasive techniques. In general there was good agreement in the subsurface Vs structure obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50.000 and 1:500.000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is based on a three-step process that includes defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether or not the differences in the distributions of Vs30 are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling and therefore a declustering algorithm was applied. The final model includes three geological units: 1) Igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations, and 3) Holocene formations.
The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and
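The unit-merging step described above can be sketched with synthetic data. This is a hedged illustration, not the authors' actual test: it uses invented Vs30 samples and a simple permutation test on the difference of medians to decide whether two preliminary geological units are statistically distinguishable.

```python
import numpy as np

# Illustrative sketch (assumed data and test): compare the Vs30 distributions
# of two preliminary geological units; if their difference of medians is not
# significant, the units could be merged into one site-condition class.
rng = np.random.default_rng(0)
vs30_rock = rng.normal(760, 120, 40)  # synthetic "igneous/metamorphic" unit
vs30_holo = rng.normal(250, 60, 35)   # synthetic "Holocene" unit

def perm_test_median(a, b, n_perm=2000, rng=rng):
    """p-value of the observed |median difference| under random relabelling."""
    observed = abs(np.median(a) - np.median(b))
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        d = abs(np.median(pooled[:len(a)]) - np.median(pooled[len(a):]))
        if d >= observed:
            count += 1
    return count / n_perm

p = perm_test_median(vs30_rock, vs30_holo)
merge_units = p > 0.05  # not significantly different -> candidate for merging
print(p, merge_units)
```

With these clearly separated synthetic samples the test rejects merging; in the paper's workflow units that fail to separate would be combined.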

  5. A novelty detection diagnostic methodology for gearboxes operating under fluctuating operating conditions using probabilistic techniques

    Science.gov (United States)

    Schmidt, S.; Heyns, P. S.; de Villiers, J. P.

    2018-02-01

In this paper, a fault diagnostic methodology is developed which is able to detect, locate and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox, are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and, based on the operating condition, to detect changes in the machine condition. Information from optimised machine and operating condition hidden Markov models is statistically combined to generate a discrepancy signal which is post-processed to infer the condition of the gearbox. The discrepancy signal is processed and combined with statistical methods for automatic fault detection and localisation and to perform fault trending over time. The proposed methodology is validated on experimental data, and a tacholess order tracking methodology is used to enhance the cost-effectiveness of the diagnostic methodology.
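The discrepancy-signal idea can be shown with a deliberately simplified stand-in: instead of the paper's hidden Markov models, a single Gaussian model of healthy-condition features, with the negative log-likelihood of new data as the discrepancy. All data and thresholds are invented.

```python
import numpy as np

# Simplified stand-in for the HMM-based approach: fit a Gaussian model to a
# vibration feature from the healthy gearbox, use negative log-likelihood of
# new samples as a discrepancy signal, and flag samples whose discrepancy
# exceeds a threshold set from the healthy data.
rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, 500)  # healthy-condition feature (synthetic)
mu, sigma = healthy.mean(), healthy.std()

def discrepancy(x):
    # negative log-likelihood under the healthy Gaussian model
    return 0.5 * np.log(2 * np.pi * sigma**2) + (x - mu) ** 2 / (2 * sigma**2)

threshold = np.percentile(discrepancy(healthy), 99)  # 1 % healthy false alarms
faulty = rng.normal(3.5, 1.0, 200)  # shifted feature: simulated gear damage
alarm_rate = np.mean(discrepancy(faulty) > threshold)
print(alarm_rate)
```

Trending the discrepancy over time, as the paper does, then reduces to monitoring how often (and by how much) this threshold is exceeded.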

  6. Methodology of external exposure calculation for reuse of conditional released materials from decommissioning - 59138

    International Nuclear Information System (INIS)

    Ondra, Frantisek; Vasko, Marek; Necas, Vladimir

    2012-01-01

The article presents a methodology of external exposure calculation for the reuse of conditionally released materials from decommissioning, using the VISIPLAN 3D ALARA planning tool. Production of rails has been used as an example application of the proposed methodology within the CONRELMAT project. The article presents a methodology for determination of the radiological, material, organizational and other conditions for the reuse of conditionally released materials, to ensure that worker and public exposure does not breach the exposure limits during the scenario's life cycle (preparation, construction and operation of the scenario). The methodology comprises a proposal of the following conditions with respect to worker and public exposure: - limit radionuclide concentrations of conditionally released materials for specific scenarios and nuclide vectors, - specific deployment of conditionally released materials and, where needed, shielding materials, workers and public during the scenario's life cycle, - organizational measures concerning the time workers or the public spend in the vicinity of conditionally released materials for the individual scenarios and nuclide vectors. The above mentioned steps of the proposed methodology have been applied within the CONRELMAT project. Exposure evaluation of workers for rail production is introduced in the article as an example of this application. Exposure calculation using the VISIPLAN 3D ALARA planning tool was done for several models. The most exposed profession for the scenario was identified. On the basis of this result, an increase of the radionuclide concentration in conditionally released material by more than a factor of two, to 681 Bq/kg, was proposed with no additional safety or organizational measures applied. After application of the proposed safety and organizational measures (additional shielding, geometry changes and limitation of work duration) it is possible to increase the radionuclide concentration in conditionally released material more than ten times, to 3092 Bq/kg.
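The scaling logic behind such concentration limits can be sketched in a few lines. This is a hedged illustration with invented numbers, not the CONRELMAT calculation: external dose rate scales linearly with activity concentration, so the admissible concentration follows from a dose constraint, a reference dose-rate calculation, and the exposure time of the most exposed profession.

```python
# Hedged sketch (all values invented): external dose rate from conditionally
# released material scales linearly with its activity concentration, so the
# maximum admissible concentration follows from the annual dose constraint.
def max_concentration(ref_conc_bq_kg, dose_at_ref_usv_h, hours_per_year,
                      annual_limit_usv):
    annual_dose_at_ref = dose_at_ref_usv_h * hours_per_year
    return ref_conc_bq_kg * annual_limit_usv / annual_dose_at_ref

# e.g. a reference calculation at 300 Bq/kg gives 0.05 uSv/h for a worker
# spending 1700 h/year near the material, against a 300 uSv/year constraint
c_max = max_concentration(300.0, 0.05, 1700.0, 300.0)
print(round(c_max, 1))
```

Shielding or shorter work durations lower the dose rate per unit concentration and therefore raise the admissible concentration, which is the effect the abstract reports.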

  7. The Piston Compressor: The Methodology of the Real-Time Condition Monitoring

    International Nuclear Information System (INIS)

    Naumenko, A P; Kostyukov, V N

    2012-01-01

The methodology of diagnostic signal processing and a function chart of the monitoring system are considered in the article. The methodology of monitoring and diagnosing is based on measurement of parameters of indirect processes (vibroacoustic oscillations), so that no more than five sensors are installed on the cylinder; measurement of direct structural and thermodynamic parameters is envisioned as well. The structure and operating principle of the decision-making expert system are given. The algorithm of the automatic expert system includes calculation of diagnostic attribute values based on their normative values, formation of sets of diagnostic attributes that correspond to individual malfunction classes, and formation of expert-system messages. The scheme of a real-time condition monitoring system for piston compressors is considered. The system has a series-parallel structure of information-measuring equipment, which allows measuring the vibroacoustic signal for condition monitoring of reciprocating compressors and of their operating modes. Besides, the system can measure parameters of other physical processes; for example, it can measure and use for monitoring and diagnosis the in-cylinder pressure (the indicator diagram), the inlet and discharge pressure of each cylinder, inlet and delivery gas temperatures, valve temperatures, rod position, leakage through the compression packing, and others.

  8. A methodology to measure the degree of managerial innovation

    Directory of Open Access Journals (Sweden)

    Mustafa Batuhan Ayhan

    2014-01-01

Purpose: The main objective of this study is to introduce the concept of managerial innovation and to propose a quantitative methodology to measure the degree of managerial innovation capability by analyzing the evolution of the techniques used for management functions. Design/methodology/approach: The methodology mainly focuses on the different techniques used for each management function, namely planning, organizing, leading, controlling and coordinating. These functions are studied and the different techniques used for them are listed. Since the techniques used for these management functions evolve over time due to technological and social changes, a methodology is required to measure the degree of managerial innovation capability. This competency is measured through an analysis performed to point out which techniques are used for each of these functions. Findings: To check the validity and applicability of this methodology, it is implemented in a manufacturing company. Based on the results of the implementation, enhancements are suggested to the company for each function to survive in the changing managerial conditions. Research limitations/implications: The primary limitation of this study is the implementation area. Although the study is implemented in just a single manufacturing company, the same methodology can be applied to measure the managerial innovation capabilities of other manufacturing companies. Moreover, the model is ready to be adapted to different sectors although it is mainly prepared for the manufacturing sector. Originality/value: Although innovation management is widely studied, managerial innovation is a new concept, introduced to measure the capability to respond to the changes that occur in managerial functions. In brief, this methodology aims to be a pioneer in the field of managerial innovation regarding the evolution of management functions. Therefore it is expected to lead more studies to inspect the progress of

  9. Methodology for Multileaf Collimator Quality Assurance in clinical conditions

    International Nuclear Information System (INIS)

    Diaz M, R. M.; Rodriguez Z, M.; Juarez D, A.; Romero R, R.

    2013-01-01

Multileaf Collimators (MLCs) have become an important technological advance as part of clinical linear accelerators (linacs) for radiotherapy. Treatment planning and delivery were substantially modified after the introduction of these devices. However, it was necessary to develop Quality Assurance (QA) methodologies related to the performance of these developments. The most common methods for QA of MLCs are carried out in basic conditions that hardly cover all possible difficulties in clinical practice. Diaz et al. developed a methodology, based upon two-dimensional arrays of volumetric detectors, that can be extended to more demanding situations. In this work, the Auril methodology of Diaz et al. was implemented for irradiation with the linac gantry in horizontal position. A mathematical procedure was developed to ease the dosimetric centering of the device with the Auril centering tool. System calibration was made as in the typical Auril methodology. Patterns with leaf misplacements in known positions were irradiated. The method allowed the detection of leaf misplacements with a minimum number of false positives. We concluded that the Auril methodology can be applied in clinical conditions. (Author)
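The core of any leaf-misplacement check can be reduced to a comparison against tolerance. This sketch is only illustrative: the actual Auril analysis works on 2D dose distributions, and the positions and tolerance below are invented.

```python
# Illustrative sketch (invented values): compare leaf-end positions recovered
# from a measured dose pattern against the planned pattern and flag leaves
# deviating by more than a tolerance.
def misplaced_leaves(planned_mm, measured_mm, tol_mm=1.0):
    return [i for i, (p, m) in enumerate(zip(planned_mm, measured_mm))
            if abs(m - p) > tol_mm]

planned = [0.0, 0.0, 0.0, 0.0, 0.0]
measured = [0.2, -0.4, 1.6, 0.1, -1.2]  # leaf 2 off by 1.6 mm, leaf 4 by -1.2 mm
print(misplaced_leaves(planned, measured))
```

Keeping the tolerance tight while avoiding false positives is exactly the trade-off the abstract evaluates with its known-misplacement patterns.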

  10. Development of the methodology of exhaust emissions measurement under RDE (Real Driving Emissions) conditions for non-road mobile machinery (NRMM) vehicles

    Science.gov (United States)

    Merkisz, J.; Lijewski, P.; Fuc, P.; Siedlecki, M.; Ziolkowski, A.

    2016-09-01

    The paper analyzes the exhaust emissions from farm vehicles based on research performed under field conditions (RDE) according to the NTE procedure. This analysis has shown that it is hard to meet the NTE requirements under field conditions (engine operation in the NTE zone for at least 30 seconds). Due to a very high variability of the engine conditions, the share of a valid number of NTE windows in the field test is small throughout the entire test. For this reason, a modification of the measurement and exhaust emissions calculation methodology has been proposed for farm vehicles of the NRMM group. A test has been developed composed of the following phases: trip to the operation site (paved roads) and field operations (including u-turns and maneuvering). The range of the operation time share in individual test phases has been determined. A change in the method of calculating the real exhaust emissions has also been implemented in relation to the NTE procedure.
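The NTE-validity criterion cited above (continuous operation in the NTE zone for at least 30 seconds) is easy to express in code. A minimal sketch, assuming 1 Hz sampling and a precomputed per-sample in-zone flag:

```python
# Sketch of the NTE window validity check: a window counts only if the engine
# stays continuously inside the NTE zone for at least 30 s (1 Hz samples).
def valid_nte_windows(in_zone, min_len_s=30):
    windows, start = [], None
    for i, flag in enumerate(in_zone):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len_s:
                windows.append((start, i))
            start = None
    if start is not None and len(in_zone) - start >= min_len_s:
        windows.append((start, len(in_zone)))
    return windows

# 40 s in zone, 10 s out, 20 s in zone (too short), 5 s out, 35 s in zone
trace = [True] * 40 + [False] * 10 + [True] * 20 + [False] * 5 + [True] * 35
print(valid_nte_windows(trace))
```

On a highly transient field trace, few runs survive the 30 s requirement, which is the small share of valid windows the abstract reports.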

  11. Structural health monitoring methodology for aircraft condition-based maintenance

    Science.gov (United States)

    Saniger, Jordi; Reithler, Livier; Guedra-Degeorges, Didier; Takeda, Nobuo; Dupuis, Jean Pierre

    2001-06-01

Reducing maintenance costs while keeping a constant level of safety is a major issue for Air Forces and airlines. The long term perspective is to implement condition based maintenance to guarantee a constant safety level while decreasing maintenance costs. For this purpose, the development of a generalized Structural Health Monitoring System (SHMS) is needed. The objective of such a system is to localize the damages and to assess their severity, with enough accuracy to allow low cost corrective actions. The present paper describes a SHMS based on acoustic emission technology. This choice was driven by its reliability and wide use in the aerospace industry. The described SHMS uses a new learning methodology which relies on the generation of artificial acoustic emission events on the structure and an acoustic emission sensor network. The calibrated acoustic emission events picked up by the sensors constitute the knowledge set that the system relies on. With this methodology, the anisotropy of composite structures is taken into account, thus avoiding the major cause of errors of classical localization methods. Moreover, it is adaptive to different structures as it does not rely on any particular model but on measured data. The acquired data is processed and the event's location and corrected amplitude are computed. The methodology has been demonstrated, and experimental tests on elementary samples showed an accuracy of 1 cm.
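The learning idea described above can be sketched as nearest-neighbour matching against the knowledge set: each artificial event contributes its known position plus its arrival-time pattern at the sensors, and a new event is located at the calibration point with the most similar pattern, so no wave-speed model of the anisotropic composite is needed. All numbers below are illustrative.

```python
import math

# Sketch of learning-based localization: knowledge set of (position,
# arrival-delay pattern at 3 sensors) pairs from artificial events; a new
# event is assigned the position of the nearest calibration pattern.
knowledge_set = [
    ((0.0, 0.0), (0.00, 0.12, 0.20)),  # (position in m, delays in ms)
    ((0.1, 0.0), (0.05, 0.08, 0.18)),
    ((0.0, 0.1), (0.04, 0.14, 0.12)),
]

def locate(pattern):
    """Return the calibration position whose delay pattern is closest."""
    pos, _ = min(knowledge_set,
                 key=lambda kp: math.dist(kp[1], pattern))
    return pos

print(locate((0.05, 0.09, 0.17)))
```

A denser grid of artificial events directly improves resolution, which is consistent with the ~1 cm accuracy quoted for elementary samples.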

  12. Performance Evaluation and Measurement of the Organization in Strategic Analysis and Control: Methodological Aspects

    OpenAIRE

    Živan Ristić; Neđo Balaban

    2006-01-01

Information acquired by measurement and evaluation is a necessary condition for good decision-making in strategic management. This work deals with: (a) methodological aspects of evaluation (kinds of evaluation, metaevaluation) and measurement (the supposition of isomorphism in measurement, kinds and levels of measurement, errors in measurement and the basic characteristics of measurement), and (b) evaluation and measurement of the potential and accomplishments of the organization in the Kaplan-Norton perspect...

  13. Methodology and boundary conditions applied to the analysis on internal flooding for Kozloduy NPP units 5 and 6

    International Nuclear Information System (INIS)

    Demireva, E.; Goranov, S.; Horstmann, R.

    2004-01-01

Within the Modernization Program of Units 5 and 6 of Kozloduy NPP, a comprehensive analysis of internal flooding has been carried out for the reactor building outside the containment and for the turbine hall by FRAMATOME ANP and ENPRO Consult. The objective of this presentation is to provide information on the applied methodology and boundary conditions. A separate report called 'Methodology and boundary conditions' has been prepared to provide the foundation for the study. The methodology report provides definitions and guidance for the following topics: scope of the study; safety objectives; basic assumptions and postulates (plant conditions, grace periods for manual actions, single failure postulate, etc.); sources of flooding (postulated piping leaks and ruptures, malfunctions and personnel error); main activities of the flooding analysis; study conclusions and suggestions of remedial measures. (authors)

  14. Methodology for building confidence measures

    Science.gov (United States)

    Bramson, Aaron L.

    2004-04-01

This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
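Step (i) above can be illustrated with one simple combination rule. This is a hedged sketch, not the paper's actual propagation scheme: if independent sources asserting the same element are each reliable with probability p_i, the element is supported unless every source is wrong, i.e. P = 1 - prod(1 - p_i).

```python
from functools import reduce

# Sketch of reliability combination for corroborating independent sources:
# the combined confidence is the probability that not all sources are wrong.
def combined_confidence(reliabilities):
    return 1 - reduce(lambda acc, p: acc * (1 - p), reliabilities, 1.0)

# three sources of moderate reliability corroborating one element
print(round(combined_confidence([0.6, 0.7, 0.5]), 3))
```

Note how the combined value (0.94) exceeds any single source's reliability, which is the reinforcement effect the methodology propagates through linked elements.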

  15. Methodology for reliability based condition assessment

    International Nuclear Information System (INIS)

    Mori, Y.; Ellingwood, B.

    1993-08-01

Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period.
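The Monte Carlo idea behind time-dependent reliability can be sketched compactly. All distributions and the degradation law below are invented for illustration: initial strength is random, degrades over time, and each year brings an independent extreme load; a structure fails if the load ever exceeds the degraded strength.

```python
import numpy as np

# Illustrative Monte Carlo sketch of time-dependent reliability (all numbers
# invented): strength R(t) = R0 * g(t) with linear degradation, annual extreme
# loads from a Gumbel distribution, failure if load exceeds strength any year.
rng = np.random.default_rng(42)
n_sim, years = 20000, 40
R0 = rng.normal(100.0, 10.0, n_sim)                   # initial strength
degradation = 1.0 - 0.005 * np.arange(1, years + 1)   # g(t): 0.5 %/year loss
loads = rng.gumbel(50.0, 8.0, (n_sim, years))         # annual extreme loads

strength = R0[:, None] * degradation[None, :]
failed = (loads > strength).any(axis=1)               # ever exceeded?
pf = failed.mean()                                    # service-life P(failure)
print(pf)
```

Repeating the run with a modified degradation law or load model shows the sensitivity to those modelling choices that the abstract emphasizes.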

  16. Measurements of Gluconeogenesis and Glycogenolysis: A Methodological Review.

    Science.gov (United States)

    Chung, Stephanie T; Chacko, Shaji K; Sunehag, Agneta L; Haymond, Morey W

    2015-12-01

Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to determine gluconeogenesis is by measuring the incorporation of deuterium from the body water pool into newly formed glucose. However, several techniques using radioactive and stable-labeled isotopes have been used to quantitate the contribution and regulation of gluconeogenesis in humans. Each method has its advantages, methodological assumptions, and set of propagated errors. In this review, we examine the strengths and weaknesses of the most commonly used stable isotope methods to measure gluconeogenesis in vivo. We discuss the advantages and limitations of each method and summarize the applicability of these measurements in understanding normal and pathophysiological conditions. © 2015 by the American Diabetes Association. Readers may use this article as long as the work is properly cited, the use is educational and not for profit, and the work is not altered.

  17. Methodology for determining time-dependent mechanical properties of tuff subjected to near-field repository conditions

    International Nuclear Information System (INIS)

    Blacic, J.D.; Andersen, R.

    1983-01-01

We have established a methodology to determine the time dependence of strength and transport properties of tuff under conditions appropriate to a nuclear waste repository. Exploratory tests to determine the approximate magnitudes of thermomechanical property changes are nearly complete. In this report we describe the capabilities of an apparatus designed to precisely measure the time-dependent deformation and permeability of tuff at simulated repository conditions. Preliminary tests with this new apparatus indicate that microclastic creep failure of tuff occurs over a narrow strain range with little precursory tertiary creep behavior. In one test, deformation under conditions of slowly decreasing effective pressure resulted in failure, whereas some strain indicators showed a decreasing rate of strain.

  18. Methodology for determining time-dependent mechanical properties of tuff subjected to near-field repository conditions

    Energy Technology Data Exchange (ETDEWEB)

    Blacic, J.D.; Andersen, R.

    1983-01-01

We have established a methodology to determine the time dependence of strength and transport properties of tuff under conditions appropriate to a nuclear waste repository. Exploratory tests to determine the approximate magnitudes of thermomechanical property changes are nearly complete. In this report we describe the capabilities of an apparatus designed to precisely measure the time-dependent deformation and permeability of tuff at simulated repository conditions. Preliminary tests with this new apparatus indicate that microclastic creep failure of tuff occurs over a narrow strain range with little precursory tertiary creep behavior. In one test, deformation under conditions of slowly decreasing effective pressure resulted in failure, whereas some strain indicators showed a decreasing rate of strain.

  19. A methodology to measure the degree of managerial innovation

    OpenAIRE

    Ayhan, Mustafa Batuhan; Oztemel, Ercan

    2014-01-01

Purpose: The main objective of this study is to introduce the concept of managerial innovation and to propose a quantitative methodology to measure the degree of managerial innovation capability by analyzing the evolution of the techniques used for management functions. Design/methodology/approach: The methodology mainly focuses on the different techniques used for each management function, namely planning, organizing, leading, controlling and coordinating. These functions are studied and the...

  20. Development and Attestation of Gamma-Ray Measurement Methodologies for use by Rostekhnadzor Inspectors in the Russian Federation

    International Nuclear Information System (INIS)

    Jeff Sanders

    2006-01-01

Development and attestation of gamma-ray non-destructive assay measurement methodologies for use by inspectors of the Russian Federal Service for Environmental, Technological, and Nuclear Oversight (Rostekhnadzor, formerly Gosatomnadzor or GAN), as well as for use by Russian nuclear facilities, has been completed. Specifically, a methodology utilizing the gamma-ray multi-group analysis (MGA) method for determining plutonium isotopic composition has been developed, while existing methodologies for determining uranium enrichment and isotopic composition have been revised to make them more appropriate to the material types and conditions present in nuclear facilities in the Russian Federation. This paper will discuss the development and revision of these methodologies, the metrological characteristics of the final methodologies, as well as the limitations and concerns specific to the utilization of these analysis methods in the Russian Federation.

  1. Methodological aspects of EEG and Body dynamics measurements during motion.

    Directory of Open Access Journals (Sweden)

Pedro Reis

    2014-03-01

EEG involves recording, analysis, and interpretation of voltages recorded on the human scalp originating from brain grey matter. EEG is one of the favorite methods to study and understand the processes that underlie behavior. This is because EEG is relatively cheap, easy to wear, lightweight and has high temporal resolution. In terms of behavior, this encompasses actions, such as movements, that are performed in response to the environment. However, there are methodological difficulties when recording EEG during movement, such as movement artifacts. Thus, most studies of the human brain have examined activations during static conditions. This article attempts to compile and describe relevant methodological solutions that emerged in order to measure body and brain dynamics during motion. These descriptions cover suggestions of how to avoid and reduce motion artifacts, hardware, software and techniques for synchronously recording EEG, EMG, kinematics, kinetics and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps and methods for determination of real/custom electrode positions. We conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks.

  2. Damage detection methodology under variable load conditions based on strain field pattern recognition using FBGs, nonlinear principal component analysis, and clustering techniques

    Science.gov (United States)

    Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham

    2018-01-01

Structural health monitoring consists of using sensors integrated within structures together with algorithms to perform load monitoring, damage detection, damage location, damage size and severity, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions different from damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with Q and nonlinear-T² damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously submitted to variations in its pitch angle. The results demonstrated the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS and detecting six different damages induced in a cumulative way.
The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28% for a
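    The PCA-residual detection scheme summarised above can be illustrated with a minimal sketch: build a principal-component model of baseline strain snapshots and flag samples whose residual (Q statistic) exceeds a baseline-derived threshold. All data here are synthetic, and the two-component model, 32-sensor layout, anomaly location and 99th-percentile threshold are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic baseline: 200 strain snapshots from 32 sensors (hypothetical
# layout), driven by two latent load patterns plus small sensor noise.
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 32))
baseline = latent @ mixing + 0.01 * rng.normal(size=(200, 32))

# PCA model of the pristine condition (mean-centred, 2 components kept).
mean = baseline.mean(axis=0)
_, _, Vt = np.linalg.svd(baseline - mean, full_matrices=False)
P = Vt[:2].T

def q_statistic(x):
    """Squared norm of the residual outside the retained PCA subspace."""
    v = x - mean
    r = v - P @ (P.T @ v)
    return float(r @ r)

# Detection threshold from the baseline Q distribution (99th percentile).
threshold = np.percentile([q_statistic(x) for x in baseline], 99)

# A "damaged" snapshot: same load pattern plus a localized strain anomaly.
damaged = latent[0] @ mixing
damaged[5] += 0.5
print(q_statistic(damaged) > threshold)
```

The Q statistic flags the damaged snapshot because the localized anomaly does not lie in the subspace spanned by the load patterns, which is the intuition behind isolating load-induced from damage-induced strain changes.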

  3. Methodology for interpretation of fissile mass flow measurements

    International Nuclear Information System (INIS)

    March-Leuba, J.; Mattingly, J.K.; Mullens, J.A.

    1997-01-01

    This paper describes a non-intrusive measurement technique to monitor the mass flow rate of fissile material in gaseous or liquid streams. This fissile mass flow monitoring system determines the fissile mass flow rate by relying on two independent measurements: (1) a time delay along a given length of pipe, which is inversely proportional to the fissile material flow velocity, and (2) an amplitude measurement, which is proportional to the fissile concentration (e.g., grams of ²³⁵U per unit length of pipe). The development of this flow monitor was first funded by DOE/NE in September 1995, and initial experimental demonstration by ORNL was described in the 37th INMM meeting held in July 1996. This methodology was chosen by DOE/NE for implementation in November 1996; it has been implemented in hardware/software and is ready for installation. This paper describes the methodology used to interpret the data measured by the fissile mass flow monitoring system and the models used to simulate the transport of fission fragments from the source location to the detectors.
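    The two independent measurements combine into a mass flow rate as concentration times velocity, with velocity recovered from the transit delay (v = L/τ). A one-line sketch with assumed example values (none taken from the record):

```python
# Assumed illustrative values, not figures from the monitoring system:
pipe_length_m = 2.0          # separation between measurement stations
time_delay_s = 0.5           # measured transport delay of the activity signal
linear_density_g_m = 3.0     # grams of fissile material per metre (from amplitude)

velocity_m_s = pipe_length_m / time_delay_s        # flow velocity: 4.0 m/s
mass_flow_g_s = linear_density_g_m * velocity_m_s  # mass flow: 12.0 g/s
print(mass_flow_g_s)
```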

  4. Determination of Critical Conditions for Puncturing Almonds Using Coupled Response Surface Methodology and Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Mahmood Mahmoodi-Eshkaftaki

    2013-01-01

    Full Text Available In this study, the effect of seed moisture content, probe diameter and loading velocity (puncture conditions) on some mechanical properties of almond kernel and peeled almond kernel is considered to model a relationship between the puncture conditions and rupture energy. Furthermore, distribution of the mechanical properties is determined. The main objective is to determine the critical values of mechanical properties significant for peeling machines. The response surface methodology was used to find the relationship between the input parameters and the output responses, and the fitness function was applied to measure the optimal values using the genetic algorithm. A two-parameter Weibull function was used to describe the distribution of mechanical properties. Based on the Weibull parameter values, i.e. shape parameter (β) and scale parameter (η) calculated for each property, the mechanical distribution variations were completely described and it was confirmed that the mechanical properties are rule governed, which makes the Weibull function suitable for estimating their distributions. The energy model estimated using response surface methodology shows that the mechanical properties relate exponentially to the moisture, and polynomially to the loading velocity and probe diameter, which enabled successful estimation of the rupture energy (R²=0.94). The genetic algorithm calculated the critical values of seed moisture, probe diameter, and loading velocity to be 18.11 % on dry mass basis, 0.79 mm, and 0.15 mm/min, respectively, and an optimum rupture energy of 1.97·10⁻³ J. These conditions were used for comparison with new samples, where the rupture energy was experimentally measured to be 2.68·10⁻³ and 2.21·10⁻³ J for kernel and peeled kernel, respectively, which was nearly in agreement with our model results.
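    The two-parameter Weibull fit used above for the property distributions can be sketched generically: maximum-likelihood estimation of the shape parameter via the classic fixed-point iteration, with the scale following in closed form. The data below are synthetic and the true parameter values are assumptions; this is not the authors' estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
beta_true, eta_true = 2.5, 1.8                    # assumed shape and scale
samples = eta_true * rng.weibull(beta_true, size=2000)

def fit_weibull(x, iters=200):
    """Two-parameter Weibull MLE: fixed-point iteration for the shape k
    (1/k equals the weighted/unweighted log-mean gap at the optimum),
    then the scale eta in closed form."""
    lx = np.log(x)
    k = 1.0
    for _ in range(iters):
        xk = x ** k
        k = 1.0 / ((xk * lx).sum() / xk.sum() - lx.mean())
    eta = np.mean(x ** k) ** (1.0 / k)
    return k, eta

beta_hat, eta_hat = fit_weibull(samples)
print(round(beta_hat, 2), round(eta_hat, 2))      # close to (2.5, 1.8)
```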

  5. Risk importance measures in the dynamic flowgraph methodology

    International Nuclear Information System (INIS)

    Tyrväinen, T.

    2013-01-01

    This paper presents new risk importance measures applicable to a dynamic reliability analysis approach with multi-state components. Dynamic reliability analysis methods are needed because traditional methods, such as fault tree analysis, can describe a system's dynamical behaviour only in a limited manner. Dynamic flowgraph methodology (DFM) is an approach used for analysing systems with time dependencies and feedback loops. The aim of DFM is to identify root causes of a top event, usually representing the system's failure. Components of DFM models are analysed at discrete time points and they can have multiple states. Traditional risk importance measures developed for static and binary logic are not applicable to DFM as such. Some importance measures have previously been developed for DFM but their ability to describe how components contribute to the top event is fairly limited. The paper formulates dynamic risk importance measures that measure the importances of states of components and take the time-aspect of DFM into account in a logical way that supports the interpretation of results. Dynamic risk importance measures are developed as generalisations of the Fussell-Vesely importance and the risk increase factor. -- Highlights: • New risk importance measures are developed for the dynamic flowgraph methodology. • Dynamic risk importance measures are formulated for states of components. • An approach to handle failure modes of a component in DFM is presented. • Dynamic risk importance measures take failure times into account. • A component's influence on the system's reliability can be analysed in detail.
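    The two static measures that the paper generalises are easy to state on a toy fault tree. The sketch below assumes an illustrative system TOP = (A AND B) OR C with invented failure probabilities: the Fussell-Vesely importance is the fractional risk decrease when a component is made perfectly reliable, and the risk increase factor is the risk ratio when it is assumed failed.

```python
# Hypothetical static fault tree: TOP = (A AND B) OR C.
def top_prob(pA, pB, pC):
    """Exact top-event probability for independent basic events."""
    return 1.0 - (1.0 - pA * pB) * (1.0 - pC)

pA, pB, pC = 0.1, 0.2, 0.05        # assumed basic-event probabilities
R = top_prob(pA, pB, pC)

# Fussell-Vesely importance of A: fractional risk decrease when A never fails.
fv_A = (R - top_prob(0.0, pB, pC)) / R
# Risk increase factor of A: risk ratio when A is failed for certain.
rif_A = top_prob(1.0, pB, pC) / R
print(round(fv_A, 3), round(rif_A, 2))
```

The dynamic measures in the paper replace these point probabilities with state- and time-dependent quantities, but the ratio structure is the same.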

  6. Characteristic Rain Events: A Methodology for Improving the Amenity Value of Stormwater Control Measures

    DEFF Research Database (Denmark)

    Smit Andersen, Jonas; Lerer, Sara Maria; Backhaus, Antje

    2017-01-01

    Local management of rainwater using stormwater control measures (SCMs) is gaining increased attention as a sustainable alternative and supplement to traditional sewer systems. Besides offering added utility values, many SCMs also offer a great potential for added amenity values. One way...... of achieving amenity value is to stage the rainwater and thus bring it to the attention of the public. We present here a methodology for creating a selection of rain events that can help bridge between engineering and landscape architecture when dealing with staging of rainwater. The methodology uses......; here we show its use for Danish conditions. We illustrate with a case study how CREs can be used in combination with a simple hydrological model to visualize where, how deep and for how long water is visible in a landscape designed to manage rainwater....

  7. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    Energy Technology Data Exchange (ETDEWEB)

    Tarifeño-Saldivia, Ariel, E-mail: atarifeno@cchen.cl, E-mail: atarisal@gmail.com; Pavez, Cristian; Soto, Leopoldo [Comisión Chilena de Energía Nuclear, Casilla 188-D, Santiago (Chile); Center for Research and Applications in Plasma Physics and Pulsed Power, P4, Santiago (Chile); Departamento de Ciencias Fisicas, Facultad de Ciencias Exactas, Universidad Andres Bello, Republica 220, Santiago (Chile); Mayer, Roberto E. [Instituto Balseiro and Centro Atómico Bariloche, Comisión Nacional de Energía Atómica and Universidad Nacional de Cuyo, San Carlos de Bariloche R8402AGP (Argentina)

    2014-01-15

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
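    The charge-to-counts step of such a methodology reduces to dividing the accumulated burst charge by the calibrated mean charge per single-neutron pulse; under a compound-Poisson model the relative statistical uncertainty is √((1+cv²)/N), where cv is the relative spread of the single-event charge. The numbers below are assumed for illustration and are not the paper's calibration values.

```python
import math

# Assumed calibration and measurement values (illustrative only):
mean_charge_pC = 1.6      # mean charge per single detected neutron (pulse mode)
cv_charge = 0.4           # relative spread of the single-event charge
burst_charge_pC = 8.0e3   # total charge accumulated during the neutron burst

# Estimated number of detected events from the accumulated charge.
n_detected = burst_charge_pC / mean_charge_pC
# Compound-Poisson statistics: counting noise plus charge-spread broadening.
rel_uncertainty = math.sqrt((1.0 + cv_charge ** 2) / n_detected)
print(int(n_detected), round(100 * rel_uncertainty, 2))   # events, % uncertainty
```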

  8. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    International Nuclear Information System (INIS)

    Tarifeño-Saldivia, Ariel; Pavez, Cristian; Soto, Leopoldo; Mayer, Roberto E.

    2014-01-01

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods

  9. Methodology applied in Cuba for siting, designing, and building a radioactive waste repository under safety conditions

    International Nuclear Information System (INIS)

    Orbera, L.; Peralta, J.L.; Franklin, R.; Gil, R.; Chales, G.; Rodriguez, A.

    1993-01-01

    The work presents the methodology used in Cuba for siting, designing, and building a radioactive waste repository safely. This methodology covers the technical and socio-economic factors, as well as those of design and construction, so as to obtain a safe site for this kind of repository under Cuba's particular conditions. Applying this methodology will result in a safe repository.

  10. Methodology for performing measurements to release material from radiological control

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1993-09-01

    This report describes the existing and proposed methodologies for performing measurements of contamination prior to releasing material for uncontrolled use at the Hanford Site. The technical basis for the proposed methodology, a modification to the existing contamination survey protocol, is also described. The modified methodology, which includes a large-area swipe followed by a statistical survey, can be used to survey material that is unlikely to be contaminated for release to controlled and uncontrolled areas. The material evaluation procedure that is used to determine the likelihood of contamination is also described

  11. Bayesian Semiparametric Density Deconvolution in the Presence of Conditionally Heteroscedastic Measurement Errors

    KAUST Repository

    Sarkar, Abhra

    2014-10-02

    We consider the problem of estimating the density of a random variable when precise measurements on the variable are not available, but replicated proxies contaminated with measurement error are available for sufficiently many subjects. Under the assumption of additive measurement errors this reduces to a problem of deconvolution of densities. Deconvolution methods often make restrictive and unrealistic assumptions about the density of interest and the distribution of measurement errors, e.g., normality and homoscedasticity and thus independence from the variable of interest. This article relaxes these assumptions and introduces novel Bayesian semiparametric methodology based on Dirichlet process mixture models for robust deconvolution of densities in the presence of conditionally heteroscedastic measurement errors. In particular, the models can adapt to asymmetry, heavy tails and multimodality. In simulation experiments, we show that our methods vastly outperform a recent Bayesian approach based on estimating the densities via mixtures of splines. We apply our methods to data from nutritional epidemiology. Even in the special case when the measurement errors are homoscedastic, our methodology is novel and dominates other methods that have been proposed previously. Additional simulation results, instructions on getting access to the data set and R programs implementing our methods are included as part of online supplemental materials.

  12. Bayesian Semiparametric Density Deconvolution in the Presence of Conditionally Heteroscedastic Measurement Errors

    KAUST Repository

    Sarkar, Abhra; Mallick, Bani K.; Staudenmayer, John; Pati, Debdeep; Carroll, Raymond J.

    2014-01-01

    We consider the problem of estimating the density of a random variable when precise measurements on the variable are not available, but replicated proxies contaminated with measurement error are available for sufficiently many subjects. Under the assumption of additive measurement errors this reduces to a problem of deconvolution of densities. Deconvolution methods often make restrictive and unrealistic assumptions about the density of interest and the distribution of measurement errors, e.g., normality and homoscedasticity and thus independence from the variable of interest. This article relaxes these assumptions and introduces novel Bayesian semiparametric methodology based on Dirichlet process mixture models for robust deconvolution of densities in the presence of conditionally heteroscedastic measurement errors. In particular, the models can adapt to asymmetry, heavy tails and multimodality. In simulation experiments, we show that our methods vastly outperform a recent Bayesian approach based on estimating the densities via mixtures of splines. We apply our methods to data from nutritional epidemiology. Even in the special case when the measurement errors are homoscedastic, our methodology is novel and dominates other methods that have been proposed previously. Additional simulation results, instructions on getting access to the data set and R programs implementing our methods are included as part of online supplemental materials.
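    A much simpler classical step conveys why the replicated proxies matter: under the additive model W_ij = X_i + U_ij, within-subject differences identify the error variance, which can then be subtracted from the observed scatter to recover the variance of X. This method-of-moments sketch on synthetic data is illustrative only and is not the article's Bayesian semiparametric model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic replicated proxies: W_ij = X_i + U_ij, i = 1..n subjects,
# j = 1..2 replicates (all parameters are assumptions).
n = 4000
x = rng.gamma(shape=4.0, scale=1.0, size=n)   # skewed true variable, Var = 4
u = 0.8 * rng.normal(size=(n, 2))             # measurement error, Var = 0.64
w = x[:, None] + u

# Within-subject scatter identifies Var(U): Var(W1 - W2) = 2 Var(U).
var_u = 0.5 * np.mean((w[:, 0] - w[:, 1]) ** 2)
# Subtracting it from the observed scatter recovers Var(X).
var_x = w.var() - var_u
print(round(var_u, 2), round(var_x, 2))       # near (0.64, 4.0)
```

The article goes far beyond this variance decomposition, recovering the full density under heteroscedastic errors, but the replicates play the same identifying role.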

  13. A statistical methodology for the estimation of extreme wave conditions for offshore renewable applications

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Kalogeri, Christina; Galanis, George

    2015-01-01

    and post-process outputs from a high resolution numerical wave modeling system for extreme wave estimation based on the significant wave height. This approach is demonstrated through the data analysis at a relatively deep water site, FINO 1, as well as a relatively shallow water area, coastal site Horns...... as a characteristic index of extreme wave conditions. The results from the proposed methodology seem to be in a good agreement with the measurements at both the relatively deep, open water and the shallow, coastal water sites, providing a potentially useful tool for offshore renewable energy applications. © 2015...... Rev, which is located in the North Sea, west of Denmark. The post-processing targets at correcting the modeled time series of the significant wave height, in order to match the statistics of the corresponding measurements, including not only the conventional parameters such as the mean and standard...

  14. Advanced quantitative measurement methodology in physics education research

    Science.gov (United States)

    Wang, Jing

    parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. The two theories are both applied to Force Concept Inventory data obtained from students enrolled in The Ohio State University. Effort was made to examine the similarity and difference between the two theories, and possible explanations for the difference. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on the measure of association for binary data. In quantitative assessment, binary data is often encountered because of its simplicity. The current popular measures of association fail under some extremely unbalanced conditions. However, the occurrence of these conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages. Special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed. Typical misunderstandings and misuses of EFA are explored.
The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment on scientific reasoning skills. The
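    The CTT side of the comparison rests on two item statistics: difficulty (proportion correct) and point-biserial discrimination. The sketch below computes both on synthetic Rasch-generated binary responses; the sample size, abilities and item difficulties are assumptions, not Force Concept Inventory data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic binary responses: 500 examinees, 3 items, Rasch-style generation.
ability = rng.normal(size=500)
difficulty = np.array([-1.0, 0.0, 1.0])          # easy, medium, hard
p_correct = 1.0 / (1.0 + np.exp(-(ability[:, None] - difficulty)))
responses = (rng.random((500, 3)) < p_correct).astype(int)

# CTT item difficulty: proportion of examinees answering correctly.
item_difficulty = responses.mean(axis=0)
# CTT discrimination: point-biserial correlation with the total score
# (item left in the total, for simplicity).
total = responses.sum(axis=1)
discrimination = [float(np.corrcoef(responses[:, j], total)[0, 1])
                  for j in range(3)]
print(np.round(item_difficulty, 2), np.round(discrimination, 2))
```

IRT would instead fit per-item parameters of the response curve, which is why it is more sensitive to item features than these sample-dependent summaries.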

  15. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    Science.gov (United States)

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  16. Gust factor based on research aircraft measurements: A new methodology applied to the Arctic marine boundary layer

    DEFF Research Database (Denmark)

    Suomi, Irene; Lüpkes, Christof; Hartmann, Jörg

    2016-01-01

    There is as yet no standard methodology for measuring wind gusts from a moving platform. To address this, we have developed a method to derive gusts from research aircraft data. First we evaluated four different approaches, including Taylor's hypothesis of frozen turbulence, to derive the gust...... in unstable conditions (R2=0.52). The mean errors for all methods were low, from -0.02 to 0.05, indicating that wind gust factors can indeed be measured from research aircraft. Moreover, we showed that aircraft can provide gust measurements within the whole boundary layer, if horizontal legs are flown...
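    The conventional gust factor underlying such comparisons is the peak short-window moving average divided by the mean wind over a reference period; a common ground-based choice is a 3 s gust over a 10 min mean. A sketch on synthetic 10 Hz data (all values assumed, unrelated to the aircraft measurements):

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 10                                              # samples per second
# Assumed 10 min of synthetic wind speed: 8 m/s mean plus turbulence.
wind = 8.0 + rng.normal(scale=1.0, size=600 * fs)

window = 3 * fs                                      # 3 s gust window
kernel = np.ones(window) / window
gust = np.convolve(wind, kernel, mode="valid").max() # peak 3 s average
gust_factor = gust / wind.mean()
print(round(gust_factor, 2))
```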

  17. Characteristic Rain Events: A Methodology for Improving the Amenity Value of Stormwater Control Measures

    Directory of Open Access Journals (Sweden)

    Jonas Smit Andersen

    2017-10-01

    Full Text Available Local management of rainwater using stormwater control measures (SCMs) is gaining increased attention as a sustainable alternative and supplement to traditional sewer systems. Besides offering added utility values, many SCMs also offer a great potential for added amenity values. One way of achieving amenity value is to stage the rainwater and thus bring it to the attention of the public. We present here a methodology for creating a selection of rain events that can help bridge between engineering and landscape architecture when dealing with staging of rainwater. The methodology uses quantitative and statistical methods to select Characteristic Rain Events (CREs) for a range of frequent return periods: weekly, bi-weekly, monthly, bi-monthly, and a single rarer event occurring only every 1–10 years. The methodology for selecting CREs is flexible and can be adjusted to any climatic settings; here we show its use for Danish conditions. We illustrate with a case study how CREs can be used in combination with a simple hydrological model to visualize where, how deep and for how long water is visible in a landscape designed to manage rainwater.
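    The selection idea can be sketched as ranking observed events and picking, for each target return period, the event whose empirical return period is closest. The event depths, record length and plotting position below are invented for illustration and are not the paper's Danish data.

```python
# Hypothetical largest independent rain events (mm) in a 10-year record.
depths_mm = [2, 4, 5, 7, 9, 12, 15, 22, 30, 48]
years_of_record = 10

# Empirical return period of the k-th largest event (simple plotting
# position for a partial-duration series: T = record length / rank).
ranked = sorted(depths_mm, reverse=True)
events = [(years_of_record / (k + 1), d) for k, d in enumerate(ranked)]

def characteristic_event(target_rp_years):
    """Event depth whose empirical return period is nearest the target."""
    return min(events, key=lambda e: abs(e[0] - target_rp_years))[1]

print(characteristic_event(10), characteristic_event(2))   # rare vs frequent
```

The weekly-to-monthly CREs in the paper require a finer event catalogue than annual extremes, but the nearest-return-period selection step is the same.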

  18. Methodology for quantitative assessment of technical condition in industrial systems

    Energy Technology Data Exchange (ETDEWEB)

    Steinbach, C [Marintek AS (Norway); Soerli, A [Statoil (Norway)

    1999-12-31

    As part of the Eureka project Ageing Management a methodology has been developed to assess the technical condition of industrial systems. The first part of the presentation argues for the use of technical condition parameters in the context of maintenance strategies. Thereafter the term 'technical condition' is defined more thoroughly as it is used within the project. It is claimed that the technical condition of a system - such as a feed water system of a nuclear power plant, or a water injection system on an oil platform - may be determined by aggregating the condition of its smaller components using a hierarchic approach. The hierarchy has to be defined in co-operation with experienced personnel and reflects the impact of degradation of elements on a lower level to nodes higher in the hierarchy. The impact is divided into five categories with respect to safety, environment, availability, costs and man-hours. To determine the technical condition of the bottom elements of the hierarchy, available data is used from both an on-line condition monitoring system and maintenance history. The second part of the presentation introduces the prototype software tool TeCoMan which utilises the theory and applies it to installations of the participating companies. First results and gained experiences with the method and tool are discussed. (orig.)

  19. Methodology for quantitative assessment of technical condition in industrial systems

    Energy Technology Data Exchange (ETDEWEB)

    Steinbach, C. [Marintek AS (Norway); Soerli, A. [Statoil (Norway)

    1998-12-31

    As part of the Eureka project Ageing Management a methodology has been developed to assess the technical condition of industrial systems. The first part of the presentation argues for the use of technical condition parameters in the context of maintenance strategies. Thereafter the term 'technical condition' is defined more thoroughly as it is used within the project. It is claimed that the technical condition of a system - such as a feed water system of a nuclear power plant, or a water injection system on an oil platform - may be determined by aggregating the condition of its smaller components using a hierarchic approach. The hierarchy has to be defined in co-operation with experienced personnel and reflects the impact of degradation of elements on a lower level to nodes higher in the hierarchy. The impact is divided into five categories with respect to safety, environment, availability, costs and man-hours. To determine the technical condition of the bottom elements of the hierarchy, available data is used from both an on-line condition monitoring system and maintenance history. The second part of the presentation introduces the prototype software tool TeCoMan which utilises the theory and applies it to installations of the participating companies. First results and gained experiences with the method and tool are discussed. (orig.)

  20. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of this causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on service profit chain theory were developed and tested using an econometric analysis, the Granger causality test, on 45 data points. However, well-established causality models proved elusive, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for situations where historical data are insufficient. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts over 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Both methods indicated the existence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures. With that, a computer model and simulation using System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity, and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality, as well as its robustness and validity as a platform for causality analysis. This study applied a theoretical service management model within the BSC domain to a practical situation using SD methodology, an area in which very limited work has been done.
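    The Granger test used in the study asks whether lagged values of one series improve the prediction of another beyond its own lags. A minimal lag-1 version on synthetic data (coefficients, noise level and sample size are assumptions), comparing restricted and unrestricted regressions with an F statistic:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300

# Synthetic pair of series where x drives y with a one-step lag.
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.normal()

Y = y[1:]
restricted = np.column_stack([np.ones(n - 1), y[:-1]])   # y's own lag only
full = np.column_stack([restricted, x[:-1]])             # plus x's lag

def rss(design, target):
    """Residual sum of squares of an OLS fit."""
    coef, *_ = np.linalg.lstsq(design, target, rcond=None)
    resid = target - design @ coef
    return float(resid @ resid)

rss_r, rss_f = rss(restricted, Y), rss(full, Y)
# F statistic with 1 numerator degree of freedom.
f_stat = (rss_r - rss_f) / (rss_f / (len(Y) - full.shape[1]))
print(round(f_stat, 1))   # large F => x Granger-causes y at lag 1
```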

  1. The Methodology of Doppler-Derived Central Blood Flow Measurements in Newborn Infants

    Directory of Open Access Journals (Sweden)

    Koert A. de Waal

    2012-01-01

    Full Text Available Central blood flow (CBF) measurements are measurements in and around the heart. They incorporate cardiac output, but also measurements of cardiac input and assessment of intra- and extracardiac shunts. CBF can be measured in the central circulation as right or left ventricular output (RVO or LVO) and/or as cardiac input measured as superior vena cava (SVC) flow. Assessment of shunts incorporates evaluation of the ductus arteriosus and the foramen ovale. This paper describes the methodology of CBF measurements in newborn infants. It provides a brief overview of the evolution of Doppler ultrasound blood flow measurements, basic principles of Doppler ultrasound, and an overview of the methodologies used in the literature. A general guide for interpretation and normal values with suggested cutoffs of CBFs are provided for clinical use.
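    The core Doppler calculation behind RVO/LVO is stroke distance (the velocity-time integral) times outflow-tract cross-sectional area times heart rate. The example values below are assumed round numbers for a term newborn, not reference data from the paper.

```python
import math

# Assumed illustrative values for a term newborn:
vti_cm = 10.0            # velocity-time integral per beat (stroke distance)
diameter_cm = 0.7        # outflow-tract diameter measured by ultrasound
hr_bpm = 150             # heart rate
weight_kg = 3.5

# Output = VTI x cross-sectional area x heart rate.
area_cm2 = math.pi * (diameter_cm / 2.0) ** 2
lvo_ml_per_min = vti_cm * area_cm2 * hr_bpm
print(round(lvo_ml_per_min), round(lvo_ml_per_min / weight_kg))  # ml/min, ml/kg/min
```

Weight-normalised output (ml/kg/min) is what clinical cutoffs are usually quoted in, which is why the per-kilogram figure is reported alongside the absolute flow.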

  2. Don't fear 'fear conditioning': Methodological considerations for the design and analysis of studies on human fear acquisition, extinction, and return of fear.

    Science.gov (United States)

    Lonsdorf, Tina B; Menz, Mareike M; Andreatta, Marta; Fullana, Miguel A; Golkar, Armita; Haaker, Jan; Heitland, Ivo; Hermann, Andrea; Kuhn, Manuel; Kruse, Onno; Meir Drexler, Shira; Meulders, Ann; Nees, Frauke; Pittig, Andre; Richter, Jan; Römer, Sonja; Shiban, Youssef; Schmitz, Anja; Straube, Benjamin; Vervliet, Bram; Wendt, Julia; Baas, Johanna M P; Merz, Christian J

    2017-06-01

    The so-called 'replicability crisis' has sparked methodological discussions in many areas of science in general, and in psychology in particular. This has led to recent endeavours to promote the transparency, rigour, and ultimately, replicability of research. Originating from this zeitgeist, the challenge to discuss critical issues on terminology, design, methods, and analysis considerations in fear conditioning research is taken up by this work, which involved representatives from fourteen of the major human fear conditioning laboratories in Europe. This compendium is intended to provide a basis for the development of a common procedural and terminology framework for the field of human fear conditioning. Whenever possible, we give general recommendations. When this is not feasible, we provide evidence-based guidance for methodological decisions on study design, outcome measures, and analyses. Importantly, this work is also intended to raise awareness and initiate discussions on crucial questions with respect to data collection, processing, statistical analyses, the impact of subtle procedural changes, and data reporting specifically tailored to the research on fear conditioning. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Methodology for measurement in schools and kindergartens: experiences

    International Nuclear Information System (INIS)

    Fotjikova, I.; Navratilova Rovenska, K.

    2015-01-01

    In more than 1500 schools and preschool facilities, long-term radon measurement was carried out in the last 3 y. The negative effect of thermal retrofitting on the resulting long-term radon averages is evident. In some of the facilities, low ventilation rates and correspondingly high radon levels were found, so it was recommended to change ventilation habits. However, some of the facilities had high radon levels due to its ingress from soil gas. Technical measures should be undertaken to reduce radon exposure in this case. The paper presents the long-term experiences with the two-stage measurement methodology for investigation of radon levels in school and preschool facilities and its possible improvements. (authors)

  4. Radionuclide measurements, via different methodologies, as tool for geophysical studies on Mt. Etna

    Energy Technology Data Exchange (ETDEWEB)

    Morelli, D., E-mail: daniela.morelli@ct.infn.it [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Istituto Nazionale di Fisica Nucleare- Sezione di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Imme, G. [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Istituto Nazionale di Fisica Nucleare- Sezione di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Altamore, I.; Cammisa, S. [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Giammanco, S. [Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Catania, Piazza Roma, 2, I-95123 Catania (Italy); La Delfa, S. [Dipartimento di Scienze Geologiche, Universita di Catania, Corso Italia,57 I-95127 Catania (Italy); Mangano, G. [Dipartimento di Fisica e Astronomia, Universita di Catania, via S. Sofia, 64 I-95123 Catania (Italy); Neri, M. [Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Catania, Piazza Roma, 2, I-95123 Catania (Italy); Patane, G. [Dipartimento di Scienze Geologiche, Universita di Catania, Corso Italia,57 I-95127 Catania (Italy)

    2011-10-01

    Natural radioactivity measurements represent an interesting tool to study geodynamical events or soil geophysical characteristics. Along these lines, in recent years we carried out several radionuclide monitoring campaigns in both the volcanic and tectonic areas of eastern Sicily. In particular we report in-soil radon investigations in a tectonic area, including both laboratory and in-situ measurements, applying three different methodologies based on both active and passive detection systems. The active detection devices consisted of solid-state silicon detectors integrated into portable systems for short-time measurements and for long-time monitoring. The passive technique consisted of solid-state nuclear track detectors (SSNTD), CR-39 type, and allowed integrated measurements. The performances of the three methodologies were compared according to the different kinds of monitoring. In general the results obtained with the three methodologies are in agreement with each other and reflect the tectonic settings of the investigated area.

  5. Radionuclide measurements, via different methodologies, as tool for geophysical studies on Mt. Etna

    International Nuclear Information System (INIS)

    Morelli, D.; Imme, G.; Altamore, I.; Cammisa, S.; Giammanco, S.; La Delfa, S.; Mangano, G.; Neri, M.; Patane, G.

    2011-01-01

    Natural radioactivity measurements are an interesting tool for studying geodynamical events and soil geophysical characteristics. Along these lines, we have carried out in recent years several radionuclide monitoring campaigns in both volcanic and tectonic areas of eastern Sicily. In particular, we report in-soil radon investigations in a tectonic area, including both laboratory and in situ measurements, applying three different methodologies based on active and passive detection systems. The active detection devices were solid-state silicon detectors fitted in portable systems for short-time measurements and for long-time monitoring. The passive technique used solid-state nuclear track detectors (SSNTD) of the CR-39 type and allowed integrated measurements. The performances of the three methodologies were compared according to the different kinds of monitoring. In general, the results obtained with the three methodologies agree with each other and reflect the tectonic setting of the investigated area.

  6. Response Surface Methodology: An Extensive Potential to Optimize in vivo Photodynamic Therapy Conditions

    International Nuclear Information System (INIS)

    Tirand, Loraine; Bastogne, Thierry; Bechet, Denise M.Sc.; Linder, Michel; Thomas, Noemie; Frochot, Celine; Guillemin, Francois; Barberi-Heyob, Muriel

    2009-01-01

    Purpose: Photodynamic therapy (PDT) is based on the interaction of a photosensitizing (PS) agent, light, and oxygen. Few new PS agents reach the in vivo stage, partly because of the difficulty of finding the right treatment conditions. Response surface methodology, an empirical modeling approach based on data from a set of designed experiments, was proposed as a rational way to select in vivo PDT conditions for a new peptide-conjugated PS agent targeting neuropilin-1. Methods and Materials: A Doehlert experimental design was selected to model the effects and interactions of PS dose, fluence, and fluence rate on the growth of U87 human malignant glioma cell xenografts in nude mice, using a fixed drug-light interval. All experimental results were computed with Nemrod-W software and Matlab. Results: Intrinsic diameter growth rate, a tumor growth parameter independent of the initial tumor volume, was selected as the response variable and was compared to tumor growth delay and relative tumor volumes. With only 13 experimental conditions tested, an optimal PDT condition was selected (PS dose, 2.80 mg/kg; fluence, 120 J/cm²; fluence rate, 85 mW/cm²). Treatment of glioma-bearing mice with the peptide-conjugated PS agent, followed by the optimized PDT condition, showed a statistically significant improvement in delaying tumor growth compared with animals that received PDT with the nonconjugated PS agent. Conclusions: Response surface methodology appears to be a useful experimental approach for rapid testing of different treatment conditions and determination of optimal values of PDT factors for any PS agent.
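
    The core of the response-surface approach above is to fit a low-order polynomial model to a small set of designed experiments and read the optimum off the fitted surface. The sketch below shows the idea for a single factor (a quadratic fit and its stationary point); the study itself used a three-factor Doehlert design, and the dose levels and responses here are invented for illustration.

```python
# One-factor illustration of response surface methodology: fit a quadratic
# model y = a + b*x + c*x**2 to designed experiments and locate the
# stationary (optimal) point. Data are hypothetical, not from the study.

def fit_quadratic(points):
    """Exact quadratic through three (x, y) points (closed-form solution)."""
    (x1, y1), (x2, y2), (x3, y3) = points
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    c = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / denom
    a = y1 - b * x1 - c * x1**2
    return a, b, c

# Hypothetical PS doses (mg/kg) vs. a tumor-growth-rate response to minimise
experiments = [(1.0, 0.30), (2.5, 0.12), (4.0, 0.26)]
a, b, c = fit_quadratic(experiments)
optimal_dose = -b / (2 * c)   # stationary point of the fitted parabola
```

    With more factors and replicated runs the model is fitted by least squares rather than exactly, but the optimum is located the same way, from the fitted surface rather than from the raw runs.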

  7. Photovoltaic module energy rating methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L. [National Renewable Energy Lab., Golden, CO (United States); Whitaker, C.; Newmiller, J. [Endecon Engineering, San Ramon, CA (United States)

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.
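
    The energy-rating idea can be sketched as integrating a module power model over a weather profile. The linear irradiance/temperature model and all coefficients below are generic illustrations and assumptions, not the consensus procedure developed by the committee.

```python
# Hedged sketch: estimate module energy by summing modeled power over an
# hourly weather profile. Model form and numbers are illustrative only.

P_STC = 100.0   # module rating at standard test conditions, W (assumed)
GAMMA = 0.004   # power temperature coefficient, 1/degC (typical crystalline Si)

def module_power(irradiance_w_m2, cell_temp_c):
    """Power (W) from a simple linear irradiance/temperature model."""
    return P_STC * (irradiance_w_m2 / 1000.0) * (1.0 - GAMMA * (cell_temp_c - 25.0))

# Hypothetical hourly profile: (irradiance W/m^2, cell temperature degC)
profile = [(0, 15), (200, 20), (600, 35), (900, 50),
           (1000, 55), (700, 45), (300, 30), (0, 20)]

energy_wh = sum(module_power(g, t) * 1.0 for g, t in profile)  # 1 h steps
```

    Running the same model over weather profiles from different U.S. climates is what surfaces the performance differences between module types that the methodology emphasizes.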

  8. Measurement of the porosity of amorphous materials by gamma ray transmission methodology

    International Nuclear Information System (INIS)

    Pottker, Walmir Eno; Appoloni, Carlos Roberto

    2000-01-01

    This work presents measurements of the total porosity of TRe soil, Berea sandstone rock and porous ceramic samples. For the determination of the total porosity, the Archimedes (conventional) method and the gamma-ray transmission methodology were employed. The porosity measurement using the gamma methodology has a significant advantage over the conventional method: the determination is fast and non-destructive, and it also characterizes the heterogeneity of the porosity at small scales, whereas the conventional methodology gives good results only for homogeneous samples. The experimental set-up for the gamma-ray transmission technique consisted of a 241Am source (59.53 keV), a NaI(Tl) scintillation detector, collimators, an XYZ micrometric table and standard gamma spectrometry electronics connected to a multichannel analyser. (author)
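
    The transmission measurement reduces to a Beer-Lambert calculation: attenuation through the sample gives a bulk density, and porosity follows by comparison with the solid (particle) density. All numbers below, including the mass attenuation coefficient at 59.53 keV, are hypothetical illustration values, not data from the paper.

```python
import math

# Porosity from gamma-ray transmission (Beer-Lambert), illustrative numbers:
# I = I0 * exp(-mu_m * rho_bulk * x)  =>  rho_bulk = ln(I0/I) / (mu_m * x)

I0 = 12000.0   # counts without sample
I = 3000.0     # counts transmitted through sample
x = 3.0        # sample thickness, cm
mu_m = 0.20    # mass attenuation coefficient of matrix, cm^2/g (assumed)
rho_s = 2.65   # solid (particle) density, g/cm^3 (quartz-like, assumed)

rho_bulk = math.log(I0 / I) / (mu_m * x)   # bulk density from attenuation
porosity = 1.0 - rho_bulk / rho_s          # total porosity
```

    Scanning the sample on the XYZ table and repeating this calculation point by point is what yields the small-scale porosity map that the conventional Archimedes method cannot provide.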

  9. Optimization of deposition conditions of CdS thin films using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Yücel, Ersin, E-mail: dr.ersinyucel@gmail.com [Department of Physics, Faculty of Arts and Sciences, Mustafa Kemal University, 31034 Hatay (Turkey); Güler, Nuray [Department of Physics, Faculty of Arts and Sciences, Mustafa Kemal University, 31034 Hatay (Turkey); Yücel, Yasin [Department of Chemistry, Faculty of Arts and Sciences, Mustafa Kemal University, 31034 Hatay (Turkey)

    2014-03-15

    Highlights: • Statistical methods were used to optimize the CdS deposition parameters. • The morphology of the films was smooth, homogeneous and continuous. • Optimal conditions were found to be pH 11, stirring speed 361 rpm and deposition time 55 min. • The CdS thin film band gap was 2.72 eV under the optimum conditions. -- Abstract: Cadmium sulfide (CdS) thin films were prepared on glass substrates by the chemical bath deposition (CBD) technique under different pH, stirring speed and deposition time. Response surface methodology (RSM) with a central composite design (CCD) was used to optimize the deposition parameters of the CdS thin films and to understand the significance and interaction of the factors affecting film quality. The variables were pH, stirring speed and deposition time, and the band gap was chosen as the response. A 5-level, 3-factor central composite design was employed to evaluate the effects of deposition conditions such as pH (10.2–11.8), stirring speed (132–468 rpm) and deposition time (33–67 min) on the band gap of the films. The samples were characterized using X-ray diffraction (XRD), scanning electron microscopy (SEM) and ultraviolet–visible (UV–vis) spectroscopy. The optimal deposition conditions were found to be pH 11, a stirring speed of 361 rpm and a deposition time of 55 min. Under these conditions, the theoretical (predicted) band gap of CdS calculated from the model with optimal coded values (2.66 eV) is in good agreement with the value (2.72 eV) obtained in a verification experiment.
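
    A 5-level, 3-factor rotatable CCD consists of 8 factorial points (±1), 6 axial points (±α with α = 1.682), and center points, in coded units that are decoded to the physical ranges. The sketch below generates such a design; the centers and steps are inferred from the ranges quoted in the abstract for illustration only.

```python
from itertools import product

# Rotatable 5-level, 3-factor central composite design in coded units,
# decoded to deposition ranges inferred from the abstract (illustrative).

ALPHA = 1.682  # rotatability for 3 factors: (2**3) ** 0.25

centers = {"pH": 11.0, "stir_rpm": 300.0, "time_min": 50.0}
steps = {"pH": 0.8 / ALPHA, "stir_rpm": 168.0 / ALPHA, "time_min": 17.0 / ALPHA}

coded = [list(p) for p in product([-1, 1], repeat=3)]   # 8 factorial points
for i in range(3):                                      # 6 axial points
    for a in (-ALPHA, ALPHA):
        pt = [0.0, 0.0, 0.0]
        pt[i] = a
        coded.append(pt)
coded.append([0.0, 0.0, 0.0])                           # center point

def decode(point):
    names = ["pH", "stir_rpm", "time_min"]
    return {n: centers[n] + c * steps[n] for n, c in zip(names, point)}

design = [decode(p) for p in coded]   # 15 runs covering 5 levels per factor
```

    The axial points recover the extreme levels quoted in the abstract (e.g. pH 10.2 and 11.8), while the factorial and center points allow the quadratic band-gap model to be fitted.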

  10. High-frequency measurements of aeolian saltation flux: Field-based methodology and applications

    Science.gov (United States)

    Martin, Raleigh L.; Kok, Jasper F.; Hugenholtz, Chris H.; Barchyn, Thomas E.; Chamecki, Marcelo; Ellis, Jean T.

    2018-02-01

    Aeolian transport of sand and dust is driven by turbulent winds that fluctuate over a broad range of temporal and spatial scales. However, commonly used aeolian transport models do not explicitly account for such fluctuations, likely contributing to substantial discrepancies between models and measurements. Underlying this problem is the absence of accurate sand flux measurements at the short time scales at which wind speed fluctuates. Here, we draw on extensive field measurements of aeolian saltation to develop a methodology for generating high-frequency (up to 25 Hz) time series of total (vertically-integrated) saltation flux, namely by calibrating high-frequency (HF) particle counts to low-frequency (LF) flux measurements. The methodology follows four steps: (1) fit exponential curves to vertical profiles of saltation flux from LF saltation traps, (2) determine empirical calibration factors through comparison of LF exponential fits to HF number counts over concurrent time intervals, (3) apply these calibration factors to subsamples of the saltation count time series to obtain HF height-specific saltation fluxes, and (4) aggregate the calibrated HF height-specific saltation fluxes into estimates of total saltation fluxes. When coupled to high-frequency measurements of wind velocity, this methodology offers new opportunities for understanding how aeolian saltation dynamics respond to variability in driving winds over time scales from tens of milliseconds to days.
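
    The four calibration steps can be condensed into a small numerical sketch: fit the exponential flux profile q(z) = q0·exp(−z/zq) to the LF traps, derive a flux-per-count calibration factor from a concurrent interval, and apply it to the HF count series. This simplified version uses a single sensor and collapses steps 3 and 4; all trap heights, fluxes and counts are hypothetical.

```python
import math

# Simplified sketch of the four-step HF saltation flux calibration.

# Step 1: exponential profile fit from two LF traps (height m, flux g/m^2/s)
traps = [(0.05, 40.0), (0.25, 8.0)]
(z1, q1), (z2, q2) = traps
zq = (z2 - z1) / math.log(q1 / q2)      # e-folding decay height
q0 = q1 * math.exp(z1 / zq)             # surface-extrapolated flux
Q_lf = q0 * zq                          # vertically integrated LF flux, g/m/s

# Step 2: calibration factor from HF counts over the same interval
hf_counts_concurrent = 500.0            # hypothetical concurrent counts
cal = Q_lf / hf_counts_concurrent       # flux per HF count

# Steps 3-4 (collapsed): calibrate the HF count series into total flux
hf_series = [12, 0, 31, 22, 5]          # counts per 1/25 s window
total_flux_series = [cal * n for n in hf_series]
```

    In the full methodology the calibration factors are height-specific and the calibrated height fluxes are then aggregated; the essential move, converting cheap high-rate counts into physically meaningful flux via the LF profile fit, is the same.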

  11. Theoretical and methodological reasoning of correction technologies of the physical conditions of students of music speciality

    Directory of Open Access Journals (Sweden)

    Petro Marynchuk

    2017-08-01

    Full Text Available The article emphasizes the underdeveloped methodological basis for the physical education of students of Music Arts. Profession-dependent indicators of physical condition were taken into account. The article outlines the main theoretical and methodological provisions underlying the development of a technology for correcting the physical condition of students of Music Arts: fostering students' motivation to improve their physical condition, regular physical exercise, development of professionally important physical qualities, and differentiation of physical activity according to the physical state and working conditions of students of Music Arts. The structure of the technology is described; it comprises the purpose, tasks, principles, stages of implementation, a programme of physical exercise, and performance criteria. The main stages of implementation (preparatory, main and final) are analyzed. Innovative means of motor activity are described for use in the practice of higher educational institutions, taking into account the particular features of the student body and their schedule of educational activity.

  12. Methodologies for measuring travelers' risk perception of infectious diseases: A systematic review.

    Science.gov (United States)

    Sridhar, Shruti; Régner, Isabelle; Brouqui, Philippe; Gautret, Philippe

    2016-01-01

    Numerous studies in the past have stressed the importance of travelers' psychology and perception in the implementation of preventive measures. The aim of this systematic review was to identify the methodologies used in studies reporting on travelers' risk perception of infectious diseases. A systematic search for relevant literature was conducted according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. There were 39 studies identified. In 35 of 39 studies, the methodology used was that of a knowledge, attitude and practice (KAP) survey based on questionnaires. One study used a combination of questionnaires and a visual psychometric measuring instrument called the "pictorial representation of illness and self-measurement" (PRISM). One study used a self-representation model (SRM) method. Two studies measured psychosocial factors. Valuable information was obtained from KAP surveys showing an overall lack of knowledge among travelers about the most frequent travel-associated infections and associated preventive measures. This methodological approach, however, is mainly descriptive, addressing knowledge, attitudes, and practices separately and lacking an examination of the interrelationships between these three components. Another limitation of the KAP method is underestimating psychosocial variables that have proved influential in health-related behaviors, including perceived benefits and costs of preventive measures, perceived social pressure, perceived personal control, unrealistic optimism and risk propensity. Future risk perception studies in travel medicine should consider psychosocial variables with inferential and multivariate statistical analyses. The use of implicit measurements of attitudes could also provide new insights in the field of travelers' risk perception of travel-associated infectious diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Measure a carbon impact methodology in line with a 2 degree scenario

    International Nuclear Information System (INIS)

    Coeslier, Manuel; Finidori, Esther; Smia, Ladislas

    2015-11-01

    Today, high expectations surround the measurement of carbon impact. Voluntary initiatives and, little by little, legislation push institutional investors to consider the impact that financial portfolios have on the climate and energy transition. However, current carbon footprint measurement methods are not adequate to determine an investment portfolio's contribution to these issues. Current approaches, which do not take a life-cycle view of carbon footprinting, have the particular flaw of not accounting for emissions related to companies' products and services. The impact of these products and services on the climate is, however, crucial in many sectors - whether positively in the case of renewable energy and energy efficiency solutions, or negatively in the case of fossil fuels. Following this observation, Mirova and Carbone 4 decided to create a partnership dedicated to developing a new methodology capable of providing a carbon measurement aligned with the issues of energy transition: Carbon Impact Analytics (CIA). The CIA methodology focuses primarily on three indicators: - a measure of emissions 'induced' by a company's activity from a life-cycle approach, taking into account direct emissions as well as emissions from product suppliers; - a measure of the emissions 'avoided' due to efficiency efforts or deployment of 'low-carbon' solutions; - an overall evaluation that takes into account, in addition to the carbon measurement, further information on the company's evolution and the type of capital or R and D expenditures. For these evaluations, the methodology employs a bottom-up approach in which each company is examined individually according to an evaluation framework adapted to each sector. Particular scrutiny is devoted to companies with a significant climate impact: energy producers, carbon-intensive sectors (industry, construction, transport), and providers of low-carbon equipment and solutions.
Evaluations are then aggregated at
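
    One plausible way to aggregate company-level induced and avoided emissions to portfolio level is to weight each company's figures by the share of the company the portfolio holds. The companies, figures, and the ownership-share weighting below are hypothetical illustrations, not the CIA methodology's actual aggregation rules.

```python
# Illustrative portfolio aggregation of company-level 'induced' and
# 'avoided' emissions, weighted by ownership share. Numbers are invented.

holdings = [
    # (company, value held / enterprise value, induced tCO2e, avoided tCO2e)
    ("UtilityCo", 0.02, 5_000_000, 1_200_000),
    ("WindEquipCo", 0.05, 300_000, 2_500_000),
    ("CementCo", 0.01, 8_000_000, 0),
]

induced = sum(share * ind for _, share, ind, _ in holdings)
avoided = sum(share * av for _, share, _, av in holdings)
net = induced - avoided   # portfolio-attributed net emissions, tCO2e
```

    Reporting induced and avoided emissions separately, rather than only the net figure, preserves the distinction the methodology draws between carbon-intensive activities and providers of low-carbon solutions.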

  14. A Methodology for Measuring Microplastic Transport in Large or Medium Rivers

    Directory of Open Access Journals (Sweden)

    Marcel Liedermann

    2018-04-01

    Full Text Available Plastic waste as a persistent contaminant of our environment is a matter of increasing concern due to the largely unknown long-term effects on biota. Although freshwater systems are known to be the transport paths of plastic debris to the ocean, most research has been focused on marine environments. In recent years, freshwater studies have advanced rapidly, but they rarely address the spatial distribution of plastic debris in the water column. A methodology for measuring microplastic transport at various depths that is applicable to medium and large rivers is needed. We present a new methodology offering the possibility of measuring microplastic transport at different depths of verticals that are distributed within a profile. The net-based device is robust and can be applied at high flow velocities and discharges. Nets of three mesh sizes (41 µm, 250 µm, and 500 µm) are exposed at three different depths of the water column. The methodology was tested in the Austrian Danube River, showing a high heterogeneity of microplastic concentrations within one cross section. Due to turbulent mixing, the different densities of the polymers, aggregation, and the growth of biofilms, plastic transport cannot be limited to the surface layer of a river, and must be examined within the whole water column, as for suspended sediments. These results imply that multipoint measurements are required for obtaining the spatial distribution of plastic concentration and are therefore a prerequisite for calculating the passing transport. The analysis of filtration efficiency and side-by-side measurements with different mesh sizes showed that 500 µm nets led to optimal results.

  15. Assessing Long-Term Wind Conditions by Combining Different Measure-Correlate-Predict Algorithms: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, J.; Chowdhury, S.; Messac, A.; Hodge, B. M.

    2013-08-01

    This paper significantly advances the hybrid measure-correlate-predict (MCP) methodology, enabling it to account for variations of both wind speed and direction. The advanced hybrid MCP method uses the recorded data of multiple reference stations to estimate the long-term wind condition at a target wind plant site. The results show that the accuracy of the hybrid MCP method is highly sensitive to the combination of the individual MCP algorithms and reference stations. It was also found that the best combination of MCP algorithms varies based on the length of the correlation period.
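
    The basic MCP building block is a regression of short-term concurrent target-site wind speeds on a reference station, applied to the reference's long-term record; a hybrid method then combines the estimates from several stations. The sketch below shows this with a simple linear MCP per station and fixed combination weights. All wind speeds and weights are hypothetical, and the paper's method additionally accounts for wind direction, which this sketch omits.

```python
# Minimal hybrid measure-correlate-predict (MCP) sketch: one linear
# regression per reference station, then a weighted combination.

def linfit(xs, ys):
    """Ordinary least-squares line: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Concurrent short-term records: reference vs target wind speeds (m/s)
ref1 = [4.0, 6.0, 8.0, 5.0]
ref2 = [3.5, 5.5, 7.0, 4.5]
target = [4.4, 6.5, 8.8, 5.6]

fits = [linfit(ref1, target), linfit(ref2, target)]
longterm_ref_means = [6.2, 5.4]          # long-term means at each reference

estimates = [a + b * m for (a, b), m in zip(fits, longterm_ref_means)]
weights = [0.6, 0.4]                     # e.g. from cross-validation skill
hybrid = sum(w * e for w, e in zip(weights, estimates))  # long-term estimate
```

    The paper's finding that accuracy is highly sensitive to the combination of algorithms and stations corresponds here to the choice of the regression form and of the weights.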

  16. Non-pharmacological sleep interventions for youth with chronic health conditions: a critical review of the methodological quality of the evidence.

    Science.gov (United States)

    Brown, Cary A; Kuo, Melissa; Phillips, Leah; Berry, Robyn; Tan, Maria

    2013-07-01

    Restorative sleep is clearly linked with well-being in youth with chronic health conditions. This review addresses the methodological quality of non-pharmacological sleep intervention (NPSI) research for youth with chronic health conditions. The Guidelines for Critical Review (GCR) and the Effective Public Health Practice Project Quality Assessment Tool (EPHPP) were used in the review. The search yielded 31 behavioural and 10 non-behavioural NPSIs for review. Most studies had fewer than 10 participants. Autism spectrum disorders, attention deficit/hyperactivity disorders, Down syndrome, intellectual disabilities, and visual impairments were the conditions that most studies focused upon. The global EPHPP scores indicated that most reviewed studies were of weak quality; only 7 studies were rated as moderate, and none were strong. Studies rated as weak quality frequently had recruitment issues, non-blinded participants/parents and/or researchers, and outcome measures without sound psychometric properties. Little conclusive evidence exists for NPSIs in this population. However, NPSIs are widely used, and these preliminary studies demonstrate promising outcomes. There have not been any published reports of negative outcomes that would preclude application of the different NPSIs on a case-by-case basis guided by clinical judgement. These findings support the need for more rigorous, applied research. • Methodological Quality of Sleep Research • Disordered sleep (DS) in youth with chronic health conditions is pervasive and is important to rehabilitation therapists because DS contributes to significant functional problems across psychological, physical and emotional domains. • Rehabilitation therapists and other healthcare providers receive little education about disordered sleep and are largely unaware of the range of assessment and non-pharmacological intervention strategies that exist. An evidence-based website of pediatric sleep resources can be found at http

  17. Methodology to measure strains at high temperatures using electrical strain gages with free filaments

    International Nuclear Information System (INIS)

    Atanazio Filho, Nelson N.; Gomes, Paulo T. Vida; Scaldaferri, Denis H.B.; Silva, Luiz L. da; Rabello, Emerson G.; Mansur, Tanius R.

    2013-01-01

    An experimental methodology for measuring strains at high temperatures is shown in this work. The measurements used electrical strain gages with free filaments, attached with specific cements to a stainless steel 304 beam. The beam has a triangular shape and a constant thickness, so the strain is the same along its length. Unless the beam surface is carefully prepared, the strain gage attachment is not effective. The results shown are for temperatures ranging from 20 deg C to 300 deg C, but the experimental methodology could be used to measure strains at temperatures up to 900 deg C. Analytical calculations based on solid mechanics were used to verify the strain gage electrical installation and the measured strains. First, beam strains were plotted as a function of temperature; then beam strains under different weights were plotted as a function of temperature. The results allow concluding that the experimental methodology is trustworthy for measuring strains at temperatures up to 300 deg C. (author)

  18. Measuring the Quality of Publications : New Methodology and Case Study

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; van Groenendaal, W.J.H.

    2000-01-01

    In practice, it is important to evaluate the quality of research, in order to make decisions on tenure, funding, and so on. This article develops a methodology using citations to measure the quality of journals, proceedings, and book publishers. (Citations are also used by the Science and Social
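
    A citation-based quality score, in its simplest form, normalizes citations received by the number of publications per outlet. The toy sketch below illustrates this; the outlets and counts are invented, and the paper's actual methodology involves more than this single ratio.

```python
# Toy citation-based quality score: citations per publication, per outlet.
# Outlets and counts are invented for illustration.

records = [
    # (outlet, papers published, citations received)
    ("Journal A", 120, 15_600),
    ("Journal B", 300, 9_000),
    ("Proceedings C", 450, 4_500),
]

scores = {name: cites / papers for name, papers, cites in records}
ranking = sorted(scores, key=scores.get, reverse=True)  # best outlet first
```

    Normalizing by output size matters: an outlet with many papers can accumulate more raw citations than a smaller but higher-impact one.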

  19. A methodology for hard/soft information fusion in the condition monitoring of aircraft

    Science.gov (United States)

    Bernardo, Joseph T.

    2013-05-01

    Condition-based maintenance (CBM) refers to the philosophy of performing maintenance when the need arises, based upon indicators of deterioration in the condition of the machinery. Traditionally, CBM involves equipping machinery with electronic sensors that continuously monitor components and collect data for analysis. The addition of the multisensory capability of human cognitive functions (i.e., sensemaking, problem detection, planning, adaptation, coordination, naturalistic decision making) to traditional CBM may create a fuller picture of machinery condition. Cognitive systems engineering techniques provide an opportunity to utilize a dynamic resource: people acting as soft sensors. The literature is extensive on techniques to fuse data from electronic sensors, but little work exists on fusing data from humans with that from electronic sensors (i.e., hard/soft fusion). The purpose of my research is to explore, observe, investigate, analyze, and evaluate the fusion of pilot and maintainer knowledge, experiences, and sensory perceptions with digital maintenance resources. Hard/soft information fusion has the potential to increase problem detection capability, improve flight safety, and increase mission readiness. This proposed project consists of the creation of a methodology based upon the Living Laboratories framework, a research methodology built upon cognitive engineering principles [1]. This study performs a critical assessment of concept, which will support development of activities to demonstrate hard/soft information fusion in operationally relevant scenarios of aircraft maintenance. It consists of fieldwork and knowledge elicitation to inform a simulation and a prototype.

  20. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    International Nuclear Information System (INIS)

    Tarifeño-Saldivia, Ariel; Pavez, Cristian; Soto, Leopoldo; Mayer, Roberto E

    2015-01-01

    This work introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from detection of the burst of neutrons. An improvement of more than one order of magnitude in the accuracy of a paraffin-wax-moderated 3He-filled tube is obtained by using this methodology with respect to previous calibration methods. (paper)
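
    The idea behind the charge-based estimate is that during a burst the individual pulses pile up, so counting them directly fails; instead, the number of detected neutrons is inferred from the total accumulated charge divided by the mean charge per single-neutron pulse, the latter obtained from the pulse-mode calibration. The sketch below shows this arithmetic with hypothetical values; the paper's statistical model additionally quantifies the uncertainty of this estimate.

```python
# Sketch of burst yield estimation from accumulated charge. All values
# are hypothetical illustrations.

q_mean = 2.5e-12   # mean charge per detected neutron, C (pulse-mode calibration)
Q_burst = 1.1e-8   # charge integrated over the burst, C

n_detected = Q_burst / q_mean      # estimated number of detected neutrons
efficiency = 1.0e-3                # counts per source neutron (assumed)
yield_estimate = n_detected / efficiency   # estimated neutrons in the burst
```

    The pulse-mode calibration fixes both q_mean and the detection efficiency; the burst measurement then only needs the integrated charge.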

  1. Determination of Radiological, Material and Organizational Measures for Reuse of Conditionally Released Materials from Decommissioning

    International Nuclear Information System (INIS)

    Ondra, F.; Vasko, M.; Necas, V.

    2012-01-01

    An important part of nuclear installation decommissioning is the conditional release of materials. The mass of conditionally released materials can significantly influence radioactive waste management and the capacity of the radioactive waste repository; the influence on total decommissioning cost is also not negligible. Several scenarios for the reuse of conditionally released materials were developed within the CONRELMAT project. Each scenario contains a preparation phase, a construction phase and an operation phase, and for each phase the radiological, material, organizational and other constraints on the reuse of conditionally released materials must be determined so that exposure limits for staff and the public are not exceeded. Constraints are determined on the basis of external and internal exposure calculations in models created for selected tasks in the particular scenario phases. The paper presents a methodology for determining those constraints that concern external exposure of staff or the public; calculated values of staff external exposure are also presented to verify that staff and public exposure does not exceed the limits. The methodology proposes the following constraints: limiting radionuclide concentrations in conditionally released materials for specific scenarios and nuclide vectors; specific deployment of conditionally released materials and, where necessary, shielding materials, staff and the public during the scenario phases; and organizational measures concerning the time staff or the public spend in the vicinity of conditionally released materials for the individual scenarios and nuclide vectors. The paper further describes the VISIPLAN 3D ALARA calculation planning software tool used to calculate staff and public external exposure for the individual scenarios. Several parallel papers proposed for HND2012 present selected details of the project. (author)

  2. Relative Hazard and Risk Measure Calculation Methodology

    International Nuclear Information System (INIS)

    Stenner, Robert D.; Strenge, Dennis L.; Elder, Matthew S.; Andrews, William B.; Walton, Terry L.

    2003-01-01

    The RHRM equations, as represented in the methodology and code presented in this report, are primarily a collection of key factors normally used in risk assessment that are relevant to understanding the hazards and risks associated with projected mitigation, cleanup, and risk management activities. The RHRM code has broad application potential. For example, it can be used to compare one mitigation, cleanup, or risk management activity with another, instead of comparing each only to a fixed baseline. If the appropriate source term data are available, it can be used in its non-ratio form to estimate absolute values of the associated controlling hazards and risks. These estimated values can then be examined to help understand which mitigation, cleanup, or risk management activities address the higher-hazard conditions and greater risk-reduction potential at a site. Graphics can be generated from these absolute controlling hazard and risk values to compare these conditions visually. If the RHRM code is used in this manner, care must be taken to specifically define and qualify the resultant absolute controlling hazard and risk values (e.g., identify which factors were considered and which ones tended to drive the hazard and risk estimates).

  3. Covariance methodology applied to uncertainties in I-126 disintegration rate measurements

    International Nuclear Information System (INIS)

    Fonseca, K.A.; Koskinas, M.F.; Dias, M.S.

    1996-01-01

    The covariance methodology applied to uncertainties in 126I disintegration-rate measurements is described. Two different coincidence systems were used because of the complex decay scheme of this radionuclide. The parameters involved in the determination of the disintegration rate in each experimental system have correlated components. In this case, the conventional statistical method for determining the uncertainties (the law of propagation without covariance terms) gives incorrect values for the final uncertainty; use of the covariance matrix methodology is therefore necessary. The data from both systems were combined taking into account all possible correlations between the partial uncertainties. (orig.)
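
    The difference between the naive law of propagation and the covariance treatment can be shown with a minimal example: for a quantity N = A·B whose input uncertainties are correlated, the variance picks up a cross term 2·(∂N/∂A)·(∂N/∂B)·cov(A,B). The numbers below are illustrative, not from the paper.

```python
# Uncertainty propagation for N = A * B with correlated inputs:
# contrast the naive (uncorrelated) formula with the covariance form.
# Values are illustrative.

A, B = 100.0, 2.0
uA, uB = 1.0, 0.04
cov_AB = 0.03                      # covariance between A and B (assumed)

# Jacobian of N = A*B: dN/dA = B, dN/dB = A
jac = (B, A)
var_naive = (jac[0] * uA) ** 2 + (jac[1] * uB) ** 2
var_full = var_naive + 2 * jac[0] * jac[1] * cov_AB   # add cross term

u_naive = var_naive ** 0.5
u_full = var_full ** 0.5           # correct uncertainty with correlation
```

    With positive correlation the naive formula understates the uncertainty, which is exactly the failure mode the abstract warns about; with negative correlation it would overstate it.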

  4. Defining Multiple Chronic Conditions for Quality Measurement.

    Science.gov (United States)

    Drye, Elizabeth E; Altaf, Faseeha K; Lipska, Kasia J; Spatz, Erica S; Montague, Julia A; Bao, Haikun; Parzynski, Craig S; Ross, Joseph S; Bernheim, Susannah M; Krumholz, Harlan M; Lin, Zhenqiu

    2018-02-01

    Patients with multiple chronic conditions (MCCs) are a critical but undefined group for quality measurement. We present a generally applicable systematic approach to defining an MCC cohort of Medicare fee-for-service beneficiaries that we developed for a national quality measure, risk-standardized rates of unplanned admissions for Accountable Care Organizations. To define the MCC cohort we: (1) identified potential chronic conditions; (2) set criteria for cohort conditions based on MCC framework and measure concept; (3) applied the criteria informed by empirical analysis, experts, and the public; (4) described "broader" and "narrower" cohorts; and (5) selected final cohort with stakeholder input. Subjects were patients with chronic conditions. Participants included 21.8 million Medicare fee-for-service beneficiaries in 2012 aged 65 years and above with ≥1 of 27 Medicare Chronic Condition Warehouse condition(s). In total, 10 chronic conditions were identified based on our criteria; 8 of these 10 were associated with notably increased admission risk when co-occurring. A broader cohort (2+ of the 8 conditions) included 4.9 million beneficiaries (23% of total cohort) with an admission rate of 70 per 100 person-years. It captured 53% of total admissions. The narrower cohort (3+ conditions) had 2.2 million beneficiaries (10%) with 100 admissions per 100 person-years and captured 32% of admissions. Most stakeholders viewed the broader cohort as best aligned with the measure concept. By systematically narrowing chronic conditions to those most relevant to the outcome and incorporating stakeholder input, we defined an MCC admission measure cohort supported by stakeholders. This approach can be used as a model for other MCC outcome measures.
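
    The final cohort-selection step reduces to counting qualifying conditions per beneficiary and thresholding. The sketch below illustrates the "broader" (2+ conditions) and "narrower" (3+ conditions) cohorts; the beneficiary records are invented, and the condition labels merely stand in for the 8 conditions the measure developers retained.

```python
# Illustrative cohort selection: count qualifying chronic conditions per
# beneficiary, then form 2+ ("broader") and 3+ ("narrower") cohorts.
# Records and condition labels are hypothetical.

beneficiaries = {
    "b1": {"CHF", "diabetes"},
    "b2": {"CHF", "diabetes", "CKD"},
    "b3": {"COPD"},
    "b4": {"CHF", "CKD", "depression", "COPD"},
}

broader = {bid for bid, conds in beneficiaries.items() if len(conds) >= 2}
narrower = {bid for bid, conds in beneficiaries.items() if len(conds) >= 3}
```

    The trade-off the abstract quantifies is visible even here: the broader cohort captures more beneficiaries and more admissions, while the narrower cohort concentrates on those at highest admission risk.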

  5. Personal dosimetry service of TECNATOM: measurement system and methodology of calibration

    International Nuclear Information System (INIS)

    Marchena, Paloma; Bravo, Borja

    2008-01-01

    Full text: The implementation of a new integrated and practical working tool called ALEDIN within the Personal Dosimetry Service (PDS) of TECNATOM has harmonized the methodology for counting acquisition, detector calibration and data analysis in a friendly Windows (registered mark) environment. Knowledge of this methodology, the final product of an R and D project, will help users and the Regulatory Body to better understand the measurement of internal activity in individuals, allowing more precise error identification and correction and improving the whole internal dosimetry process. The development and implementation of a new calibration system for the whole body counters using NaI(Tl) detectors, together with a new anthropometric humanoid phantom of the BOMAB type with uniformly distributed radioactive sources, allow a better energy and activity calibration for different counting geometries, covering a wide range of gamma spectra from low energies (below 100 keV) up to about 2000 keV. This new calibration methodology implied the development of an improved system for determining isotopic activity. The new system has been integrated in a Windows (registered mark) environment for counting acquisition and data analysis in the whole body counters (WBC), in cross-connection with the INDAC software, which interprets the measured activity as committed effective dose following all the new ICRP recommendations and dosimetric models for internal dose and bioassay measurements. (author)

  6. Thermotactile perception thresholds measurement conditions.

    Science.gov (United States)

    Maeda, Setsuo; Sakakibara, Hisataka

    2002-10-01

    The purpose of this paper is to investigate the effects of posture, push force and rate of temperature change on thermotactile thresholds and to clarify suitable measuring conditions for Japanese people. Thermotactile (warm and cold) thresholds on the right middle finger were measured with an HVLab thermal aesthesiometer. Subjects were eight healthy male Japanese students. The effects of posture were examined with the straight hand and forearm placed on a support, the same posture without a support, and with the fingers and hand flexed at the wrist and the elbow placed on a desk. The finger push force applied to the applicator of the thermal aesthesiometer was controlled at 0.5, 1.0, 2.0 and 3.0 N. The applicator temperature was changed at rates of 0.5, 1.0, 1.5, 2.0 and 2.5 degrees C/s. After each measurement, subjects were asked about comfort under the measuring conditions. Three series of experiments were conducted on different days to evaluate repeatability. Repeated-measures ANOVA showed that warm thresholds were affected by the push force and the rate of temperature change, and that cold thresholds were influenced by posture and push force. The comfort assessment indicated that the posture with the straight hand and forearm laid on a support was the most comfortable for the subjects. Relatively high repeatability was obtained under measurement conditions of a 1.0 degrees C/s temperature change rate and a 0.5 N push force. Measurement posture, push force and rate of temperature change can all affect the thermal threshold. Judging from the repeatability, a push force of 0.5 N and a temperature change rate of 1.0 degrees C/s, in the posture with the straight hand and forearm laid on a support, are recommended for warm and cold threshold measurements.

  7. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have implicitly parallel nature.

  8. Development of a field measurement methodology for studying the thermal indoor environment in hybrid GEOTABS buildings

    DEFF Research Database (Denmark)

    Kazanci, Ongun Berk; Khovalyg, Dolaana; Olesen, Bjarne W.

    2018-01-01

    The three demonstration buildings were an office building in Luxembourg, an elderly care home in Belgium, and an elementary school in the Czech Republic. All of these buildings are equipped with hybrid GEOTABS systems; however, they vary in size and function, which requires a unique measurement methodology for studying them. These buildings already have advanced Building Management Systems (BMS); however, a more detailed measurement plan was needed for the purposes of the project to document the current performance of these systems regarding the thermal indoor environment and energy performance, and to be able to document the improvements after the implementation of the MPC. This study provides the details of the field measurement methodology developed for each of these buildings to study indoor environmental quality (IEQ) in detail. The developed measurement methodology can be applied to other buildings.

  9. Optimization extraction conditions for improving phenolic content and antioxidant activity in Berberis asiatica fruits using response surface methodology (RSM).

    Science.gov (United States)

    Belwal, Tarun; Dhyani, Praveen; Bhatt, Indra D; Rawal, Ranbeer Singh; Pande, Veena

    2016-09-15

    This study was designed, for the first time, to optimize the extraction of phenolic compounds and the antioxidant potential of Berberis asiatica fruits using response surface methodology (RSM). Solvent selection was based on preliminary experiments and a five-factor, three-level central composite design (CCD). Extraction temperature (X1), sample-to-solvent ratio (X3) and solvent concentration (X5) significantly affected the response variables. The quadratic model fitted all the responses well. Under optimal extraction conditions, the dried fruit sample was mixed with 80% methanol (pH 3.0) at a ratio of 1:50 and the mixture was heated at 80 °C for 30 min; the measured parameters were found to be in accordance with the predicted values. High-performance liquid chromatography (HPLC) analysis at the optimized conditions revealed six phenolic compounds. The results suggest that optimization of the extraction conditions is critical for accurate quantification of phenolics and antioxidants in Berberis asiatica fruits, and may further be utilized for industrial extraction procedures. Copyright © 2016 Elsevier Ltd. All rights reserved.
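The core RSM step, fitting a second-order model to measured responses and locating its stationary point, can be sketched in a single coded factor (all numbers are invented; the study fitted a five-factor quadratic model):

```python
# Minimal sketch of the RSM idea: fit a quadratic model to responses at
# coded factor levels, then locate the optimum. Data are invented for
# illustration, not taken from the Berberis study.

def fit_quadratic(points):
    """Exact quadratic a + b*x + c*x**2 through three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = points
    # Divided-difference coefficients
    c = ((y3 - y1) / (x3 - x1) - (y2 - y1) / (x2 - x1)) / (x3 - x2)
    b = (y2 - y1) / (x2 - x1) - c * (x1 + x2)
    a = y1 - b * x1 - c * x1 ** 2
    return a, b, c

# Hypothetical total-phenolic response at coded temperature levels -1, 0, +1
obs = [(-1.0, 18.2), (0.0, 24.5), (1.0, 22.1)]
a, b, c = fit_quadratic(obs)
x_opt = -b / (2 * c)  # stationary point of the fitted curve (max since c < 0)
print(round(x_opt, 3))  # 0.224
```

In a real CCD the same idea extends to cross-product and squared terms in all factors, fitted by least squares rather than exact interpolation.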

  10. 42 CFR 486.318 - Condition: Outcome measures.

    Science.gov (United States)

    2010-10-01

    42 CFR 486.318 (Public Health, 2010), Organ Procurement Organization Outcome Requirements, § 486.318, Condition: Outcome measures. (a) …, territories, or possessions, an OPO must meet all 3 of the following outcome measures: (1) The OPO's donation …

  11. Does Methodological Guidance Produce Consistency? A Review of Methodological Consistency in Breast Cancer Utility Value Measurement in NICE Single Technology Appraisals.

    Science.gov (United States)

    Rose, Micah; Rice, Stephen; Craig, Dawn

    2017-07-05

    Since 2004, National Institute for Health and Care Excellence (NICE) methodological guidance for technology appraisals has emphasised a strong preference for using the validated EuroQol 5-Dimensions (EQ-5D) quality-of-life instrument, measuring patient health status from patients or carers, and using the general public's preference-based valuation of different health states when assessing health benefits in economic evaluations. The aim of this study was to review all NICE single technology appraisals (STAs) for breast cancer treatments to explore consistency in the use of utility scores in light of NICE methodological guidance. A review of all published breast cancer STAs was undertaken using all publicly available STA documents for each included assessment. Utility scores were assessed for consistency with NICE-preferred methods and original data sources. Furthermore, academic assessment group work undertaken during the STA process was examined to evaluate the emphasis of NICE-preferred quality-of-life measurement methods. Twelve breast cancer STAs were identified, and many STAs used evidence that did not follow NICE's preferred utility score measurement methods. Recent STA submissions show companies using EQ-5D and mapping. Academic assessment groups rarely emphasized NICE-preferred methods, and queries about preferred methods were rare. While there appears to be a trend in recent STA submissions towards following NICE methodological guidance, historically STA guidance in breast cancer has generally not used NICE's preferred methods. Future STAs in breast cancer and reviews of older guidance should ensure that utility measurement methods are consistent with the NICE reference case to help produce consistent, equitable decision making.

  12. THE MEASUREMENT METHODOLOGY IMPROVEMENT OF THE HORIZONTAL IRREGULARITIES IN PLAN

    Directory of Open Access Journals (Sweden)

    O. M. Patlasov

    2015-08-01

    Full Text Available. Purpose. Across the track superstructure (TSS) there are structures for which the standard approach to deciding on their future operation is not entirely correct or acceptable. In particular, this concerns track sections whose geometric parameters change quite quickly: the radius of curvature, the angle of rotation, and the like. Examples of such portions of the TSS are crossovers, whose connecting part, over a sufficiently short length, substantially changes curvature. Estimating the position of such a structure in plan on the basis of the existing technique (by the difference in adjacent versines) is virtually impossible. It is therefore proposed to supplement and improve the methodology for assessing the position of a curve in plan based on the difference in adjacent versines. Methodology. The possible options for measuring horizontal curves in plan were analyzed. The most adequate method, which does not contradict existing standards with respect to their applicability, was determined. Ease of measurement and calculation was taken into account. Findings. Qualitative and quantitative verification of the proposed and existing methods showed very good agreement of the measurement results. This gives grounds to assert that the methodology can be recommended to track-facility workers for the assessment of horizontal irregularities in plan, not only for ordinary curves but also within the connecting part of crossovers. Originality. The existing method for evaluating the geometric position of curves in plan was improved. It does not create new regulations; all results are evaluated against existing norms. Practical value. The proposed technique makes it possible, without creating a new regulatory framework, to build on the existing one while expanding the boundaries of its application. This method can be used not only for ordinary curves…
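The versine comparison the method rests on can be sketched with the standard mid-chord offset relation v = c^2/(8R). The radii below, imitating a crossover's connecting part where curvature tightens over a short length, are invented:

```python
# Sketch of the versine (mid-chord offset) check discussed above: for a
# chord of length c on a curve of radius R, the versine is v = c**2 / (8*R).
# Within a connecting (transition) region the radius changes quickly, so
# adjacent-versine differences grow even on geometrically perfect track.

def versine(radius_m, chord_m=20.0):
    return chord_m ** 2 / (8.0 * radius_m)

# Hypothetical radii at stations 20 m apart: circular curve, then a
# connecting part where the radius drops from 600 m to 300 m.
radii = [600, 600, 600, 400, 300, 300]
v = [round(versine(r) * 1000, 1) for r in radii]   # versines in mm
dv = [round(b - a, 1) for a, b in zip(v, v[1:])]   # adjacent differences
print(v)   # [83.3, 83.3, 83.3, 125.0, 166.7, 166.7]
print(dv)  # [0.0, 0.0, 41.7, 41.7, 0.0]
```

The large adjacent differences inside the connecting part are exactly why a fixed difference-based tolerance, suitable for ordinary curves, misjudges such sections.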

  13. A Methodological Demonstration of Set-theoretical Approach to Social Media Maturity Models Using Necessary Condition Analysis

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Despite being widely accepted and applied across research domains, maturity models have been criticized for lacking academic rigor; in particular, methodologically rigorous and empirically grounded or tested maturity models are quite rare. Attempting to close this gap, we adopt a set-theoretic approach by applying the Necessary Condition Analysis (NCA) technique to derive maturity stages and stage boundary conditions. The ontology is to view stages (boundaries) in maturity models as a collection of necessary conditions. Using social media maturity data, we demonstrate the strength of our approach and evaluate some of the arguments presented by previous, conceptually focused social media maturity models.
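A flavor of what NCA computes can be given with a rough sketch of a CE-FDH-style ceiling line, the step-function envelope used to treat a condition X as necessary for an outcome Y (data below are invented):

```python
# Rough sketch of a CE-FDH-style ceiling line. NCA proper also computes
# effect sizes from the ceiling-zone area; this sketch only builds the
# step-function ceiling from illustrative data.

def ce_fdh_ceiling(points):
    """Step-function ceiling: highest Y observed at or below each X level."""
    ceiling = {}
    best = float("-inf")
    for x, y in sorted(points):
        best = max(best, y)
        ceiling[x] = best
    return ceiling

# Hypothetical (social-media capability level, maturity score) observations
data = [(1, 2), (2, 3), (2, 5), (3, 4), (4, 7), (5, 6)]
print(ce_fdh_ceiling(data))  # {1: 2, 2: 5, 3: 5, 4: 7, 5: 7}
```

Read as a necessary condition: a maturity score of 7 is only ever observed at capability level 4 or above, so reaching level 4 is necessary (though not sufficient) for that score.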

  14. Characterization of gloss properties of differently treated polymer coating surfaces by surface clarity measurement methodology.

    Science.gov (United States)

    Gruber, Dieter P; Buder-Stroisznigg, Michael; Wallner, Gernot; Strauß, Bernhard; Jandel, Lothar; Lang, Reinhold W

    2012-07-10

    With a single measurement configuration, existing gloss measurement methodologies are generally restricted to specific gloss levels. A newly developed image-analytical gloss parameter called "clarity" makes it possible to describe the perceptual result of a broad range of gloss levels with one setup. In order to analyze, and ultimately monitor, the perceived gloss of products, a fast and flexible method suitable also for automated inspection is in high demand. The clarity parameter is very fast to calculate and therefore usable for fast in-line surface inspection. Coated metal specimens were deformed to varying degrees and polished afterwards in order to study how well the clarity parameter quantifies varying surface gloss types and levels. In order to analyze the correlation with human gloss perception, a study was carried out in which experts were asked to assess the gloss properties of a series of surface samples under standardized conditions. The study confirmed that clarity exhibits considerably better correlation with human perception than alternative gloss parameters.

  15. Methodology of environmental risk assessment management

    Directory of Open Access Journals (Sweden)

    Saša T. Bakrač

    2012-04-01

    Full Text Available Successful protection of environment is mostly based on high-quality assessment of potential and present risks. Environmental risk management is a complex process which includes: identification, assessment and control of risk, namely taking measures in order to minimize the risk to an acceptable level. Environmental risk management methodology: In addition to these phases in the management of environmental risk, appropriate measures that affect the reduction of risk occurrence should be implemented: - normative and legal regulations (laws and regulations, - appropriate organizational structures in society, and - establishing quality monitoring of environment. The emphasis is placed on the application of assessment methodologies (three-model concept, as the most important aspect of successful management of environmental risk. Risk assessment methodology - European concept: The first concept of ecological risk assessment methodology is based on the so-called European model-concept. In order to better understand this ecological risk assessment methodology, two concepts - hazard and risk - are introduced. The European concept of environmental risk assessment has the following phases in its implementation: identification of hazard (danger, identification of consequences (if there is hazard, estimate of the scale of consequences, estimate of consequence probability and risk assessment (also called risk characterization. The European concept is often used to assess risk in the environment as a model for addressing the distribution of stressors along the source - path - receptor line. Risk assessment methodology - Canadian concept: The second concept of the methodology of environmental risk assessment is based on the so-called Canadian model-concept. 
The assessment of ecological risk includes risk arising from natural events (floods, extreme weather conditions, etc., technological processes and products, agents (chemical, biological, radiological, etc

  16. Two methodologies for optical analysis of contaminated engine lubricants

    International Nuclear Information System (INIS)

    Aghayan, Hamid; Yang, Jun; Bordatchev, Evgueni

    2012-01-01

    The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, have been intensively utilized in engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase the drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in engine lubricants. The methodologies are based on optical analysis of the distortion effect that arises when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies lies in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant-object optical appearance, where the a priori known periodic structure of the object is distorted by the contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image are newly proposed: the gray-scale intensity of lubricant and object, the shape width at object and lubricant levels, the object relative intensity and the width non-uniformity coefficient. Variations in the contaminant concentration and the use of different contaminants lead to changes in these parameters measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectra, transfer functions…

  17. Telemetric measurement system of beehive environment conditions

    Science.gov (United States)

    Walendziuk, Wojciech; Sawicki, Aleksander

    2014-11-01

    This work presents a measurement system for beehive environmental conditions. The purpose of the device is to measure parameters such as ambient temperature, atmospheric pressure, internal temperature, humidity and sound level. The measured values were transferred over the GPRS protocol to a MySQL database located on an external server. A website presents the measurement data in the form of tables and graphs. The study also shows exemplary results of environmental condition measurements recorded in the beehive in an hourly cycle.
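The logging chain described (sensor reading, remote database insert, later retrieval for tables and graphs) can be sketched as follows. The study pushed readings over GPRS to a remote MySQL server; sqlite3 is used here only so the sketch runs self-contained, and the field names are assumptions:

```python
# Minimal stand-in for the beehive logging chain described above.
# Assumed schema; the real system used a MySQL database on an external server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE hive_log (
    ts TEXT, ambient_temp_c REAL, internal_temp_c REAL,
    humidity_pct REAL, pressure_hpa REAL, sound_db REAL)""")

def log_reading(conn, ts, ambient, internal, humidity, pressure, sound):
    conn.execute("INSERT INTO hive_log VALUES (?, ?, ?, ?, ?, ?)",
                 (ts, ambient, internal, humidity, pressure, sound))
    conn.commit()

log_reading(conn, "2014-07-01T12:00", 24.3, 34.8, 61.0, 1013.2, 52.5)
row = conn.execute("SELECT internal_temp_c, sound_db FROM hive_log").fetchone()
print(row)  # (34.8, 52.5)
```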

  18. A methodology for on-line calculation of temperature and thermal stress under non-linear boundary conditions

    International Nuclear Information System (INIS)

    Botto, D.; Zucca, S.; Gola, M.M.

    2003-01-01

    In the literature many works have been written dealing with the task of on-line calculation of temperature and thermal stress for machine components and structures, in order to evaluate fatigue damage accumulation and estimate residual life. One of the most widespread methodologies is the Green's function technique (GFT), by which machine parameters such as fluid temperatures, pressures and flow rates are converted into metal temperature transients and thermal stresses. However, since the GFT is based upon the linear superposition principle, it cannot be directly used in the case of varying heat transfer coefficients. In the present work, a different methodology is proposed, based upon CMS for temperature transient calculation and upon the GFT for the related thermal stress evaluation. This new approach allows variable heat transfer coefficients to be accounted for. The methodology is applied for two different case studies, taken from the literature: a thick pipe and a nozzle connected to a spherical head, both subjected to multiple convective boundary conditions
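In discrete time, the superposition underlying the GFT can be sketched as follows. The unit-step response G and the temperature history are invented; as the abstract notes, this superposition is only valid while the boundary conditions (and hence G) stay linear and fixed:

```python
# Discrete Duhamel superposition in the spirit of the Green's function
# technique: sigma(t) = sum_k G(t - k) * dT(k), where G is a precomputed
# unit-step thermal stress response and dT are fluid-temperature increments.
# G is clamped to its last (steady-state) value for long elapsed times.

def gft_response(G, dT):
    out = []
    for t in range(len(dT)):
        out.append(sum(G[min(t - k, len(G) - 1)] * dT[k] for k in range(t + 1)))
    return out

G = [0.0, 0.5, 0.8, 1.0]          # unit-step stress response (assumed values)
dT = [10.0, 0.0, 0.0, 0.0, 0.0]   # single 10-degree fluid-temperature step
print(gft_response(G, dT))  # [0.0, 5.0, 8.0, 10.0, 10.0]
```

A varying heat transfer coefficient changes G itself between steps, which breaks this superposition; that is the gap the paper's CMS-based temperature calculation addresses.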

  19. Optimization on Preparation Condition of Propolis Flavonoids Liposome by Response Surface Methodology and Research of Its Immunoenhancement Activity

    Directory of Open Access Journals (Sweden)

    Ju Yuan

    2013-01-01

    Full Text Available. The aim of this study was to prepare propolis flavonoids liposome (PFL), to optimize the preparation conditions, and to investigate further whether liposomes could promote the immunoenhancement activity of propolis flavonoids (PF). PFL was prepared with the ethanol injection method, and the preparation conditions were optimized with response surface methodology (RSM). Moreover, the immunoenhancement activity of PFL and PF in vitro was determined. The results showed that the optimal preparation conditions for PFL by response surface methodology were as follows: ratio of lipid to drug (w/w) 9.6:1, ratio of soybean phospholipid to cholesterol (w/w) 8.5:1, and speed of injection 0.8 mL·min−1. Under these conditions, the experimental encapsulation efficiency of PFL was 91.67 ± 0.21%, which was close to the predicted value; the optimized preparation conditions are therefore very reliable. Moreover, the results indicated that PFL could not only significantly promote lymphocyte proliferation, singly or synergistically with PHA, but also increase the expression levels of IL-2 and IFN-γ mRNA. This indicates that liposomes can significantly improve the immunoenhancement activity of PF. PFL demonstrates significant immunoenhancement activity, which provides a theoretical basis for further experiments in vivo.

  20. Indoor radon measurements and methodologies in Latin American countries

    International Nuclear Information System (INIS)

    Canoba, A.; Lopez, F.O.; Arnaud, M.I.; Oliveira, A.A.; Neman, R.S.; Hadler, J.C.; Iunes, P.J.; Paulo, S.R.; Osorio, A.M.; Aparecido, R.; Rodriguez, C.; Moreno, V.; Vasquez, R.; Espinosa, G.; Golzarri, J.I.; Martinez, T.; Navarrete, M.; Cabrera, I.; Segovia, N.; Pena, P.; Tamez, E.; Pereyra, P.; Lopez-Herrera, M.E.; Sajo-Bohus, L.

    2001-01-01

    According to current international guidelines concerning environmental problems, it is necessary to evaluate and know indoor radon levels, especially since most of the natural radiation dose to man comes from radon gas and its progeny. Several countries have established national institutions and national programs for the study of radon and its connection with lung cancer risk and public health. The aim of this work is to present the indoor radon measurements and the detection methods used in different regions of Latin America (LA), in countries such as Argentina, Brazil, Ecuador, Mexico, Peru and Venezuela. This study shows that passive radon devices based on alpha-particle nuclear track methodology (NTM) are among the most widely used methods in LA for long-term indoor radon measurements, CR-39, LR-115 and Makrofol being the most commonly used detector materials. The participating institutions and the radon level measurements in the different countries are presented in this contribution.

  1. Methodology of clinical measures of healthcare quality delivered to patients with cardiovascular diseases

    Directory of Open Access Journals (Sweden)

    Posnenkova O.M.

    2014-03-01

    Full Text Available. We present the results of implementing the methodology proposed by the American College of Cardiology and the American Heart Association (ACC/AHA) for the development of Russian clinical quality measures for patients with arterial hypertension, coronary heart disease and chronic heart failure. The created quality measures cover the key elements of medical care that directly influence the clinical outcomes of treatment.

  2. ANL calculational methodologies for determining spent nuclear fuel source term

    International Nuclear Information System (INIS)

    McKnight, R. D.

    2000-01-01

    Over the last decade Argonne National Laboratory has developed reactor depletion methods and models to determine radionuclide inventories of irradiated EBR-II fuels. Predicted masses based on these calculational methodologies have been validated using available data from destructive measurements--first from measurements of lead EBR-II experimental test assemblies and later using data obtained from processing irradiated EBR-II fuel assemblies in the Fuel Conditioning Facility. Details of these generic methodologies are described herein. Validation results demonstrate these methods meet the FCF operations and material control and accountancy requirements

  3. Methodological Comparison between a Novel Automatic Sampling System for Gas Chromatography versus Photoacoustic Spectroscopy for Measuring Greenhouse Gas Emissions under Field Conditions

    Directory of Open Access Journals (Sweden)

    Alexander J. Schmithausen

    2016-10-01

    Full Text Available. Trace gases such as nitrous oxide (N2O), methane (CH4), and carbon dioxide (CO2) are climate-related gases, and their emissions from agricultural livestock barns are not negligible. Conventional measurement systems in the field (Fourier transform infrared spectroscopy (FTIR); photoacoustic system (PAS)) are not sufficiently sensitive to N2O. Laser-based measurement systems are highly accurate, but they are very expensive to purchase and maintain. One cost-effective alternative is gas chromatography (GC) with electron capture detection (ECD), but this is not suitable for field applications due to its radiation source. Measuring samples collected automatically under field conditions in the laboratory at a later time presents many challenges. This study presents a sampling system designed to enable laboratory analysis of N2O concentrations sampled under field conditions. Analyses were carried out using PAS in the field (online system) and GC in the laboratory (offline system). Both measurement systems showed a good correlation for CH4 and CO2 concentrations. Measured N2O concentrations were near the detection limit for PAS.
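The online/offline comparison at the heart of the study amounts to correlating paired concentration readings; a minimal sketch with invented numbers (the study found good agreement for CH4 and CO2, but PAS near its detection limit for N2O):

```python
# Sketch of the methodological comparison: Pearson correlation between
# field (online) and laboratory (offline) readings of the same samples.
# All concentration values are invented for illustration.
from math import sqrt

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

pas_ch4 = [2.1, 3.4, 5.0, 6.2, 7.9]   # ppm, field (online)
gc_ch4 = [2.0, 3.6, 4.9, 6.5, 7.7]    # ppm, laboratory (offline)
print(round(pearson_r(pas_ch4, gc_ch4), 3))  # 0.995
```

For N2O the same computation becomes unreliable: when one instrument sits at its detection limit, its readings collapse toward a floor value and the correlation is dominated by noise.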

  4. Relative Hazard and Risk Measure Calculation Methodology Rev 1

    International Nuclear Information System (INIS)

    Stenner, Robert D.; White, Michael K.; Strenge, Dennis L.; Aaberg, Rosanne L.; Andrews, William B.

    2000-01-01

    Documentation of the methodology used to calculate relative hazard and risk measure results for the DOE complex-wide risk profiles. This methodology is used on major site risk profiles. In February 1997, the Center for Risk Excellence (CRE) was created and charged as a technical, field-based partner to the Office of Science and Risk Policy (EM-52). One of the initial charges to the CRE is to assist the sites in the development of "site risk profiles." These profiles are to be relatively short summaries (periodically updated) that present a broad perspective on the major risk-related challenges that face the respective site. The risk profiles are intended to serve as a high-level communication tool for interested internal and external parties to enhance the understanding of these risk-related challenges. The risk profiles for each site have been designed to qualitatively present the following information: (1) a brief overview of the site, (2) a brief discussion of the historical mission of the site, (3) a quote from the site manager indicating the site's commitment to risk management, (4) a listing of the site's top risk-related challenges, (5) a brief discussion and detailed table presenting the site's current risk picture, (6) a brief discussion and detailed table presenting the site's future risk reduction picture, and (7) graphic illustrations of the projected management of the relative hazards at the site. The graphic illustrations were included to provide the reader with a high-level mental picture to associate with the qualitative information presented in the risk profile. Inclusion of these graphic illustrations presented the CRE with the challenge of folding this high-level qualitative risk information into a system that produces a numeric result depicting the relative change in hazard associated with each major risk management action, so that it could be presented graphically. This report presents the methodology developed.

  5. Methodological possibilities for using the electron and ion energy balance in thermospheric complex measurements

    International Nuclear Information System (INIS)

    Serafimov, K.B.; Serafimova, M.K.

    1991-01-01

    Combination of ground-based measurements for the determination of basic thermospheric characteristics is proposed. An expression for the energy transport between components of space plasma is also derived and discussed within the framework of the presented methodology, which can be divided into the following major sections: 1) application of ionosonde, absorption measurements, and TEC measurements using Faraday rotation or the differential Doppler effect; 2) ground-based airglow measurements; 3) airglow and plasma satellite measurements. 9 refs

  6. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    Directory of Open Access Journals (Sweden)

    Johnston Marie

    2007-03-01

    Full Text Available. Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcomes. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing its value. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of (i) its theoretical framework (was there a definition of what was being assessed, and was it part of a theoretical model?) and (ii) its methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest), and no measure assessed was part of a theoretical model. None of the clinician-report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of theoretical framework for both clinician-report and patient self-report measures. It also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological…

  7. A combined linear optimisation methodology for water resources allocation in Alfeios River Basin (Greece) under uncertain and vague system conditions

    Science.gov (United States)

    Bekri, Eleni; Yannopoulos, Panayotis; Disse, Markus

    2013-04-01

    In the present study, a combined linear programming methodology, based on Li et al. (2010) and Bekri et al. (2012), is employed for optimizing water allocation under uncertain system conditions in the Alfeios River Basin, in Greece. The Alfeios River is a water resources system of great natural, ecological, social and economic importance for Western Greece, since it has the longest and highest flow rate watercourse in the Peloponnisos region. Moreover, the river basin was exposed in the last decades to a plethora of environmental stresses (e.g. hydrogeological alterations, intensively irrigated agriculture, surface and groundwater overexploitation and infrastructure developments), resulting in the degradation of its quantitative and qualitative characteristics. As in most Mediterranean countries, water resource management in Alfeios River Basin has been focused up to now on an essentially supply-driven approach. It is still characterized by a lack of effective operational strategies. Authority responsibility relationships are fragmented, and law enforcement and policy implementation are weak. The present regulated water allocation puzzle entails a mixture of hydropower generation, irrigation, drinking water supply and recreational activities. Under these conditions its water resources management is characterised by high uncertainty and by vague and imprecise data. The considered methodology has been developed in order to deal with uncertainties expressed as either probability distributions, or/and fuzzy boundary intervals, derived by associated α-cut levels. In this framework a set of deterministic submodels is studied through linear programming. The ad hoc water resources management and alternative management patterns in an Alfeios subbasin are analyzed and evaluated under various scenarios, using the above mentioned methodology, aiming to promote a sustainable and equitable water management. Li, Y.P., Huang, G.H. and S.L., Nie, (2010), Planning water resources
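The alpha-cut idea behind the combined methodology can be shown with a deliberately tiny stand-in: fuzzy water availability is resolved into crisp intervals at each alpha level, and a (here trivially solvable) allocation problem is evaluated at the interval bounds. The two-user setup and all numbers are invented; the actual study couples many users, probabilistic inflows, and the interval/fuzzy linear programming formulation of Li et al. (2010):

```python
# Toy alpha-cut resolution of a fuzzy availability followed by a priority
# allocation. A greedy rule stands in for the linear program so the sketch
# stays self-contained; demands and the fuzzy number are assumptions.

def allocate(avail, demand_irrig=60.0, demand_drink=30.0):
    """Drinking water is served first; irrigation gets the remainder."""
    drink = min(avail, demand_drink)
    irrig = min(avail - drink, demand_irrig)
    return drink, irrig

# Triangular fuzzy availability (70, 90, 110): alpha-cut [70+20a, 110-20a]
for alpha in (0.0, 0.5, 1.0):
    lo, hi = 70 + 20 * alpha, 110 - 20 * alpha
    print(alpha, allocate(lo), allocate(hi))
```

At alpha = 0 the allocation is reported as an interval (irrigation gets 40 to 60 units); as alpha rises toward 1 the interval tightens to the most plausible value, which is how the deterministic submodels bracket the fuzzy solution.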

  8. Prediction of selectivity from morphological conditions: Methodology and a case study on cod (Gadus morhua)

    DEFF Research Database (Denmark)

    Herrmann, Bent; Krag, Ludvig Ahm; Frandsen, Rikke

    2009-01-01

    The FISHSELECT methodology, tools, and software were developed and used to measure the morphological parameters that determine the ability of cod to penetrate different mesh types, sizes, and openings. The shape of one cross-section at the cod's head was found to explain 97.6% of the mesh...

  9. How Much Can Non-industry Standard Measurement Methodologies Benefit Methane Reduction Programs?

    Science.gov (United States)

    Risk, D. A.; O'Connell, L.; Atherton, E.

    2017-12-01

    In recent years, energy sector methane emissions have been recorded in large part by applying modern non-industry-standard techniques. Industry may lack the regulatory flexibility to use such techniques, or in some cases may not understand the possible associated economic advantage. As progressive jurisdictions move from estimation towards routine measurement, the research community should provide guidance to help regulators and companies measure more effectively, and economically if possible. In this study, we outline a modelling experiment in which we explore the integration of non-industry-standard measurement techniques as part of a generalized compliance measurement program. The study was not intended to be exhaustive, or to recommend particular combinations, but instead to explore the inter-relationships between methodology, development type, and compliance practice. We first defined the role, applicable scale, detection limits, working distances, and approximate deployment cost of several measurement methodologies. We then considered a variety of development types differing mainly in footprint, density, and emissions "profile". Using a Monte Carlo approach, we evaluated the effect of these various factors on the cost and confidence of the compliance measurement program. We found that when added individually, some of the research techniques were indeed able to deliver an improvement in cost and/or confidence when used alongside industry-standard Optical Gas Imaging. When applied in combination, the ideal fraction of each measurement technique depended on development type, emission profile, and whether confidence or cost was more important. Results suggest that measurement cost and confidence could be improved if energy companies exploited a wider range of measurement techniques, and in a manner tailored to each development. In the short-term, combining clear scientific guidance with economic information could benefit immediate mitigation efforts over
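A toy Monte Carlo in the spirit of the experiment described above can make the cost/confidence trade-off concrete. The paper's actual model is not reproduced here; the technique names, per-site costs, detection limits, and leak-size distribution are all invented for illustration.

```python
# Score a survey design by expected detected-emissions fraction ("confidence")
# and total deployment cost, for a mix of two measurement techniques.
import random

TECHNIQUES = {               # hypothetical parameters
    "OGI":    {"cost": 600.0, "detection_limit": 2.0},   # kg/h
    "mobile": {"cost": 150.0, "detection_limit": 5.0},
}

def simulate(fraction_mobile, n_sites=1000, seed=1):
    rng = random.Random(seed)
    detected = emitted = cost = 0.0
    for _ in range(n_sites):
        leak = rng.lognormvariate(0.0, 1.5)      # skewed leak-size distribution
        tech = "mobile" if rng.random() < fraction_mobile else "OGI"
        cost += TECHNIQUES[tech]["cost"]
        emitted += leak
        if leak >= TECHNIQUES[tech]["detection_limit"]:
            detected += leak
    return detected / emitted, cost

conf_ogi, cost_ogi = simulate(0.0)    # OGI everywhere
conf_mix, cost_mix = simulate(0.5)    # half the sites surveyed by mobile
```

With these invented numbers the mixed design is cheaper but misses the mid-sized leaks at mobile-surveyed sites, which is exactly the kind of trade-off the study's Monte Carlo explores across development types.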

  10. Abnormal condition and events analysis for instrumentation and control systems. Volume 1: Methodology for nuclear power plant digital upgrades. Final report

    International Nuclear Information System (INIS)

    McKemy, S.; Marcelli, M.; Boehmer, N.; Crandall, D.

    1996-01-01

    The ACES project was initiated to identify a cost-effective methodology for addressing abnormal conditions and events (ACES) in digital upgrades to nuclear power plant systems, as introduced by IEEE Standard 7-4.3.2-1993. Several methodologies and techniques currently in use in the defense, aerospace, and other communities for the assurance of digital safety systems were surveyed, and although several were shown to possess desirable qualities, none sufficiently met the needs of the nuclear power industry. This report describes a tailorable methodology for performing ACES analysis that is based on the more desirable aspects of the reviewed methodologies and techniques. The methodology is applicable to both safety- and non-safety-grade systems, addresses hardware, software, and system-level concerns, and can be applied in either a lifecycle or post-design timeframe. Employing this methodology for safety systems should facilitate the digital upgrade licensing process.

  11. Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?

    Science.gov (United States)

    Brondani, Mario; He, Sarah

    2013-01-01

    Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…

  12. Determination of Optimum Condition of Leucine Content in Beef Protein Hydrolysate using Response Surface Methodology

    International Nuclear Information System (INIS)

    Siti Roha Ab Mutalib; Zainal Samicho; Noriham Abdullah

    2016-01-01

    The aim of this study is to determine the optimum condition of leucine content in beef hydrolysate. Beef hydrolysate was prepared by enzymatic hydrolysis using bromelain enzyme produced from pineapple peel. Parameter conditions such as concentration of bromelain, hydrolysis temperature and hydrolysis time were assessed to obtain the optimum leucine content of beef hydrolysate according to the experimental design recommended by response surface methodology (RSM). Leucine content in beef hydrolysate was determined using the AccQ.Tag amino acid analysis method with high performance liquid chromatography (HPLC). The condition of optimum leucine content was at a bromelain concentration of 1.38 %, a hydrolysis temperature of 42.5 degrees Celsius and a hydrolysis time of 31.59 hours, with a predicted leucine content of 26.57 %. The optimum condition was verified; the leucine value obtained was 26.25 %. Since there was no significant difference (p>0.05) between the predicted and verified leucine values, this indicates that the optimum condition predicted by RSM can be accepted to predict the optimum leucine content in beef hydrolysate. (author)
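The RSM step above amounts to fitting a second-order polynomial to the measured responses and locating its stationary point. A minimal one-factor sketch follows; the real study fitted three factors on a central-composite design, and the data points below are invented, not the paper's measurements.

```python
# Fit y = b0 + b1*x + b2*x^2 by least squares and find the stationary point.
import numpy as np

time_h = np.array([10.0, 20.0, 30.0, 40.0, 50.0])     # hydrolysis time, h
leucine = np.array([18.0, 24.0, 26.5, 25.8, 21.0])    # leucine %, invented

X = np.column_stack([np.ones_like(time_h), time_h, time_h**2])
b0, b1, b2 = np.linalg.lstsq(X, leucine, rcond=None)[0]

t_opt = -b1 / (2.0 * b2)                  # stationary point of the parabola
y_opt = b0 + b1 * t_opt + b2 * t_opt**2   # predicted optimum response
```

A negative quadratic coefficient confirms the fitted surface has a maximum; verification then consists of running the experiment at `t_opt` and comparing the measured response against `y_opt`, as the abstract describes.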

  13. Methodology of ionizing radiation measurement, from x-ray equipment, for radiation protection

    International Nuclear Information System (INIS)

    Caballero, Katia C.S.; Borges, Jose C.

    1996-01-01

    Most X-ray beams used for diagnostics involve short exposure times (milliseconds); the exception is those used in fluoroscopy. Measuring instruments (area monitors with ionization chambers or Geiger tubes) used in hospitals and clinics generally have characteristic response times not suited to such short beam durations. Our objective was to analyse commercially available instruments and to prepare a measurement methodology for direct and secondary beams, in order to evaluate protection barriers for beams used in diagnostic radiology installations. (author)

  14. Conditioning Methodologies for DanceSport: Lessons from Gymnastics, Figure Skating, and Concert Dance Research.

    Science.gov (United States)

    Outevsky, David; Martin, Blake Cw

    2015-12-01

    Dancesport, the competitive branch of ballroom dancing, places high physiological and psychological demands on its practitioners, but pedagogical resources in these areas for this dance form are limited. Dancesport competitors could benefit from strategies used in other aesthetic sports. In this review, we identify conditioning methodologies from gymnastics, figure skating, and contemporary, modern, and ballet dance forms that could have relevance and suitability for dancesport training, and propose several strategies for inclusion in the current dancesport curriculum. We reviewed articles derived from Google Scholar, PubMed, ScienceDirect, Taylor & Francis Online, and Web of Science search engines and databases, with publication dates from 1979 to 2013. The keywords included MeSH terms: dancing, gymnastics, physiology, energy metabolism, physical endurance, and range of motion. Out of 47 papers examined, 41 papers met the inclusion criteria (validity of scientific methods, topic relevance, transferability to dancesport, publication date). Quality and validity of the data were assessed by examining the methodologies in each study and comparing studies on similar populations as well as across time using the PRISMA 2009 checklist and flowchart. The relevant research suggests that macro-cycle periodization planning, aerobic and anaerobic conditioning, range of motion and muscular endurance training, and performance psychology methods have potential for adaptation for dancesport training. Dancesport coaches may help their students fulfill their ambitions as competitive athletes and dance artists by adapting the relevant performance enhancement strategies from gymnastics, figure skating, and concert dance forms presented in this paper.

  15. Methodological considerations for measuring glucocorticoid metabolites in feathers

    Science.gov (United States)

    Berk, Sara A.; McGettrick, Julie R.; Hansen, Warren K.; Breuner, Creagh W.

    2016-01-01

    In recent years, researchers have begun to use corticosteroid metabolites in feathers (fCORT) as a metric of stress physiology in birds. However, there remain substantial questions about how to measure fCORT most accurately. Notably, small samples contain artificially high amounts of fCORT per millimetre of feather (the small sample artefact). Furthermore, it appears that fCORT is correlated with circulating plasma corticosterone only when levels are artificially elevated by the use of corticosterone implants. Here, we used several approaches to address current methodological issues with the measurement of fCORT. First, we verified that the small sample artefact exists across species and feather types. Second, we attempted to correct for this effect by increasing the amount of methanol relative to the amount of feather during extraction. We consistently detected more fCORT per millimetre or per milligram of feather in small samples than in large samples even when we adjusted methanol:feather concentrations. We also used high-performance liquid chromatography to identify hormone metabolites present in feathers and measured the reactivity of these metabolites against the most commonly used antibody for measuring fCORT. We verified that our antibody is mainly identifying corticosterone (CORT) in feathers, but other metabolites have significant cross-reactivity. Lastly, we measured faecal glucocorticoid metabolites in house sparrows and correlated these measurements with corticosteroid metabolites deposited in concurrently grown feathers; we found no correlation between faecal glucocorticoid metabolites and fCORT. We suggest that researchers should be cautious in their interpretation of fCORT in wild birds and should seek alternative validation methods to examine species-specific relationships between environmental challenges and fCORT. PMID:27335650

  16. A methodology for performing virtual measurements in a nuclear reactor system

    International Nuclear Information System (INIS)

    Ikonomopoulos, A.; Uhrig, R.E.; Tsoukalas, L.H.

    1992-01-01

    A novel methodology is presented for monitoring nonphysically measurable variables in an experimental nuclear reactor. It is based on the employment of artificial neural networks to generate fuzzy values. Neural networks map spatiotemporal information (in the form of time series) to algebraically defined membership functions. The entire process can be thought of as a virtual measurement. Through such virtual measurements the values of nondirectly monitored parameters with operational significance, e.g., transient-type, valve-position, or performance, can be determined. Generating membership functions is a crucial step in the development and practical utilization of fuzzy reasoning, a computational approach that offers the advantage of describing the state of the system in a condensed, linguistic form, convenient for monitoring, diagnostics, and control algorithms
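The membership-generation step described above can be illustrated with a small sketch: a crisp estimate produced by the networks is mapped to degrees of membership in linguistic categories. Triangular membership functions are one common choice; the labels, breakpoints, and the valve-position example below are our own assumptions for illustration, not the paper's.

```python
# Fuzzify a crisp estimate (e.g. valve position in %) into linguistic labels.
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

LABELS = {
    "closed":      lambda x: tri(x, -1e-9, 0.0, 40.0),
    "partly_open": lambda x: tri(x, 20.0, 50.0, 80.0),
    "open":        lambda x: tri(x, 60.0, 100.0, 100.0 + 1e-9),
}

def fuzzify(estimate):
    """Membership degree of the estimate in each linguistic category."""
    return {label: round(mu(estimate), 3) for label, mu in LABELS.items()}

state = fuzzify(35.0)    # hypothetical network output: 35 % open
```

The resulting dictionary is the condensed linguistic description the abstract refers to, convenient for monitoring and diagnostic logic downstream.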

  17. Methodology for Measuring the Complexity of Enterprise Information Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-07-01

    The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from few exceptions, do not at all attempt to simplify the complexity. This article presents the author's own methodology for managing complexity, which can be used to complement any other methodology and which helps limit the growth of complexity. It introduces its own definition and metric of complexity, which it defines as the sum of entities of the individual UML models of the given system, which are selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology to manage information system complexity and to verify it in practice on a real-life SAP implementation project.
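As described, the article's metric reduces to a sum of entity counts over the UML models chosen to describe the system. A sketch with invented model names and counts (the MMDIS model selection itself is not reproduced here):

```python
# Complexity metric: sum of entities over the selected UML models.
uml_entity_counts = {
    "use_case_model":   14,   # actors + use cases
    "class_model":      52,   # classes + associations
    "activity_model":   23,   # activities + transitions
    "deployment_model":  8,   # nodes + artifacts
}

def complexity(models):
    """Complexity = sum of entities over all UML models of the system."""
    return sum(models.values())

total = complexity(uml_entity_counts)
```

Tracking this single number across project phases is what lets the methodology flag (and then limit) growth in complexity.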

  18. A novel methodology for interpreting air quality measurements from urban streets using CFD modelling

    Science.gov (United States)

    Solazzo, Efisio; Vardoulakis, Sotiris; Cai, Xiaoming

    2011-09-01

    In this study, a novel computational fluid dynamics (CFD) based methodology has been developed to interpret long-term averaged measurements of pollutant concentrations collected at roadside locations. The methodology is applied to the analysis of pollutant dispersion in Stratford Road (SR), a busy street canyon in Birmingham (UK), where a one-year sampling campaign was carried out between August 2005 and July 2006. Firstly, a number of dispersion scenarios are defined by combining sets of synoptic wind velocity and direction. Assuming neutral atmospheric stability, CFD simulations are conducted for all the scenarios, applying the standard k-ɛ turbulence model, with the aim of creating a database of normalised pollutant concentrations at specific locations within the street. Modelled concentrations for all wind scenarios were compared with hourly observed NOx data. In order to compare with long-term averaged measurements, a weighted average of the CFD-calculated concentration fields was derived, with the weighting coefficients being proportional to the frequency of each scenario observed during the examined period (either monthly or annually). In summary the methodology consists of (i) identifying the main dispersion scenarios for the street based on wind speed and direction data, (ii) creating a database of CFD-calculated concentration fields for the identified dispersion scenarios, and (iii) combining the CFD results based on the frequency of occurrence of each dispersion scenario during the examined period. The methodology has been applied to calculate monthly and annually averaged benzene concentrations at several locations within the street canyon so that a direct comparison with observations could be made. The results of this study indicate that, within the simplifying assumption of non-buoyant flow, CFD modelling can aid understanding of long-term air quality measurements, and help assessing the representativeness of monitoring locations for population
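Step (iii) above is a frequency-weighted average over scenario fields, which is compact enough to sketch. The fields and frequencies below are invented stand-ins; in the study each row would be a CFD-calculated normalised concentration field for one wind scenario.

```python
# Long-term mean concentration at each receptor = frequency-weighted average
# of the per-scenario CFD concentration fields.
import numpy as np

# Normalised concentration at 3 receptors (columns) for 4 wind scenarios (rows).
fields = np.array([
    [1.8, 0.9, 0.4],
    [0.6, 1.2, 0.8],
    [0.3, 0.5, 1.5],
    [1.0, 1.0, 1.0],
])

# Observed frequency of each scenario during the averaging period.
freq = np.array([0.40, 0.25, 0.20, 0.15])
assert abs(freq.sum() - 1.0) < 1e-12   # frequencies must cover the period

long_term_mean = freq @ fields         # weighted average per receptor
```

Repeating the weighting with monthly rather than annual scenario frequencies gives the monthly averages compared against the observations in the paper.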

  19. Experimental assessment for instantaneous temperature and heat flux measurements under Diesel motored engine conditions

    International Nuclear Information System (INIS)

    Torregrosa, A.J.; Bermúdez, V.; Olmeda, P.; Fygueroa, O.

    2012-01-01

    Highlights: ► We measured in-cylinder wall heat fluxes. ► We examine the effects of different engine parameters. ► Increasing air mass flow increases heat fluxes. ► The effect of engine speed can be masked by the effect of volumetric efficiency. ► Differences among the different walls have been found. - Abstract: The main goal of this work is to validate an innovative experimental facility and to establish a methodology to evaluate the influence of some of the engine parameters on local engine heat transfer behaviour under motored steady-state conditions. Instantaneous temperature measurements have been performed in order to estimate heat fluxes on a modified Diesel single cylinder combustion chamber. This study was divided into two main parts. The first was the design and commissioning of an experimental bench to reproduce Diesel conditions and perform local instantaneous temperature measurements along the walls of the combustion chamber by means of fast-response thermocouples. The second was the development of a procedure for temperature signal treatment and local heat flux calculation based on one-dimensional Fourier analysis. A thermodynamic diagnosis model has been employed to characterise the modified engine with the newly designed chamber. From the measured data, coherent findings have been obtained on the local behaviour of heat transfer in an internal combustion engine, and the influence of engine parameters on local instantaneous temperature and heat flux has been analysed.
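One common one-dimensional Fourier reconstruction used in engine heat-transfer work expresses the unsteady surface heat flux from the harmonics of the measured wall temperature; whether the authors used exactly this form is an assumption, and the material properties and harmonic amplitudes below are invented.

```python
# Unsteady surface heat flux for a semi-infinite wall whose surface temperature
# is T(t) = Tm + sum_n [A_n cos(n w t) + B_n sin(n w t)]:
#   q(t) = q_steady + sum_n sqrt(k*rho*c*n*w/2) *
#          [(A_n + B_n) cos(n w t) + (B_n - A_n) sin(n w t)]
import math

def surface_heat_flux(t, coeffs, k, rho, c, omega, q_steady=0.0):
    """Heat flux (W/m^2) at time t from temperature harmonics (A_n, B_n) in K."""
    q = q_steady
    for n, (A, B) in enumerate(coeffs, start=1):
        amp = math.sqrt(k * rho * c * n * omega / 2.0)
        q += amp * ((A + B) * math.cos(n * omega * t)
                    + (B - A) * math.sin(n * omega * t))
    return q

# Example: one harmonic (A1 = 2 K, B1 = 0.5 K), cycle frequency 25 Hz,
# steel-like wall properties -- all values hypothetical.
q0 = surface_heat_flux(0.0, [(2.0, 0.5)], k=1.5, rho=7800.0, c=460.0,
                       omega=2.0 * math.pi * 25.0, q_steady=50_000.0)
```

In practice the coefficients `A_n`, `B_n` come from a Fourier fit of the fast-response thermocouple signal over one engine cycle, and `q_steady` from the mean temperature difference across the wall.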

  20. Performance evaluation of CT measurements made on step gauges using statistical methodologies

    DEFF Research Database (Denmark)

    Angel, J.; De Chiffre, L.; Kruth, J.P.

    2015-01-01

    In this paper, a study is presented in which statistical methodologies were applied to evaluate the measurement of step gauges on an X-ray computed tomography (CT) system. In particular, the effects of step gauge material density and orientation were investigated. The step gauges consist of uni- and bidirectional lengths. By confirming the repeatability of measurements made on the test system, the number of required scans in the design of experiment (DOE) was reduced. The statistical model was checked using model adequacy principles; model adequacy checking is an important step in validating...

  1. Improvement of the Assignment Methodology of the Approach Embankment Design to Highway Structures in Difficult Conditions

    Science.gov (United States)

    Chistyy, Y.; Kuzakhmetova, E.; Fazilova, Z.; Tsukanova, O.

    2018-03-01

    Design issues at the junction of bridges and overpasses with the approach embankment are studied. The reasons for the formation of deformations in the road structure are indicated. Measures to ensure the stability and to accelerate the settlement of a weak subgrade beneath the approach embankment are listed. The necessity of taking into account the man-made impact of the approach embankment on the subgrade behavior is proved. Modern stabilizing agents to improve the properties of the soils used in the embankment and the subgrade are suggested. A clarified methodology for determining the active zone of compression in the subgrade under the load from the weight of the embankment is described. As an additional condition to the existing methodology, it is proposed to establish the lower bound of the active zone of compression from the accuracy with which soil compressibility and settlement can be evaluated.

  2. Methodology for selection of attributes and operating conditions for SVM-Based fault locator's

    Directory of Open Access Journals (Sweden)

    Debbie Johan Arredondo Arteaga

    2017-01-01

    Context: Energy distribution companies must employ strategies to provide timely, high-quality service, and fault-locating techniques represent an agile alternative for restoring electric service in power distribution, given the generally large size of distribution networks and the frequent interruptions in service. However, these techniques are not robust enough and present some limitations in both computational cost and the mathematical description of the models they use. Method: This paper performs an analysis based on a Support Vector Machine for the evaluation of the proper conditions to adjust and validate a fault locator for distribution systems, so that it is possible to determine the minimum number of operating conditions that allow a good performance to be achieved with a low computational effort. Results: We tested the proposed methodology in a prototypical distribution circuit, located in a rural area of Colombia. This circuit has a voltage of 34.5 kV and is subdivided into 20 zones. Additionally, the characteristics of the circuit allowed us to obtain a database of 630,000 records of single-phase faults under different operating conditions. As a result, we could determine that the locator showed a performance above 98% with 200 suitably selected operating conditions. Conclusions: It is possible to improve the performance of fault locators based on Support Vector Machines. Specifically, these improvements are achieved by properly selecting optimal operating conditions and attributes, since they directly affect the performance in terms of efficiency and computational cost.

  3. Characteristic Rain Events: A Methodology for Improving the Amenity Value of Stormwater Control Measures

    DEFF Research Database (Denmark)

    Smit Andersen, Jonas; Lerer, Sara Maria; Backhaus, Antje

    2017-01-01

    Local management of rainwater using stormwater control measures (SCMs) is gaining increased attention as a sustainable alternative and supplement to traditional sewer systems. Besides offering added utility values, many SCMs also offer a great potential for added amenity values. One way of achieving amenity value is to stage the rainwater and thus bring it to the attention of the public. We present here a methodology for creating a selection of rain events that can help bridge between engineering and landscape architecture when dealing with staging of rainwater. The methodology uses quantitative and statistical methods to select Characteristic Rain Events (CREs) for a range of frequent return periods: weekly, bi-weekly, monthly, bi-monthly, and a single rarer event occurring only every 1–10 years. The methodology for selecting CREs is flexible and can be adjusted to any climatic settings...

  4. The Influence of Measurement Methodology on the Accuracy of Electrical Waveform Distortion Analysis

    Science.gov (United States)

    Bartman, Jacek; Kwiatkowski, Bogdan

    2018-04-01

    The present paper reviews the documents that specify measurement methods for voltage waveform distortion. It also presents the measurement stages for waveform components that are uncommon in the classic fundamentals of electrotechnics and signal theory, including the process of creating groups and subgroups of harmonics and interharmonics. Moreover, the paper discusses selected distortion factors of periodic waveforms and presents analyses comparing the values of these distortion indices. The measurements were carried out in cycle-by-cycle mode, and the measurement methodology used complies with the IEC 61000-4-7 standard. The studies showed significant discrepancies between the values of the analyzed parameters.
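The grouping stage mentioned above follows the IEC 61000-4-7 scheme: a 200 ms rectangular window gives 5 Hz spectral lines, and the harmonic subgroup of order h (in a 50 Hz system) combines the line at 50·h Hz with its two adjacent 5 Hz lines in an RMS sense. The sketch below uses a synthetic signal; it illustrates the grouping arithmetic, not the paper's instrumentation.

```python
# Harmonic subgroups per IEC 61000-4-7: 200 ms window -> 5 Hz lines,
# subgroup_h = sqrt(line[10h-1]^2 + line[10h]^2 + line[10h+1]^2) in RMS terms.
import numpy as np

FS = 10_000                  # sampling rate, Hz
N = FS // 5                  # 200 ms window -> 5 Hz frequency resolution

t = np.arange(N) / FS
signal = 230.0 * np.sin(2 * np.pi * 50 * t) + 23.0 * np.sin(2 * np.pi * 150 * t)

spectrum = np.fft.rfft(signal)
rms_lines = np.abs(spectrum) / N * np.sqrt(2)   # RMS value of each 5 Hz line
rms_lines[0] = np.abs(spectrum[0]) / N          # DC line is not doubled

def harmonic_subgroup(rms, order, lines_per_harmonic=10):
    """RMS of the harmonic subgroup: centre line plus both adjacent lines."""
    k = order * lines_per_harmonic
    return np.sqrt(np.sum(rms[k - 1:k + 2] ** 2))

g1 = harmonic_subgroup(rms_lines, 1)   # fundamental subgroup, ~230/sqrt(2) V
g3 = harmonic_subgroup(rms_lines, 3)   # 3rd harmonic subgroup, ~23/sqrt(2) V
```

Because the synthetic tones fall exactly on spectral lines there is no leakage here; with real, slightly off-nominal frequencies the adjacent lines carry energy, which is exactly why the standard defines subgroups.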

  5. An evaluation of analysis methodologies for predicting cleavage arrest of a deep crack in an RPV subjected to PTS loading conditions

    International Nuclear Information System (INIS)

    Keeney-Walker, J.; Bass, B.R.

    1992-01-01

    Several calculational procedures are compared for predicting cleavage arrest of a deep crack in the wall of a prototypical reactor pressure vessel (RPV) subjected to pressurized-thermal-shock (PTS) types of loading conditions. Three procedures examined in this study utilized the following models: (1) a static finite-element model (full bending); (2) a radially constrained static model; and (3) a thermoelastic dynamic finite-element model. A PTS transient loading condition was selected that produced a deep arrest of an axially-oriented, initially shallow crack according to calculational results obtained from the static (full-bending) model. Results from the two static models were compared with those generated from the detailed thermoelastic dynamic finite-element analysis. The dynamic analyses modeled cleavage-crack propagation using a node-release technique and an application-mode methodology based on dynamic fracture toughness curves generated from measured data. Comparisons presented here indicate that the degree to which dynamic solutions can be approximated by static models is highly dependent on several factors, including the material dynamic fracture curves and the propensity for cleavage reinitiation of the arrested crack under PTS loading conditions. Additional work is required to develop and validate a satisfactory dynamic fracture toughness model applicable to post-cleavage arrest conditions in an RPV.

  6. A new methodology for fault detection in rolling element bearings using singular spectrum analysis

    Directory of Open Access Journals (Sweden)

    Bugharbee Hussein Al

    2018-01-01

    This paper proposes a vibration-based methodology for fault detection in rolling element bearings, based on pure data analysis via the singular spectrum method. The method suggests building a baseline space from feature vectors made of signals measured in the healthy/baseline bearing condition. The feature vectors are made using the Euclidean norms of the first three principal components (PCs) found for the signals measured. Then, the lagged version of any new signal corresponding to a new (possibly faulty) condition is projected onto this baseline feature space in order to assess its similarity to the baseline condition. The category of a new signal vector is determined based on the Mahalanobis distance (MD) of its feature vector to the baseline space. A validation of the methodology is suggested based on results from an experimental test rig. The results obtained confirm the effective performance of the suggested methodology. It consists of simple steps and is easy to apply, with the prospect of making it automatic and suitable for commercial applications.
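The pipeline described above can be sketched compactly. The window length, signal model, and fault signature below are our own choices for illustration, not necessarily the paper's: lag-embed each signal, take the SVD, use the Euclidean norms of the first three principal components as a 3-D feature, build a baseline cloud from healthy signals, then score new signals by Mahalanobis distance to that cloud.

```python
# SSA-style feature extraction + Mahalanobis scoring against a healthy baseline.
import numpy as np

def features(signal, window=20):
    """Norms of the first three PCs of the lag-embedded (Hankel) matrix."""
    n = len(signal) - window + 1
    hankel = np.stack([signal[i:i + window] for i in range(n)])
    u, s, _ = np.linalg.svd(hankel, full_matrices=False)
    pcs = u[:, :3] * s[:3]              # first three principal components
    return np.linalg.norm(pcs, axis=0)  # 3-D feature vector

def mahalanobis(x, mean, cov_inv):
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 400)

# Baseline cloud from 30 synthetic "healthy" vibration signals.
healthy = [np.sin(2 * np.pi * 30 * t) + 0.1 * rng.standard_normal(t.size)
           for _ in range(30)]
baseline = np.stack([features(s) for s in healthy])
mean, cov_inv = baseline.mean(axis=0), np.linalg.inv(np.cov(baseline.T))

# A synthetic "faulty" signal: an extra spectral component stands in for the
# impulsive content a bearing defect would add.
faulty = (np.sin(2 * np.pi * 30 * t) + 0.8 * np.sin(2 * np.pi * 110 * t)
          + 0.1 * rng.standard_normal(t.size))

md_healthy = mahalanobis(features(healthy[0]), mean, cov_inv)
md_faulty = mahalanobis(features(faulty), mean, cov_inv)
```

A new signal is flagged when its MD exceeds a threshold calibrated on the baseline cloud; this is the categorisation step the abstract describes.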

  7. Covariance methodology applied to 35S disintegration rate measurements by the CIEMAT/NIST method

    International Nuclear Information System (INIS)

    Koskinas, M.F.; Nascimento, T.S.; Yamazaki, I.M.; Dias, M.S.

    2014-01-01

    The Nuclear Metrology Laboratory (LMN) at IPEN is carrying out measurements in an LSC (liquid scintillation counting) system, applying the CIEMAT/NIST method. In this context 35S is an important radionuclide for medical applications, and it is difficult to standardize by other primary methods due to its low beta-ray energy. CIEMAT/NIST is a standard technique used by most metrology laboratories to improve accuracy and speed up beta emitter standardization. The focus of the present work was to apply the covariance methodology for determining the overall uncertainty in the 35S disintegration rate. All partial uncertainties involved in the measurements were considered, taking into account all possible correlations between each pair of them. - Highlights: ► 35S disintegration rate measured in a liquid scintillation system using the CIEMAT/NIST method. ► Covariance methodology applied to the overall uncertainty in the 35S disintegration rate. ► Monte Carlo simulation was applied to determine 35S activity in the 4πβ(PC)-γ coincidence system
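The covariance methodology in miniature: for a result f(x₁, x₂) the combined variance is u² = J C Jᵀ, with J the vector of sensitivity coefficients and C the covariance matrix of the inputs. The two-input example and all numbers below are invented; the real analysis propagates many more correlated terms.

```python
# Combined uncertainty via the quadratic form u^2 = J C J^T.
import numpy as np

# Example: disintegration rate N0 = Nc / eff (count rate over efficiency).
Nc, eff = 5.0e3, 0.95
u_Nc, u_eff, r = 25.0, 0.004, 0.3          # std. uncertainties, correlation

N0 = Nc / eff
J = np.array([1.0 / eff, -Nc / eff**2])    # dN0/dNc, dN0/deff
C = np.array([[u_Nc**2,          r * u_Nc * u_eff],
              [r * u_Nc * u_eff, u_eff**2        ]])

u_N0 = float(np.sqrt(J @ C @ J))           # combined standard uncertainty
```

Setting r = 0 recovers the familiar uncorrelated root-sum-of-squares; here the positive correlation between the two inputs, acting through sensitivity coefficients of opposite sign, reduces the combined uncertainty.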

  8. A methodology to assist in contingency planning for protection of nuclear power plants against land vehicle bombs

    International Nuclear Information System (INIS)

    James, J.W.; Goldman, L.A.; Lobner, P.R.; Finn, S.P.; Koch, T.H.; Veatch, J.D.

    1989-04-01

    This report provides a methodology which could be used by operators of licensed nuclear power reactors to address issues related to contingency planning for a land vehicle bomb, should such a threat arise. The methodology presented in this report provides a structured framework for understanding factors to be considered in contingency planning for a land vehicle bomb including: (1) system options available to maintain a safe condition, (2) associated components and equipment, (3) preferred system options for establishing and maintaining a safe shutdown condition, and (4) contingency measures to preserve the preferred system options. Example applications of the methodology for a Boiling Water Reactor (BWR) and Pressurized Water Reactor (PWR) are provided along with an example of contingency plan changes necessary for implementation of this methodology, a discussion of some contingency measures that can be used to limit land vehicle access, and a bibliography. 2 refs., 11 figs., 6 tabs

  9. Measurement and verification of low income energy efficiency programs in Brazil: Methodological challenges

    Energy Technology Data Exchange (ETDEWEB)

    Martino Jannuzzi, Gilberto De; Rodrigues da Silva, Ana Lucia; Melo, Conrado Augustus de; Paccola, Jose Angelo; Dourado Maia Gomes, Rodolfo (State Univ. of Campinas, International Energy Initiative (Brazil))

    2009-07-01

    Electric utilities in Brazil are investing about 80 million dollars annually in low-income energy efficiency programs, about half of their total compulsory investments in end-use efficiency programs under current regulation. Since 2007 the regulator has enforced the need to provide evaluation plans for the programs delivered. This paper presents the measurement and verification (M&V) methodology that has been developed to accommodate the characteristics of lighting and refrigerator programs that have been introduced in Brazilian urban and peri-urban slums. A combination of household surveys, end-use measurements and metering at the transformer and grid levels was performed before and after the program implementation. The methodology has to accommodate the dynamics, housing, electrical wiring and connections of the population, as well as their ability to pay for electricity and their program participation. Results obtained in slums in Rio de Janeiro are presented. Impacts of the programs were evaluated in energy terms for households and utilities. Feedback from the evaluations performed also permitted improvements in the design of new programs for low-income households.
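The core M&V arithmetic behind a program like this is avoided energy: baseline use minus post-retrofit use, adjusted to common conditions. The sketch below is generic IPMVP-style accounting, not the authors' protocol, and the household numbers are invented.

```python
# Avoided energy = baseline - (reporting-period use + routine adjustments).
def monthly_savings_kwh(baseline_kwh, reporting_kwh, adjustment_kwh=0.0):
    """Savings in kWh/month; adjustment_kwh corrects the reporting-period
    metering for non-program changes (negative = load to subtract)."""
    return baseline_kwh - (reporting_kwh + adjustment_kwh)

# Hypothetical refrigerator-swap participant, metered before and after:
before, after = 95.0, 62.0      # kWh/month
new_fan_load = 4.0              # non-program appliance added after retrofit
savings = monthly_savings_kwh(before, after, -new_fan_load)
```

The adjustment term is where the methodological challenges described above bite: informal connections, changed occupancy, and newly affordable appliances all shift the reporting-period load for reasons unrelated to the program.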

  10. Research on sorption behavior of radionuclides under shallow land environment. Mechanism and standard methodologies for measurement of distribution coefficients of radionuclides

    International Nuclear Information System (INIS)

    Sakamoto, Yoshiaki; Tanaka, Tadao; Takebe, Shinichi; Nagao, Seiya; Ogawa, Hiromichi; Komiya, Tomokazu; Hagiwara, Shigeru

    2001-01-01

    This study consists of two categories of research work. One is research on the sorption mechanisms of long half-life radionuclides (Technetium-99, TRU elements and U-series radionuclides) on soils and rocks, including the development of a database of radionuclide distribution coefficients. The database of distribution coefficients of radionuclides, with information about measurement conditions such as shaking method, soil characteristics and solution composition, has already been opened to the public (JAERI-DATABASE 20001003). The other is an investigation of a standard methodology for measuring the distribution coefficient of radionuclides on soils, rocks and engineering materials in Japan. (author)
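The quantity catalogued in the database mentioned above is the distribution coefficient Kd from a batch (shaking) test: activity sorbed per gram of solid divided by activity remaining per millilitre of solution, Kd = (C0 − Ce)/Ce · V/m. The sketch and values below are illustrative, not taken from the database.

```python
# Distribution coefficient from a batch sorption test.
def distribution_coefficient(c0, ce, volume_ml, mass_g):
    """Kd in mL/g from initial (c0) and equilibrium (ce) solution
    concentrations, solution volume in mL, and solid mass in g."""
    if ce <= 0 or mass_g <= 0:
        raise ValueError("equilibrium concentration and mass must be positive")
    return (c0 - ce) / ce * volume_ml / mass_g

# Hypothetical batch test: 30 mL of solution on 3 g of soil,
# concentration falling from 100 to 20 Bq/mL at equilibrium.
kd = distribution_coefficient(100.0, 20.0, 30.0, 3.0)
```

This is why the database records the measurement conditions alongside each value: Kd from a shaking test depends strongly on the solid-to-solution ratio, contact method, and solution chemistry.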

  11. Statistical optimization for alkali pretreatment conditions of narrow-leaf cattail by response surface methodology

    Directory of Open Access Journals (Sweden)

    Arrisa Ruangmee

    2013-08-01

    Response surface methodology with central composite design was applied to optimize alkali pretreatment of narrow-leaf cattail (Typha angustifolia). Joint effects of three independent variables, NaOH concentration (1-5%), temperature (60-100 ºC), and reaction time (30-150 min), were investigated to evaluate the increase in and improvement of the cellulosic components contained in the raw material after pretreatment. The combined optimum condition based on the cellulosic content obtained from this study is: a concentration of 5% NaOH, a reaction time of 120 min, and a temperature of 100 ºC. This result has been analyzed employing ANOVA with a second-order polynomial equation. The model was found to be significant and was able to predict the response accurately at less than 5% error. Under this combined optimal condition, the desirable cellulosic content in the sample increased from 38.5 to 68.3%, while the unfavorable hemicellulosic content decreased from 37.6 to 7.3%.

  12. Application of the accident management information needs methodology to a severe accident sequence

    International Nuclear Information System (INIS)

    Ward, L.W.; Hanson, D.J.; Nelson, W.R.; Solberg, D.E.

    1989-01-01

    The U.S. Nuclear Regulatory Commission is conducting an accident management research program that emphasizes the use of severe accident research to enhance the ability of plant operating personnel to effectively manage severe accidents. Hence, it is necessary to ensure that the plant instrumentation and information systems adequately provide this information to the operating staff during accident conditions. A methodology to identify and assess the information needs of the operating staff of a nuclear power plant during a severe accident has been developed. The methodology identifies (a) the information needs of the plant personnel during a wide range of accident conditions, (b) the existing plant measurements capable of supplying these information needs and minor additions to instrument and display systems that would enhance management capabilities, (c) measurement capabilities and limitations during severe accident conditions, and (d) areas in which the information systems could mislead plant personnel

  13. Application of the accident management information needs methodology to a severe accident sequence

    Energy Technology Data Exchange (ETDEWEB)

    Ward, L.W.; Hanson, D.J.; Nelson, W.R. (Idaho National Engineering Laboratory, Idaho Falls (USA)); Solberg, D.E. (Nuclear Regulatory Commission, Washington, DC (USA))

    1989-11-01

    The U.S. Nuclear Regulatory Commission is conducting an accident management research program that emphasizes the use of severe accident research to enhance the ability of plant operating personnel to effectively manage severe accidents. Hence, it is necessary to ensure that the plant instrumentation and information systems adequately provide this information to the operating staff during accident conditions. A methodology to identify and assess the information needs of the operating staff of a nuclear power plant during a severe accident has been developed. The methodology identifies (a) the information needs of the plant personnel during a wide range of accident conditions, (b) the existing plant measurements capable of supplying these information needs and minor additions to instrument and display systems that would enhance management capabilities, (c) measurement capabilities and limitations during severe accident conditions, and (d) areas in which the information systems could mislead plant personnel.

  14. A new transmission methodology for quality assurance in radiotherapy based on radiochromic film measurements

    Science.gov (United States)

    do Amaral, Leonardo L.; Pavoni, Juliana F.; Sampaio, Francisco; Netto, Thomaz Ghilardi

    2015-01-01

    Despite individual quality assurance (QA) being recommended for complex techniques in radiotherapy (RT) treatment, the possibility of errors in dose delivery during therapeutic application has been verified. Therefore, it is fundamentally important to conduct in vivo QA during treatment. This work presents an in vivo transmission quality control methodology, using radiochromic film (RCF) coupled to the linear accelerator (linac) accessory holder. This QA methodology compares the dose distribution measured by the film in the linac accessory holder with the dose distribution expected by the treatment planning software. The calculated dose distribution is obtained in the coronal and central plane of a phantom with the same dimensions as the acrylic support used for positioning the film, but at a source-to-detector distance (SDD) of 100 cm, as a result of transferring the IMRT plan in question with all the fields positioned with the gantry vertical, that is, perpendicular to the phantom. To validate this procedure, first a Monte Carlo simulation using the PENELOPE code was done to evaluate the differences between the dose distributions measured by the film at SDDs of 56.8 cm and 100 cm. After that, several simple dose distribution tests were evaluated using the proposed methodology, and finally a study using IMRT treatments was done. In the Monte Carlo simulation, the mean percentage of points approved in the gamma function comparing the dose distributions acquired at the two SDDs was 99.92%±0.14%. In the simple dose distribution tests, the mean percentage of points approved in the gamma function was 99.85%±0.26%, and the mean percentage difference in the normalization point doses was −1.41%. The transmission methodology was approved in 24 of 25 IMRT test irradiations. Based on these results, it can be concluded that the proposed methodology using RCFs can be applied for in vivo QA in RT treatments. PACS number: 87.55.Qr, 87.55.km, 87.55.N‐
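
    The gamma-function comparison used above to score agreement between two dose distributions can be sketched in a brute-force global form. This is a simplified illustration with hypothetical 3%/3 mm tolerances, not the authors' code:

```python
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm, dd=0.03, dta_mm=3.0):
    """Brute-force global gamma analysis of two 2D dose grids: for every
    reference point, search all evaluated points for the minimum combined
    dose-difference / distance-to-agreement metric."""
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    norm = ref.max()                       # global normalization dose
    gammas = np.empty(ref.shape)
    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
            dose2 = ((eval_ - ref[iy, ix]) / (dd * norm)) ** 2
            gammas[iy, ix] = np.sqrt(dist2 / dta_mm ** 2 + dose2).min()
    return (gammas <= 1.0).mean() * 100.0  # % of points with gamma <= 1

# identical distributions agree perfectly -> 100% pass rate
rate_same = gamma_pass_rate(np.ones((20, 20)), np.ones((20, 20)), spacing_mm=1.0)
```

    Production implementations interpolate the evaluated grid and restrict the search radius, but the pass-rate figures quoted in the abstract are of exactly this kind.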

  15. An ultrasonic methodology for muscle cross section measurement of support space flight

    Science.gov (United States)

    Hatfield, Thomas R.; Klaus, David M.; Simske, Steven J.

    2004-09-01

    The number one priority for any manned space mission is the health and safety of its crew. The study of the short and long term physiological effects on humans is paramount to ensuring crew health and mission success. One of the challenges associated in studying the physiological effects of space flight on humans, such as loss of bone and muscle mass, has been that of readily attaining the data needed to characterize the changes. The small sampling size of astronauts, together with the fact that most physiological data collection tends to be rather tedious, continues to hinder elucidation of the underlying mechanisms responsible for the observed changes that occur in space. Better characterization of the muscle loss experienced by astronauts requires that new technologies be implemented. To this end, we have begun to validate a 360° ultrasonic scanning methodology for muscle measurements and have performed empirical sampling of a limb surrogate for comparison. Ultrasonic wave propagation was simulated using 144 stations of rotated arm and calf MRI images. These simulations were intended to provide a preliminary check of the scanning methodology and data analysis before its implementation with hardware. Pulse-echo waveforms were processed for each rotation station to characterize fat, muscle, bone, and limb boundary interfaces. The percentage error between MRI reference values and calculated muscle areas, as determined from reflection points for calf and arm cross sections, was -2.179% and +2.129%, respectively. These successful simulations suggest that ultrasound pulse scanning can be used to effectively determine limb cross-sectional areas. Cross-sectional images of a limb surrogate were then used to simulate signal measurements at several rotation angles, with ultrasonic pulse-echo sampling performed experimentally at the same stations on the actual limb surrogate to corroborate the results. The objective of the surrogate sampling was to compare the signal
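
    The step of turning boundary reflection points detected at each of the 144 rotation stations into a cross-sectional area can be sketched with the polygon (shoelace) formula. This is an illustrative reconstruction, not the authors' algorithm:

```python
import numpy as np

def cross_section_area(radii_mm, angles_rad):
    """Shoelace (polygon) area from boundary radii sampled at rotation stations."""
    x = radii_mm * np.cos(angles_rad)
    y = radii_mm * np.sin(angles_rad)
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

# 144 stations around a hypothetical circular limb boundary of radius 50 mm
angles = np.linspace(0.0, 2.0 * np.pi, 144, endpoint=False)
area_mm2 = cross_section_area(np.full(144, 50.0), angles)  # close to pi * 50^2
```

    With 144 stations the polygon approximation of a circle is accurate to well under 0.1%, which is consistent with the ~2% errors reported being dominated by boundary detection rather than by the area formula.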

  16. Application of the accident management information needs methodology to a severe accident sequence

    International Nuclear Information System (INIS)

    Ward, L.W.; Hanson, D.J.; Nelson, W.R.; Solberg, D.E.

    1989-01-01

    The U.S. Nuclear Regulatory Commission (NRC) is conducting an Accident Management Research Program that emphasizes the application of severe accident research results to enhance the capability of plant operating personnel to effectively manage severe accidents. A methodology to identify and assess the information needs of the operating staff of a nuclear power plant during a severe accident has been developed as part of the research program designed to resolve this issue. The methodology identifies the information needs of the plant personnel during a wide range of accident conditions; the existing plant measurements capable of supplying these information needs and what, if any, minor additions to instrument and display systems would enhance the capability to manage accidents; known limitations on the capability of these measurements to function properly under the conditions that will be present during a wide range of severe accidents; and areas in which the information systems could mislead plant personnel. This paper presents an application of this methodology to a severe accident sequence to demonstrate its use in identifying the information which is available for management of the event. The methodology has been applied to a severe accident sequence in a Pressurized Water Reactor with a large dry containment. An examination of the capability of the existing measurements was then performed to determine whether the information needs can be supplied.

  17. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej

    2018-02-15

    Software-Defined Networking (SDN) and OpenFlow are actively being standardized and deployed. These deployments rely on switches that come from various vendors and differ in terms of performance and available features. Understanding these differences and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a stream of rule updates while observing both the control plane view as reported by the switch and the data plane state obtained by probing, and determines switch characteristics by comparing these views. We measure, report and explain the performance characteristics of flow table updates in six hardware OpenFlow switches. Our results describing rule update rates can help SDN designers make their controllers efficient. Further, we also highlight differences between the OpenFlow specification and its implementations that, if ignored, pose a serious threat to network security and correctness.
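
    The core of such a methodology, comparing when the switch *claims* a rule is installed (control plane) with when the rule actually takes effect (data plane), can be sketched as follows. The three callbacks are placeholders for a real OpenFlow controller connection and traffic probe, not an API from the paper:

```python
import time

def measure_update_latency(send_rule, await_barrier_reply, probe_data_plane,
                           timeout_s=5.0, poll_s=0.001):
    """Time one rule update two ways: control-plane acknowledgement vs. the
    moment probe packets confirm the rule is active in the data plane."""
    t0 = time.monotonic()
    send_rule()                       # e.g. send an OFPT_FLOW_MOD
    await_barrier_reply()             # switch reports the update as applied
    t_control = time.monotonic() - t0
    while not probe_data_plane():     # inject probes until they match the new rule
        if time.monotonic() - t0 > timeout_s:
            raise TimeoutError("rule never became active in the data plane")
        time.sleep(poll_s)
    t_data = time.monotonic() - t0
    return t_control, t_data          # a large gap signals a misleading control plane

# toy demo with stub callbacks: the rule "activates" on the third probe
state = {"probes": 0}
def _probe():
    state["probes"] += 1
    return state["probes"] >= 3
t_ctrl, t_dp = measure_update_latency(lambda: None, lambda: None, _probe)
```

    Repeating this over a stream of updates yields the update-rate and divergence statistics the paper reports per switch.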

  18. Atmospheric aerosol in an urban area: Comparison of measurement instruments and methodologies and pulmonary deposition assessment

    International Nuclear Information System (INIS)

    Berico, M.; Luciani, A.; Formignani, M.

    1996-07-01

    In March 1995 a measurement campaign on atmospheric aerosol in the Bologna urban area (Italy) was carried out. A transportable laboratory, set up by the ENEA (Italian National Agency for New Technologies, Energy and the Environment) Environmental Department (Bologna), was utilized, with instruments for the measurement of atmospheric aerosol and meteorological parameters. The aim of this campaign was twofold: to characterize the aerosol in an urban area and to compare different instruments and measurement methodologies. Mass concentration measurements, averaged over 23-hour periods with a total filter, a PM10 dichotomous sampler and a low-pressure impactor (LPI Berner), provided information about total suspended particles, the respirable fraction and the granulometric parameters of the aerosol, respectively. Eight meteorological parameters, the number concentration of the submicrometric fraction of the aerosol and the mass concentration of the micrometric fraction were measured continuously. In addition, during a daytime period, several number granulometries of the atmospheric aerosol were estimated by means of a diffusion battery system. Results related to the different measurement methodologies and the granulometric characteristics of the aerosol are presented here. The pulmonary deposition of atmospheric aerosol is finally calculated, using the granulometries provided by the LPI Berner and the ICRP 66 human respiratory tract model.

  19. Methodological foundations of evaluation of effectiveness indicators of small-scale business activities

    Directory of Open Access Journals (Sweden)

    Ivanova T.

    2013-01-01

    A methodological approach to the measurement of financial indicators of small-scale enterprises has been developed. It makes it possible to ensure the comparability of financial condition indicators and the results of small-scale enterprise activities, and also to develop methods for calculating vertical integral estimates for separate aspects of the financial condition and results of small-scale enterprise activities.

  20. A methodology for supporting decisions on the establishment of protective measures after severe nuclear accidents. Final report

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Kollas, J.G.

    1994-06-01

    The objective of this report is to demonstrate the use of a methodology supporting decisions on protective measures following severe nuclear accidents. A multicriteria decision analysis approach is recommended, where value trade-offs are postponed until the very last stage of the decision process. Use of efficient frontiers is made to exclude all technically inferior solutions and present the decision maker with all non-dominated solutions. A choice among these solutions implies a value trade-off among the multiple criteria. An interactive computer package has been developed in which the decision maker can choose a point on the efficient frontier in the consequence space and immediately see the alternative in the decision space resulting in the chosen consequences. The methodology is demonstrated through an application to the choice among possible protective measures in contaminated areas of the former USSR after the Chernobyl accident. Two distinct cases are considered: first, a decision is to be made only on the basis of the level of soil contamination with Cs-137 and the total cost of the chosen protective policy; next, the decision is based on the geographic dimension of the contamination and the total cost. Three alternative countermeasure actions are considered for population segments living on soil contaminated at a certain level or in a specific geographic region: (a) relocation of the population; (b) improvement of the living conditions; and (c) no countermeasures at all. This is the final deliverable of the CEC-CIS Joint Study Project 2, Task 5: Decision-Aiding-System for Establishing Intervention Levels, performed under Contracts COSU-CT91-0007 and COSU-CT92-0021 with the Commission of European Communities through CEPN. (author)
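
    The efficient-frontier filtering described above, discarding all technically inferior (dominated) alternatives before any value trade-off is made, can be sketched as follows. The option names and (cost, dose) numbers are illustrative, not from the study:

```python
def efficient_frontier(options):
    """Return the non-dominated alternatives when both criteria
    (e.g. total cost and collective dose) are to be minimized."""
    frontier = []
    for name, cost, dose in options:
        dominated = any(c <= cost and d <= dose and (c < cost or d < dose)
                        for _, c, d in options)
        if not dominated:
            frontier.append((name, cost, dose))
    return frontier

options = [("relocation", 10.0, 1.0),                 # hypothetical (cost, dose)
           ("improve living conditions", 5.0, 3.0),
           ("no countermeasures", 0.0, 8.0),
           ("partial relocation", 12.0, 4.0)]         # dominated by "relocation"
frontier = efficient_frontier(options)
```

    Only the frontier is shown to the decision maker; choosing a point on it then implicitly expresses the cost-versus-dose trade-off.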

  1. The bounds on tracking performance utilising a laser-based linear and angular sensing and measurement methodology for micro/nano manipulation

    International Nuclear Information System (INIS)

    Clark, Leon; Shirinzadeh, Bijan; Tian, Yanling; Zhong, Yongmin

    2014-01-01

    This paper presents an analysis of the tracking performance of a planar three degrees of freedom (DOF) flexure-based mechanism for micro/nano manipulation, utilising a tracking methodology for the measurement of coupled linear and angular motions. The methodology permits trajectories over a workspace with large angular range through the reduction of geometric errors. However, when combining this methodology with feedback control systems, the accuracy of performed manipulations can only be stated within the bounds of the uncertainties in measurement. The dominant sources of error and uncertainty within each sensing subsystem are therefore identified, which leads to a formulation of the measurement uncertainty in the final system outputs, in addition to methods of reducing their magnitude. Specific attention is paid to the analysis of the vision-based subsystem utilised for the measurement of angular displacement. Furthermore, a feedback control scheme is employed to minimise tracking errors, and the coupling of certain measurement errors is shown to have a detrimental effect on the controller operation. The combination of controller tracking errors and measurement uncertainty provides the bounds on the final tracking performance. (paper)
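
    The formulation of measurement uncertainty in the final system outputs typically follows the root-sum-square combination of uncorrelated contributions; a minimal GUM-style sketch, in which the sensitivity coefficients and uncertainty values are placeholders rather than the paper's figures:

```python
import math

def combined_standard_uncertainty(contributions):
    """u_c = sqrt(sum((c_i * u_i)^2)) for uncorrelated inputs, where c_i is the
    sensitivity coefficient of the output to input i and u_i is the standard
    uncertainty of input i."""
    return math.sqrt(sum((c * u) ** 2 for c, u in contributions))

# e.g. a position output fed by a laser interferometer reading and an
# angle-induced (Abbe) error term -- purely illustrative numbers, in nm:
u_c = combined_standard_uncertainty([(1.0, 3.0), (1.0, 4.0)])  # -> 5.0
```

    Correlated error sources, such as the coupled linear and angular measurement errors the paper highlights, require adding covariance terms to this sum.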

  2. Questionnaire on the measurement condition of distribution coefficient

    International Nuclear Information System (INIS)

    Takebe, Shinichi; Kimura, Hideo; Matsuzuru, Hideo

    2001-05-01

    The distribution coefficient is used in various transport models to evaluate the migration behavior of radionuclides in the environment and is a very important parameter in the environmental impact assessment of nuclear facilities. A questionnaire survey was carried out for the purpose of supporting the proposal of a standard method for measuring the distribution coefficient. This report summarizes the results of the questionnaire on sampling methods and storage conditions, pretreatment methods, the items analyzed among the physical/chemical characteristics of the sample, and the distribution coefficient measurement methods and conditions used in research institutes within the country. (author)
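
    For the batch (shaking) tests that such questionnaires cover, the distribution coefficient is conventionally computed from the initial and equilibrium solution concentrations; a minimal sketch with illustrative variable names and values:

```python
def distribution_coefficient(c0, ce, solution_volume_ml, solid_mass_g):
    """Batch-sorption Kd in mL/g: activity sorbed per gram of solid divided by
    activity remaining per mL of solution, Kd = ((C0 - Ce) / Ce) * (V / m)."""
    return (c0 - ce) / ce * (solution_volume_ml / solid_mass_g)

# hypothetical run: initial 100 Bq/mL drops to 20 Bq/mL at equilibrium,
# 30 mL of solution shaken with 3 g of soil
kd = distribution_coefficient(c0=100.0, ce=20.0,
                              solution_volume_ml=30.0, solid_mass_g=3.0)
# ((100 - 20) / 20) * (30 / 3) = 40 mL/g
```

    The survey's point is that Kd values are only comparable when the shaking method, solid-to-solution ratio and solution chemistry behind numbers like these are reported alongside them.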

  3. OPTIMIZATION OF MICROWAVE AND AIR DRYING CONDITIONS OF QUINCE (CYDONIA OBLONGA, MILLER USING RESPONSE SURFACE METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Cem Baltacioglu

    2015-03-01

    Effects of slice thickness of quince (Cydonia oblonga Miller), microwave incident power and air drying temperature on the antioxidant activity and total phenolic content of quince were investigated during microwave and air drying. Optimum conditions were found to be: (i) for microwave drying, 285 W and 4.14 mm thickness (maximum antioxidant activity) and 285 W and 6.85 mm thickness (maximum total phenolic content); and (ii) for air drying, 75 ºC and 1.2 mm thickness (both maximum antioxidant activity and maximum total phenolic content). Drying conditions were optimized by using the response surface methodology. Thirteen experiments were carried out considering incident microwave powers from 285 to 795 W, air temperatures from 46 to 74 ºC and slice thicknesses from 1.2 to 6.8 mm.

  4. Luminosity measurement and beam condition monitoring at CMS

    Energy Technology Data Exchange (ETDEWEB)

    Leonard, Jessica Lynn [DESY, Zeuthen (Germany)

    2015-07-01

    The BRIL system of CMS consists of instrumentation to measure the luminosity online and offline, and to monitor the LHC beam conditions inside CMS. An accurate luminosity measurement is essential to the CMS physics program, and measurement of the beam background is necessary to ensure safe operation of CMS. In expectation of higher luminosity and denser proton bunch spacing during LHC Run II, many of the BRIL subsystems are being upgraded and others are being added to complement the existing measurements. The beam condition monitor (BCM) consists of several sets of diamond sensors used to measure online luminosity and beam background with single-bunch-crossing resolution. The BCM also detects when beam conditions become unfavorable for CMS running and may trigger a beam abort to protect the detector. The beam halo monitor (BHM) uses quartz bars to measure the background of the incoming beams at larger radii. The pixel luminosity telescope (PLT) consists of telescopes of silicon sensors designed to provide a CMS online and offline luminosity measurement. In addition, the forward hadronic calorimeter (HF) will deliver an independent luminosity measurement, making the whole system robust and allowing for cross-checks of the systematics. Data from each of the subsystems will be collected and combined in the BRIL DAQ framework, which will publish them to CMS and the LHC. The current installation status and commissioning results for the BRIL subsystems are given.

  5. Evaluation of methodological aspects of digestibility measurements in ponies fed different haylage to concentrate ratios

    NARCIS (Netherlands)

    Schaafstra, F J W C; van Doorn, D A; Schonewille, J T; van Riet, M M J; Hendriks, W H

    2017-01-01

    Methodological aspects of digestibility measurements were studied in four Welsh pony geldings consuming haylage-based diets with increasing proportions of a pelleted concentrate according to a 4×4 Latin square design experiment. Ponies were fed four experimental, iso-energetic (net energy (NE)

  6. Radioactivity measurement of the liquid effluents of two university hospitals: methodology and problems arising

    International Nuclear Information System (INIS)

    Basse-Cathalinat, B.; Barthe, N.; Chatti, K.; Ducassou, D.

    2005-01-01

    The authors present the methodology used to measure the radioactivity of the effluents at the outlet of two nuclear medicine departments located in two hospital complexes in the Bordeaux area. These measurements are intended to meet the requirements of circular DGS/DHOS no 2001/323 of the Ministry for Employment and Solidarity. The selected method is among the more powerful ones, since it is based on a set of very-low-background spectrometry systems. These measurement devices make it possible to take into account all the isotopes coming from a nuclear medicine department. The authors are aware that such measurements cannot be envisaged in every nuclear medicine department. Other technical articles will specify simpler methods allowing a satisfactory management of radioactive wastes. (author)

  7. Measurement of testosterone in human sexuality research: methodological considerations.

    Science.gov (United States)

    van Anders, Sari M; Goldey, Katherine L; Bell, Sarah N

    2014-02-01

    Testosterone (T) and other androgens are incorporated into an increasingly wide array of human sexuality research, but there are a number of issues that can affect or confound research outcomes. This review addresses various methodological issues relevant to research design in human studies with T; unaddressed, these issues may introduce unwanted noise, error, or conceptual barriers to interpreting results. Topics covered are (1) social and demographic factors (gender and sex; sexual orientations and sexual diversity; social/familial connections and processes; social location variables), (2) biological rhythms (diurnal variation; seasonality; menstrual cycles; aging and menopause), (3) sample collection, handling, and storage (saliva vs. blood; sialogogues, saliva, and tubes; sampling frequency, timing, and context; shipping samples), (4) health, medical issues, and the body (hormonal contraceptives; medications and nicotine; health conditions and stress; body composition, weight, and exercise), and (5) incorporating multiple hormones. Detailing a comprehensive set of important issues and relevant empirical evidence, this review provides a starting point for best practices in human sexuality research with T and other androgens that may be especially useful for those new to hormone research.

  8. Development of a methodology for conducting an integrated HRA/PRA --. Task 1, An assessment of human reliability influences during LP&S conditions PWRs

    Energy Technology Data Exchange (ETDEWEB)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [Wreathall (John) and Co., Dublin, OH (United States); Cooper, S.E. [Science Applications International Corp., McLean, VA (United States)

    1993-06-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  9. Translation and linguistic validation of the Pediatric Patient-Reported Outcomes Measurement Information System measures into simplified Chinese using cognitive interviewing methodology.

    Science.gov (United States)

    Liu, Yanyan; Hinds, Pamela S; Wang, Jichuan; Correia, Helena; Du, Shizheng; Ding, Jian; Gao, Wen Jun; Yuan, Changrong

    2013-01-01

    The Pediatric Patient-Reported Outcomes Measurement Information System (PROMIS) measures were developed using modern measurement theory and tested in a variety of settings to assess the quality of life, function, and symptoms of children and adolescents experiencing a chronic illness and its treatment. Developed in English, this set of measures had not been translated into Chinese. The objective of this study was to develop the Chinese version of the Pediatric PROMIS measures (C-Ped-PROMIS), specifically 8 short forms, and to pretest the translated measures in children and adolescents through cognitive interviewing methodology. The C-Ped-PROMIS was developed following the standard Functional Assessment of Chronic Illness Therapy Translation Methodology. Bilingual teams from the United States and China reviewed the translation to develop a provisional version, which was then pretested with cognitive interview by probing 10 native Chinese-speaking children aged 8 to 17 years in China. The translation was finalized by the bilingual teams. Most items, response options, and instructions were well understood by the children, and some revisions were made to address patient's comments during the cognitive interview. The results indicated that the C-Ped-PROMIS items were semantically and conceptually equivalent to the original. Children aged 8 to 17 years in China were able to comprehend these measures and express their experience and feelings about illness or their life. The C-Ped-PROMIS is available for psychometric validation. Future work will be directed at translating the rest of the item banks, calibrating them and creating a Chinese final version of the short forms.

  10. Advanced haptic sensor for measuring human skin conditions

    Science.gov (United States)

    Tsuchimi, Daisuke; Okuyama, Takeshi; Tanaka, Mami

    2010-01-01

    This paper is concerned with the development of a tactile sensor using PVDF (polyvinylidene fluoride) film as its sensory receptor to evaluate the softness, smoothness, and stickiness of human skin. Touch is, along with eyesight, one of the most important senses of the human body, and we can examine skin condition quickly using it. However, its subjectivity and ambiguity make it difficult to quantify skin conditions. Therefore, a measurement device which can evaluate skin conditions easily and objectively is in demand among dermatologists, the cosmetics industry, and others. In this paper, an advanced haptic sensor system that can measure multiple kinds of information on skin condition in various parts of the human body is developed. The applications of the sensor system to evaluating the softness, smoothness, and stickiness of skin are investigated through two experiments.

  11. Evaluation of methodological aspects of digestibility measurements in ponies fed different haylage to concentrate ratios

    NARCIS (Netherlands)

    Schaafstra, F.J.W.C.; Doorn, van D.A.; Schonewille, J.T.; Riet, van M.M.J.; Visser, P.; Blok, M.C.; Hendriks, W.H.

    2017-01-01

    Methodological aspects of digestibility measurements were studied in four Welsh pony geldings consuming haylage-based diets with increasing proportions of a pelleted concentrate according to a 4×4 Latin square design experiment. Ponies were fed four experimental, iso-energetic (net energy (NE)

  12. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  13. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  14. Methodology for application of field rainfall simulator to revise c-factor database for conditions of the Czech Republic

    Science.gov (United States)

    Neumann, Martin; Dostál, Tomáš; Krása, Josef; Kavka, Petr; Davidová, Tereza; Brant, Václav; Kroulík, Milan; Mistr, Martin; Novotný, Ivan

    2016-04-01

    The presentation will introduce a methodology for determining the crop and cover management factor (C-factor) for the Universal Soil Loss Equation (USLE) using a field rainfall simulator. The aim of the project is to determine the C-factor value for the different phenophases of the main crops of the Central European region, while also taking into account different agrotechnical methods. By using the field rainfall simulator, it is possible to perform the measurements in specific phenophases, which is otherwise difficult to execute due to the variability and fortuity of natural rainfall. Due to the number of measurements needed, two identical simulators will be used, operated by two independent teams, with a coordinated methodology. The methodology will mainly specify the length of simulation, the rainfall intensity, and the sampling technique. The presentation includes a more detailed account of the methods selected. Due to the wide range of variable crops and soils, it is not possible to execute the measurements for all possible combinations. We therefore decided to perform the measurements for previously selected combinations of soils, crops and agrotechnologies that are the most common in the Czech Republic. During the experiments, the volume of the surface runoff and the amount of sediment will be measured in their temporal distribution, as well as several other important parameters. The key values of the 3D matrix of combinations of crop, agrotechnique and soil will be determined experimentally. The remaining values will be determined by interpolation or by model analogy. There are several methods used for C-factor calculation from measured experimental data. Some of these are not suitable to be used considering the type of data gathered. The presentation will discuss the benefits and drawbacks of these methods, as well as the final design of the method used.
The problems concerning the selection of a relevant measurement method as well as the final
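
    One common way to combine per-phenophase measurements into an annual C-factor is to weight the soil-loss ratio of each phase by the fraction of annual rainfall erosivity occurring in that phase; a minimal sketch with made-up numbers, not the project's data:

```python
def c_factor(slr_by_phase, erosivity_fraction_by_phase):
    """Season-weighted C-factor: soil-loss ratios (cropped plot relative to
    clean-tilled fallow) for each phenophase, weighted by the fraction of
    annual rainfall erosivity occurring in that phase."""
    fractions = list(erosivity_fraction_by_phase)
    assert abs(sum(fractions) - 1.0) < 1e-9, "erosivity fractions must sum to 1"
    return sum(s * f for s, f in zip(slr_by_phase, fractions))

# hypothetical values for three phenophases (establishment, development, maturity)
c = c_factor([0.60, 0.30, 0.10], [0.20, 0.50, 0.30])
```

    The simulator experiments supply the soil-loss ratios for the measured crop/soil/agrotechnique combinations; interpolation or model analogy fills the rest of the 3D matrix.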

  15. Measuring hand hygiene compliance rates in different special care settings: a comparative study of methodologies

    Directory of Open Access Journals (Sweden)

    Thyago Pereira Magnus

    2015-04-01

    Conclusions: Hand hygiene compliance was reasonably high in these units, as measured by direct observation. However, a lack of correlation with results obtained by other methodologies brings into question the validity of direct observation results, and suggests that periodic audits using other methods may be needed.

  16. The Self-Concept. Volume 1, A Review of Methodological Considerations and Measuring Instruments. Revised Edition.

    Science.gov (United States)

    Wylie, Ruth C.

    This volume of the revised edition describes and evaluates measurement methods, research designs, and procedures which have been or might appropriately be used in self-concept research. Working from the perspective that self-concept or phenomenal personality theories can be scientifically investigated, methodological flaws and questionable…

  17. Methodology for assessing the impacts of distributed generation interconnection

    Directory of Open Access Journals (Sweden)

    Luis E. Luna

    2011-06-01

    Full Text Available This paper proposes a methodology for identifying and assessing the impact of distributed generation interconnection on distribution systems using Monte Carlo techniques. The methodology consists of two analysis schemes: a technical analysis, which evaluates the reliability of the distribution system, and an economic analysis, which evaluates the financial impact on the electric utility and its customers according to the system reliability level. The proposed methodology was applied to an IEEE test distribution system, considering different operation schemes for the distributed generation interconnection. Each of these schemes provided significant reliability improvements and important economic benefits for the electric utility. However, such schemes resulted in negative profitability levels for certain customers; regulatory measures and bilateral contracts were therefore proposed as a solution to this kind of problem.
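As a rough illustration of the Monte Carlo reliability analysis described above (not the paper's actual model), the sketch below samples yearly feeder faults and repair times to estimate expected energy not supplied (EENS), with an optional distributed-generation back-feed share. All rates, durations and names are invented for the example.

```python
import random

def yearly_faults(rng, lam):
    """Sample a Poisson(lam) fault count via exponential interarrival times."""
    t, n = rng.expovariate(lam), 0
    while t < 1.0:
        n += 1
        t += rng.expovariate(lam)
    return n

def simulate_eens(lam=0.5, mean_repair_h=4.0, load_mw=2.0,
                  dg_share=0.0, years=50000, seed=42):
    """Expected energy not supplied (MWh/yr) for one feeder; a DG unit able
    to back-feed a share `dg_share` of the load reduces unserved energy."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(years):
        for _ in range(yearly_faults(rng, lam)):
            outage_h = rng.expovariate(1.0 / mean_repair_h)
            total += outage_h * load_mw * (1.0 - dg_share)
    return total / years

base = simulate_eens()              # ≈ lam * mean_repair_h * load_mw = 4 MWh/yr
with_dg = simulate_eens(dg_share=0.5)
print(base > with_dg)               # DG back-feed reduces unserved energy
```

An economic layer like the paper's would then price the unserved energy difference against the DG interconnection cost.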

  18. Conditions of viscosity measurement for detecting irradiated peppers

    International Nuclear Information System (INIS)

    Hayashi, Toru; Todoriki, Setsuko; Okadome, Hiroshi; Kohyama, Kaoru

    1995-01-01

    Viscosity of gelatinized suspensions of black and white peppers decreased with dose. The viscosity was influenced by the gelatinization and viscosity-measurement conditions; the difference between unirradiated and irradiated pepper was larger at higher pH and gelatinization temperature. A viscosity parameter normalized by the starch content of the pepper sample and by the viscosity of a 5% corn starch suspension eliminated the influence of viscosity-measurement conditions such as viscometer type, shear rate and temperature. (author)
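A minimal sketch of the normalization idea as we read it: dividing a sample's measured viscosity by its starch content and by the viscosity of a 5% corn-starch reference measured on the same instrument, so that the instrument-specific factor cancels. The function name and all numbers are invented for illustration.

```python
# Hypothetical normalization: sample viscosity scaled by starch content and
# by a 5% corn-starch reference measured under the same conditions.
def normalized_viscosity(eta_sample_mpas, starch_fraction, eta_ref_mpas):
    return (eta_sample_mpas / starch_fraction) / eta_ref_mpas

# Same pepper measured on two instruments: the raw readings differ, but the
# instrument-specific reference cancels in the normalized parameter.
a = normalized_viscosity(120.0, 0.40, 300.0)  # instrument A (illustrative)
b = normalized_viscosity(80.0, 0.40, 200.0)   # instrument B (illustrative)
print(round(a, 3), round(b, 3))               # both ≈ 1.0
```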

  19. METHODOLOGICAL APPROACHES TO FORMATION OF CONDITIONS OF TRANSITION TO STEADY DEVELOPMENT OF THE CREDIT ORGANIZATIONS OF REGION

    Directory of Open Access Journals (Sweden)

    O.I. Pechonik

    2006-03-01

    Full Text Available Creating the conditions for a transition to sustainable development of regional credit organizations presupposes a scientific toolkit of a methodological character: a set of research techniques, methods and principles, the definition of which is the subject of this article. The research showed that the logic and scheme of analyzing how banking services support the region's economic system, and of forming the conditions for sustainable development of the regional banking system, should: be based on statistical methods using the system of national accounts, supplemented by a SWOT analysis of the banking system; form the conditions for the transition comprehensively and in combination; and manage the banking system's transition with active state participation, within the framework of a socially oriented planned-market economy. Under this approach, forming the conditions for the regional banking system's transition to sustainable development becomes, in our opinion, possible.

  20. Methodology for sample preparation and size measurement of commercial ZnO nanoparticles

    Directory of Open Access Journals (Sweden)

    Pei-Jia Lu

    2018-04-01

    Full Text Available This study discusses strategies for sample preparation to acquire images of sufficient quality for size characterization by scanning electron microscopy (SEM), using two commercial ZnO nanoparticles of different surface properties as a demonstration. The central idea is that micrometer-sized aggregates of powdered ZnO must first be broken down into nanosized particles through an appropriate process to generate a nanoparticle dispersion before deposition on a flat surface for SEM observation. Analytical tools such as contact angle, dynamic light scattering and zeta potential were used to optimize the sample-preparation procedure and to check the quality of the results. Measurements of zeta potential on flat surfaces also provide critical information, and save considerable time and effort, in selecting a suitable substrate that attracts and holds particles of different properties on the surface without further aggregation. This simple, low-cost methodology can be applied generally to size characterization of commercial ZnO nanoparticles for which vendor information is limited. Keywords: Zinc oxide, Nanoparticles, Methodology

  1. Radon flux measurement methodologies

    International Nuclear Information System (INIS)

    Nielson, K.K.; Rogers, V.C.

    1984-01-01

    Five methods for measuring radon fluxes are evaluated: the accumulator can, a small charcoal sampler, a large-area charcoal sampler, the ''Big Louie'' charcoal sampler, and the charcoal tent sampler. An experimental comparison of the five flux measurement techniques was also conducted. Excellent agreement was obtained between the measured radon fluxes and fluxes predicted from radium and emanation measurements

  2. Muscle dysmorphia: methodological issues, implications for research.

    Science.gov (United States)

    Suffolk, Mark T; Dovey, Terence M; Goodwin, Huw; Meyer, Caroline

    2013-01-01

    Muscle dysmorphia is a male-dominated, body image-related psychological condition. Despite continued investigation, contention surrounds the nosological status of this disorder. The aim of this article was to review the literature on muscle dysmorphia to provide a qualitative account of methodological issues that may inhibit our understanding. Key areas relating to non-standardized participant groups, measuring instruments, and terminology were identified as potentially inhibiting symptom coherence and diagnostic reliability. New measuring instruments validated with clinical samples and carefully described participant groups, standardized terminology, and a greater emphasis on prospective longitudinal research with specific sub groups of the weight training community would be of interest to the field.

  3. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist.

    Science.gov (United States)

    Terwee, Caroline B; Mokkink, Lidwine B; Knol, Dirk L; Ostelo, Raymond W J G; Bouter, Lex M; de Vet, Henrica C W

    2012-05-01

    The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist to calculate quality scores per measurement property when using the checklist in systematic reviews of measurement properties. The scoring system was developed based on discussions among experts and testing of the scoring system on 46 articles from a systematic review. Four response options were defined for each COSMIN item (excellent, good, fair, and poor). A quality score per measurement property is obtained by taking the lowest rating of any item in a box ("worst score counts"). Specific criteria for excellent, good, fair, and poor quality for each COSMIN item are described. In defining the criteria, the "worst score counts" algorithm was taken into consideration. This means that only fatal flaws were defined as poor quality. The scores of the 46 articles show how the scoring system can be used to provide an overview of the methodological quality of studies included in a systematic review of measurement properties. Based on experience in testing this scoring system on 46 articles, the COSMIN checklist with the proposed scoring system seems to be a useful tool for assessing the methodological quality of studies included in systematic reviews of measurement properties.
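The "worst score counts" rule described above lends itself to a one-line implementation. This is a minimal sketch of the rule only, with illustrative item ratings; it is not the full COSMIN scoring tool.

```python
# "Worst score counts": a COSMIN box's quality score is the lowest rating of
# any item in that box, so a single fatal flaw caps the whole box at "poor".
RATING = {"excellent": 3, "good": 2, "fair": 1, "poor": 0}

def box_score(item_ratings):
    return min(item_ratings, key=lambda r: RATING[r])

box = ["excellent", "good", "fair", "good"]  # illustrative item ratings
print(box_score(box))  # → 'fair': one fair item caps the whole box
```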

  4. Presentation of a methodology for measuring social acceptance of three hydrogen storage technologies and preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Noirot, I.; Bigay, C. N.

    2005-07-01

    Technologies (MASIT). This methodology takes into account the following points of view: technical, economic, environmental, social, and industrial/technological risk. MASIT is the methodology chosen to assess the hydrogen storage technologies developed during the StorHy project. Within the methodology, each point of view is defined by several criteria selected with car manufacturers and experts of each domain; each criterion is then quantified with the contribution of all partners involved in the project. While the technical, economic and environmental criteria are fairly objective (easy to define and measure), the social dimension is subjective and also highly variable, as it depends on perception and measurement at the level of the individual. The methodological work therefore consisted of improving the MASIT methodology from the social point of view. The methodology is applicable to the comparison of any other technologies, and it was implemented here to compare the storage technologies developed in the StorHy project for each application selected in the study (light vehicles, fleet vehicles, buses). (Author)

  5. Dielectric Barrier Discharge (DBD) Plasma Actuators Thrust-Measurement Methodology Incorporating New Anti-Thrust Hypothesis

    Science.gov (United States)

    Ashpis, David E.; Laun, Matthew C.

    2014-01-01

    We discuss thrust measurements of Dielectric Barrier Discharge (DBD) plasma actuators devices used for aerodynamic active flow control. After a review of our experience with conventional thrust measurement and significant non-repeatability of the results, we devised a suspended actuator test setup, and now present a methodology of thrust measurements with decreased uncertainty. The methodology consists of frequency scans at constant voltages. The procedure consists of increasing the frequency in a step-wise fashion from several Hz to the maximum frequency of several kHz, followed by frequency decrease back down to the start frequency of several Hz. This sequence is performed first at the highest voltage of interest, then repeated at lower voltages. The data in the descending frequency direction is more consistent and selected for reporting. Sample results show strong dependence of thrust on humidity which also affects the consistency and fluctuations of the measurements. We also observed negative values of thrust or "anti-thrust", at low frequencies between 4 Hz and up to 64 Hz. The anti-thrust is proportional to the mean-squared voltage and is frequency independent. Departures from the parabolic anti-thrust curve are correlated with appearance of visible plasma discharges. We propose the anti-thrust hypothesis. It states that the measured thrust is a sum of plasma thrust and anti-thrust, and assumes that the anti-thrust exists at all frequencies and voltages. The anti-thrust depends on actuator geometry and materials and on the test installation. It enables the separation of the plasma thrust from the measured total thrust. This approach enables more meaningful comparisons between actuators at different installations and laboratories. The dependence on test installation was validated by surrounding the actuator with a large diameter, grounded, metal sleeve.
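The separation implied by the anti-thrust hypothesis can be sketched numerically: since the anti-thrust is proportional to mean-squared voltage and frequency independent, its coefficient can be fitted from low-frequency data (where plasma thrust is negligible) and subtracted from high-frequency totals. All voltages and thrust values below are illustrative, not measurements from the paper.

```python
# Sketch of separating plasma thrust from measured total thrust under the
# anti-thrust hypothesis: T_measured = T_plasma - k * Vrms**2, with k fitted
# at low frequency where T_plasma ≈ 0. Numbers are invented.
def fit_anti_coeff(voltages, thrusts):
    """Least-squares k for T = -k * V**2 through the origin."""
    num = sum(-t * v**2 for v, t in zip(voltages, thrusts))
    den = sum(v**4 for v in voltages)
    return num / den

v_low = [4.0, 6.0, 8.0]        # kV rms, readings at ~10 Hz (illustrative)
t_low = [-0.16, -0.36, -0.64]  # mN, pure anti-thrust regime (illustrative)
k = fit_anti_coeff(v_low, t_low)  # ≈ 0.01 mN/kV²

def plasma_thrust(t_total, v):
    """Correct a high-frequency total-thrust reading for the anti-thrust."""
    return t_total + k * v**2

print(round(plasma_thrust(2.0, 8.0), 2))  # → 2.64
```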

  6. Use of cesium-137 methodology in the evaluation of superficial erosive processes

    International Nuclear Information System (INIS)

    Andrello, Avacir Casanova; Appoloni, Carlos Roberto; Guimaraes, Maria de Fatima; Nascimento Filho, Virgilio Franco do

    2003-01-01

    Superficial erosion is one of the main soil degradation agents, and estimating erosion rates for different edaphic and climatic conditions with conventional models such as USLE and RUSLE is expensive and time-consuming. The use of the anthropogenic radionuclide cesium-137 is a newer methodology that has been widely studied, and its application to soil erosion evaluation has grown in countries such as the USA, UK and Australia. A brief account of this methodology is presented, including the development of the equations used to quantify erosion rates from cesium-137 measurements. Two watersheds studied in Brazil have shown that the cesium-137 methodology is practicable and coherent with field surveys for application in erosion studies. (author)
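One of the simplest conversion equations used with cesium-137 inventories is the proportional model (after Walling and Quine), which assumes soil loss is proportional to the inventory reduction relative to an undisturbed reference site. The sketch below uses that model with purely illustrative values; it is not the equation set of the Brazilian study.

```python
# Hedged sketch of the Cs-137 "proportional model":
#   Y = 10 * d * B * X / (100 * T)
# Y: soil loss (t ha⁻¹ yr⁻¹), d: plough depth (m), B: bulk density (kg m⁻³),
# X: % inventory reduction vs reference, T: years since the 1963 fallout peak.
def proportional_model(d_m, bulk_kg_m3, inv_sample, inv_ref, t_years):
    x_pct = 100.0 * (inv_ref - inv_sample) / inv_ref  # % inventory loss
    return 10.0 * d_m * bulk_kg_m3 * x_pct / (100.0 * t_years)

# Illustrative values only: 20% inventory loss over 40 years
print(round(proportional_model(0.25, 1300.0, 1600.0, 2000.0, 40.0), 2))  # → 16.25
```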

  7. The effects of overtime work and task complexity on the performance of nuclear plant operators: A proposed methodology

    International Nuclear Information System (INIS)

    Banks, W.W.; Potash, L.

    1985-01-01

    This document presents a general methodology for determining the effect of overtime work and task complexity on operator performance in response to simulated out-of-limit nuclear plant conditions. The independent variables consist of three levels of overtime work and three levels of task complexity. Multiple dependent performance measures are proposed for use and discussion. Overtime work is operationally defined as the number of hours worked by nuclear plant operators beyond the traditional 8 hours per shift. Task complexity is operationalized as the number of operator tasks required to remedy a given anomalous plant condition and bring the plant back to a ''within limits'' or ''normal'' steady-state condition. The proposed methodology would employ a two-factor repeated-measures design along with the analysis of variance (linear) model

  8. Estimation of Aerodynamic Parameters in Conditions of Measurement

    Directory of Open Access Journals (Sweden)

    Htang Om Moung

    2017-01-01

    Full Text Available The paper discusses the problem of aircraft parameter identification in the presence of measurement noise. It is assumed that all signals involved in the identification process are subject to measurement noise, i.e., normally distributed random measurement errors. Simulation results are presented that show the relation between the noise standard deviations and the accuracy of identification.
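To illustrate the kind of noise-versus-accuracy relation the abstract reports, the sketch below identifies the slope of a toy linear model from noisy input and output signals. Note that noise on the input also biases the least-squares estimate (the classic errors-in-variables attenuation), so the identification error grows with the noise standard deviation. Everything here is an invented example, not the paper's aircraft model.

```python
import random

# Toy identification under measurement noise: estimate `a` in y = a*x when
# both the input and output records carry Gaussian measurement errors.
def identify_slope(a_true=2.0, sigma=0.1, n=2000, seed=0):
    rng = random.Random(seed)
    xs = [rng.uniform(-1, 1) for _ in range(n)]
    x_m = [x + rng.gauss(0, sigma) for x in xs]           # noisy input record
    y_m = [a_true * x + rng.gauss(0, sigma) for x in xs]  # noisy output record
    # ordinary least squares through the origin
    return sum(xm * ym for xm, ym in zip(x_m, y_m)) / sum(xm * xm for xm in x_m)

err_small = abs(identify_slope(sigma=0.01) - 2.0)
err_large = abs(identify_slope(sigma=0.5) - 2.0)
print(err_small < err_large)  # noisier signals → larger identification error
```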

  9. Measurement of Quality of Life I. A Methodological Framework

    Directory of Open Access Journals (Sweden)

    Soren Ventegodt

    2003-01-01

    Full Text Available Despite the widespread acceptance of quality of life (QOL) as the ideal guideline in healthcare and clinical research, serious conceptual and methodological problems continue to plague this area. In an attempt to remedy this situation, we propose seven criteria that a quality-of-life concept must meet to provide a sound basis for investigation by questionnaire. The seven criteria or desiderata are: (1) an explicit definition of quality of life; (2) a coherent philosophy of human life from which the definition is derived; (3) a theory that operationalizes the philosophy by specifying unambiguous, nonoverlapping, and jointly exhaustive questionnaire items; (4) response alternatives that permit a fraction-scale interpretation; (5) technical checks of reproducibility; (6) meaningfulness to investigators, respondents, and users; and (7) an overall aesthetic appeal of the questionnaire. These criteria have guided the design of a validated 5-item generic, global quality-of-life questionnaire (QOL5), and a validated 317-item generic, global quality-of-life questionnaire (SEQOL), administered to a well-documented birth cohort of 7,400 Danes born in 1959-1961, as well as to a reference sample of 2,500 Danes. Presented in outline, the underlying integrative quality-of-life (IQOL) theory is a meta-theory. To illustrate the seven criteria at work, we show the extent to which they are satisfied by one of the eight component theories. Next, two sample results of our investigation are presented: satisfaction with one's sex life has the expected covariation with one's quality of life, and so does mother's smoking during pregnancy, albeit to a much smaller extent. It is concluded that the methodological framework presented has proved helpful in designing a questionnaire that is capable of yielding acceptably valid and reliable measurements of global and generic quality of life.

  10. Extreme Sea Conditions in Shallow Water: Estimation based on in-situ measurements

    Science.gov (United States)

    Le Crom, Izan; Saulnier, Jean-Baptiste

    2013-04-01

    The design of marine renewable energy devices and components is based, among other things, on the assessment of extreme environmental conditions (winds, currents, waves, and water level), which must be combined in order to evaluate the maximal loads on a floating or fixed structure and on its anchoring system over a given return period. Measuring devices are generally deployed at sea over relatively short durations (a few months to a few years), typically when describing water free-surface elevation, so extrapolation methods based on hindcast data (and therefore on wave simulation models) have to be used. How to combine the action of the different loads (winds and waves, for instance) in a realistic way, and which correlation of return periods should be used, are highly topical issues. The assessment of the extreme condition itself, however, remains a not-fully-solved, crucial, and sensitive task. Above all in shallow water, the extreme wave height, Hmax, is the most significant contribution in the dimensioning of marine renewable energy devices. As a case study, existing methodologies for deep water have been applied to SEMREV, the French marine energy test site. The interest of this study goes beyond the application to the wave energy converters and floating wind turbines deployed at SEMREV: it could also be extended to the Banc de Guérande offshore wind farm planned close by and, more generally, to pipes and communication cables, for which this is a recurrent problem. The paper will first present the existing measurements (wave and wind on site), the prediction chain developed via wave models, and the extrapolation methods applied to hindcast data, and will try to formulate recommendations for improving this assessment in shallow water.
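A hedged sketch of the extrapolation step such studies rely on: fit an extreme-value distribution to annual-maximum wave heights and read off a long-return-period level. Here a Gumbel distribution is fitted by the method of moments; the data are synthetic, and a real site assessment would use longer records, peaks-over-threshold methods and shallow-water depth limits.

```python
import math
import statistics

# Gumbel fit (method of moments) to annual maxima, then the T-year return
# level x_T = mu - beta * ln(-ln(1 - 1/T)). Values below are synthetic.
def gumbel_return_level(annual_maxima, return_period):
    m = statistics.mean(annual_maxima)
    s = statistics.stdev(annual_maxima)
    beta = s * math.sqrt(6) / math.pi
    mu = m - 0.5772 * beta          # Euler–Mascheroni correction
    return mu - beta * math.log(-math.log(1 - 1 / return_period))

hs_max = [4.1, 5.0, 3.8, 4.6, 5.4, 4.9, 4.2, 5.8, 4.4, 5.1]  # m, synthetic
h50 = gumbel_return_level(hs_max, 50)
print(h50 > max(hs_max))  # the 50-year level extrapolates beyond the record
```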

  11. Dissociating Contingency Awareness and Conditioned Attitudes: Evidence of Contingency-Unaware Evaluative Conditioning

    Science.gov (United States)

    Hutter, Mandy; Sweldens, Steven; Stahl, Christoph; Unkelbach, Christian; Klauer, Karl Christoph

    2012-01-01

    Whether human evaluative conditioning can occur without contingency awareness has been the subject of an intense and ongoing debate for decades, troubled by a wide array of methodological difficulties. Following recent methodological innovations, the available evidence currently points to the conclusion that evaluative conditioning effects do not…

  12. Methodology to estimate parameters of an excitation system based on experimental conditions

    Energy Technology Data Exchange (ETDEWEB)

    Saavedra-Montes, A.J. [Carrera 80 No 65-223, Bloque M8 oficina 113, Escuela de Mecatronica, Universidad Nacional de Colombia, Medellin (Colombia); Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Ramirez-Scarpetta, J.M. [Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Malik, O.P. [2500 University Drive N.W., Electrical and Computer Engineering Department, University of Calgary, Calgary, Alberta (Canada)

    2011-01-15

    A methodology to estimate the parameters of a potential-source controlled rectifier excitation system model is presented in this paper. The proposed parameter estimation methodology is based on the characteristics of the excitation system. A comparison of two pseudo random binary signals, two sampling periods for each one, and three estimation algorithms is also presented. Simulation results from an excitation control system model and experimental results from an excitation system of a power laboratory setup are obtained. To apply the proposed methodology, the excitation system parameters are identified at two different levels of the generator saturation curve. The results show that it is possible to estimate the parameters of the standard model of an excitation system, recording two signals and the system operating in closed loop with the generator. The normalized sum of squared error obtained with experimental data is below 10%, and with simulation data is below 5%. (author)
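Pseudo-random binary sequences of the kind compared in the abstract are commonly generated with a maximal-length linear-feedback shift register. The sketch below shows a standard 7-bit PRBS generator (polynomial x⁷ + x⁶ + 1); it illustrates the excitation-signal idea only and is not the paper's specific signal design.

```python
# PRBS7 via a 7-bit Fibonacci LFSR with taps at bits 7 and 6 (x^7 + x^6 + 1),
# a maximal polynomial giving a period of 2^7 - 1 = 127 bits.
def prbs7(n_bits, seed=0x7F):
    state, out = seed, []
    for _ in range(n_bits):
        out.append(state & 1)                      # output bit
        fb = ((state >> 6) ^ (state >> 5)) & 1     # feedback from taps
        state = ((state << 1) | fb) & 0x7F
    return out

seq = prbs7(127)
print(sum(seq))  # → 64: a maximal 7-bit PRBS has 64 ones per 127-bit period
```

In an identification experiment, the binary levels would be mapped to a small perturbation around the excitation system's operating point, with the bit period chosen against the sampling period.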

  13. Measurements of integrated components' parameters versus irradiation doses gamma radiation (60Co) dosimetry-methodology-tests

    International Nuclear Information System (INIS)

    Fuan, J.

    1991-01-01

    This paper describes the methodology used for the irradiation of the integrated components and the measurements of their parameters, using Quality Insurance of dosimetry: - Measurement of the integrated dose using the competences of the Laboratoire Central des Industries Electriques (LCIE): - Measurement of irradiation dose versus source/component distance, using a calibrated equipment. - Use of ALANINE dosimeters, placed on the support of the irradiated components. - Assembly and polarization of components during the irradiations. Selection of the irradiator. - Measurement of the irradiated components's parameters, using the competences of the societies: - GenRad: GR130 tests equipement placed in the DEIN/SIR-CEN SACLAY. - Laboratoire Central des Industries Electriques (LCIE): GR125 tests equipment and this associated programmes test [fr

  14. Ultrasonic Technique for Density Measurement of Liquids in Extreme Conditions

    Science.gov (United States)

    Kazys, Rymantas; Sliteris, Reimondas; Rekuviene, Regina; Zukauskas, Egidijus; Mazeika, Liudas

    2015-01-01

    An ultrasonic technique, invariant to temperature changes, for a density measurement of different liquids under in situ extreme conditions is presented. The influence of geometry and material parameters of the measurement system (transducer, waveguide, matching layer) on measurement accuracy and reliability is analyzed theoretically along with experimental results. The proposed method is based on measurement of the amplitude of the ultrasonic wave, reflected from the interface of the solid/liquid medium under investigation. In order to enhance sensitivity, the use of a quarter wavelength acoustic matching layer is proposed. Therefore, the sensitivity of the measurement system increases significantly. Density measurements quite often must be performed in extreme conditions at high temperature (up to 220 °C) and high pressure. In this case, metal waveguides between piezoelectric transducer and the measured liquid are used in order to protect the conventional transducer from the influence of high temperature and to avoid depolarization. The presented ultrasonic density measurement technique is suitable for density measurement in different materials, including liquids and polymer melts in extreme conditions. A new calibration algorithm was proposed. The metrological evaluation of the measurement method was performed. The expanded measurement uncertainty Uρ = 7.4 × 10−3 g/cm3 (1%). PMID:26262619

  15. Ultrasonic Technique for Density Measurement of Liquids in Extreme Conditions.

    Science.gov (United States)

    Kazys, Rymantas; Sliteris, Reimondas; Rekuviene, Regina; Zukauskas, Egidijus; Mazeika, Liudas

    2015-08-07

    An ultrasonic technique, invariant to temperature changes, for a density measurement of different liquids under in situ extreme conditions is presented. The influence of geometry and material parameters of the measurement system (transducer, waveguide, matching layer) on measurement accuracy and reliability is analyzed theoretically along with experimental results. The proposed method is based on measurement of the amplitude of the ultrasonic wave, reflected from the interface of the solid/liquid medium under investigation. In order to enhance sensitivity, the use of a quarter wavelength acoustic matching layer is proposed. Therefore, the sensitivity of the measurement system increases significantly. Density measurements quite often must be performed in extreme conditions at high temperature (up to 220 °C) and high pressure. In this case, metal waveguides between piezoelectric transducer and the measured liquid are used in order to protect the conventional transducer from the influence of high temperature and to avoid depolarization. The presented ultrasonic density measurement technique is suitable for density measurement in different materials, including liquids and polymer melts in extreme conditions. A new calibration algorithm was proposed. The metrological evaluation of the measurement method was performed. The expanded measurement uncertainty Uρ = 7.4 × 10−3 g/cm3 (1%).

  16. Ultrasonic Technique for Density Measurement of Liquids in Extreme Conditions

    Directory of Open Access Journals (Sweden)

    Rymantas Kazys

    2015-08-01

    Full Text Available An ultrasonic technique, invariant to temperature changes, for a density measurement of different liquids under in situ extreme conditions is presented. The influence of geometry and material parameters of the measurement system (transducer, waveguide, matching layer) on measurement accuracy and reliability is analyzed theoretically along with experimental results. The proposed method is based on measurement of the amplitude of the ultrasonic wave, reflected from the interface of the solid/liquid medium under investigation. In order to enhance sensitivity, the use of a quarter wavelength acoustic matching layer is proposed. Therefore, the sensitivity of the measurement system increases significantly. Density measurements quite often must be performed in extreme conditions at high temperature (up to 220 °C) and high pressure. In this case, metal waveguides between piezoelectric transducer and the measured liquid are used in order to protect the conventional transducer from the influence of high temperature and to avoid depolarization. The presented ultrasonic density measurement technique is suitable for density measurement in different materials, including liquids and polymer melts in extreme conditions. A new calibration algorithm was proposed. The metrological evaluation of the measurement method was performed. The expanded measurement uncertainty Uρ = 7.4 × 10−3 g/cm3 (1%).

  17. Model development and optimization of operating conditions to maximize PEMFC performance by response surface methodology

    International Nuclear Information System (INIS)

    Kanani, Homayoon; Shams, Mehrzad; Hasheminasab, Mohammadreza; Bozorgnezhad, Ali

    2015-01-01

    Highlights: • The optimization of the operating parameters in a serpentine PEMFC is done using RSM. • The RSM model can predict the cell power over the wide range of operating conditions. • St-An, St-Ca and RH-Ca have an optimum value to obtain the best performance. • The interactions of the operating conditions affect the output power significantly. • The cathode and anode stoichiometry are the most effective parameters on the power. - Abstract: Optimizing operating conditions for maximum power could play a significant role in reducing the costs of the emerging PEMFC technology. In the present experimental study, a single serpentine PEMFC is used to investigate the effects of operating conditions on the electrical power production of the cell. Four significant parameters, including cathode stoichiometry, anode stoichiometry, gas inlet temperature, and cathode relative humidity, are studied using Design of Experiments (DOE) to obtain an optimal power. A central composite second-order Response Surface Methodology (RSM) is used to model the relationship between the goal function (power) and the input parameters (operating conditions). This statistical-mathematical method yields a second-order equation for the cell power that accounts for interactions and quadratic effects of the operating conditions and predicts the maximum or minimum power over the entire working range of the parameters. In this range, high cathode stoichiometry combined with low anode stoichiometry results in minimum cell power; conversely, the medium range of fuel and oxidant stoichiometry leads to maximum power. Results show that there is an optimum value of anode stoichiometry, cathode stoichiometry and relative humidity for the best performance. The predictions of the model are evaluated against experimental tests and are in good agreement over the different ranges of the parameters
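The response-surface idea above can be reduced to a toy one-factor example: fit a second-order model y = b0 + b1·x + b2·x² to (operating condition, power) data and locate the stationary point. Real RSM work fits a multi-factor quadratic with interaction terms; the data below are synthetic.

```python
# Toy RSM sketch: least-squares quadratic fit by solving the 3x3 normal
# equations with Gauss-Jordan elimination, then finding the stationary point.
def fit_quadratic(xs, ys):
    s = [sum(x**k for x in xs) for k in range(5)]
    t = [sum(y * x**k for x, y in zip(xs, ys)) for k in range(3)]
    a = [[s[0], s[1], s[2], t[0]],
         [s[1], s[2], s[3], t[1]],
         [s[2], s[3], s[4], t[2]]]
    for i in range(3):                       # Gauss-Jordan, partial pivoting
        p = max(range(i, 3), key=lambda r: abs(a[r][i]))
        a[i], a[p] = a[p], a[i]
        for r in range(3):
            if r != i:
                f = a[r][i] / a[i][i]
                a[r] = [x - f * y for x, y in zip(a[r], a[i])]
    return tuple(a[i][3] / a[i][i] for i in range(3))

# Synthetic "cell power vs cathode stoichiometry" data with an interior peak
xs = [1.0, 1.5, 2.0, 2.5, 3.0]
ys = [10.0, 14.0, 15.0, 14.2, 10.5]
b0, b1, b2 = fit_quadratic(xs, ys)
x_opt = -b1 / (2 * b2)      # stationary point of the fitted parabola
print(1.5 < x_opt < 2.5)    # optimum lies inside the tested range
```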

  18. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    Science.gov (United States)

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-02

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor, followed by measurement of the luminescence of the bacteria. The assays were conducted at contact times of 5, 15, and 30 min by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided precise control of the reaction conditions, an asset for the analysis of a large number of samples. The developed methodology was applied to evaluating the impact of a set of ionic liquids (ILs) on V. fischeri, and the results were compared with those provided by a conventional assay kit (Biotox®). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions had a higher impact on V. fischeri, evidenced by lower EC50 values. The proposed methodology was validated through statistical analysis, which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested on more classes of compounds and used as an alternative to microplate-based V. fischeri assay kits.

  19. Practical methodological guide for hydrometric inter-laboratory organisation

    Science.gov (United States)

    Besson, David; Bertrand, Xavier

    2015-04-01

    Discharge measurements performed by the French governmental hydrometry teams feed a national database. These data are available for general knowledge of river flows, flood forecasting, low-water monitoring, statistical flow calculations, regulatory flow control and many other uses. Regularly checking measurement quality, and better quantifying its accuracy, is therefore an absolute need. The practice of inter-laboratory comparison in hydrometry has developed considerably over the last decade. Indeed, a discharge measurement cannot easily be linked to a standard, so controlling on-site measurement accuracy is very difficult; inter-laboratory comparison is a practical solution to this issue. However, it needs some regulation in order to ease its practice and legitimize its results. To this end, the French governmental hydrometry teams produced a practical methodological guide for organizing hydrometric inter-laboratory comparisons, intended for the hydrometry community, to harmonize inter-laboratory comparison practices across instruments (ADCP, current meter on wading rod or from a gauging van, tracer dilution, surface velocity) and flow ranges (flood, low water), and to ensure that results are formalized and archived. The guide is grounded in the experience of the governmental teams and their partners, following existing approaches (notably the Doppler group). It is designed to validate measurement compliance and to identify outliers, whether hardware-related, methodological, environmental, or human. Inter-laboratory comparison provides the means to verify the compliance of the instruments (devices + methods + operators) and provides methods to determine an experimental uncertainty of the tested measurement method, which is valid only for the site and the measurement conditions but does not address the calibration or periodic monitoring of the materials. After some conceptual definitions, the guide describes the different stages of an

  20. OPTIMIZATION OF PRETREATMENT CONDITIONS OF CARROTS TO MAXIMIZE JUICE RECOVERY BY RESPONSE SURFACE METHODOLOGY

    Directory of Open Access Journals (Sweden)

    H. K. SHARMA

    2006-12-01

    Full Text Available Carrot juice was expressed in a hydraulic press using a wooden set-up. Carrot samples pretreated at different designed combinations of pH, temperature and time, using a Central Composite Rotatable Design (CCRD) under Response Surface Methodology (RSM), were pressed, and the juice so obtained was characterized for various physico-chemical parameters: yield, TSS, water content, reducing sugars, total sugars and color (absorbance). The study indicated that carrots exposed to the different pretreatment conditions gave a higher juice yield than the control. The responses were optimized by the numerical method and the optima were found to be 78.23% yield, 0.93 color (absorbance), 3.41% reducing sugars, 5.53% total sugars, 6.69 °Brix, and 90.50% water content. All the derived mathematical models for the various responses were found to fit significantly and to predict the data well.
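The RSM step described above, fitting a second-order polynomial to the designed runs and then locating the optimum, can be sketched for two factors as follows (a generic illustration, not the paper's actual model or data):

```python
import numpy as np

def fit_quadratic_surface(X, z):
    """Fit a second-order response surface
    z ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    by ordinary least squares (two factors shown for brevity)."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coef

def predict(coef, x1, x2):
    """Evaluate the fitted surface at a new factor setting."""
    return (coef[0] + coef[1] * x1 + coef[2] * x2
            + coef[3] * x1**2 + coef[4] * x2**2 + coef[5] * x1 * x2)
```

Numerical optimization of the fitted surface (e.g. over a fine grid of pH, temperature and time within the design region) then gives the reported optima.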

  1. Optimizing the conditions for hydrothermal liquefaction of barley straw for bio-crude oil production using response surface methodology

    DEFF Research Database (Denmark)

    Zhu, Zhe; Rosendahl, Lasse Aistrup; Toor, Saqib Sohail

    2018-01-01

    The present paper examines the conversion of barley straw to bio-crude oil (BO) via hydrothermal liquefaction. Response surface methodology based on central composite design was utilized to optimize the conditions of four independent variables including reaction temperature (factor X1, 260-340 °C...... phenols and their derivatives, acids, aromatic hydrocarbons, ketones, N-containing compounds and alcohols, which makes it a promising material for application either as a bio-fuel or as a phenol substitute in bio-phenolic resins....

  2. Drosophila Courtship Conditioning As a Measure of Learning and Memory.

    Science.gov (United States)

    Koemans, Tom S; Oppitz, Cornelia; Donders, Rogier A T; van Bokhoven, Hans; Schenck, Annette; Keleman, Krystyna; Kramer, Jamie M

    2017-06-05

    Many insights into the molecular mechanisms underlying learning and memory have been elucidated through the use of simple behavioral assays in model organisms such as the fruit fly, Drosophila melanogaster. Drosophila is useful for understanding the basic neurobiology underlying cognitive deficits resulting from mutations in genes associated with human cognitive disorders, such as intellectual disability (ID) and autism. This work describes a methodology for testing learning and memory using a classic paradigm in Drosophila known as courtship conditioning. Male flies court females using a distinct pattern of easily recognizable behaviors. Premated females are not receptive to mating and will reject the male's copulation attempts. In response to this rejection, male flies reduce their courtship behavior. This learned reduction in courtship behavior is measured over time, serving as an indicator of learning and memory. The basic numerical output of this assay is the courtship index (CI), which is defined as the percentage of time that a male spends courting during a 10 min interval. The learning index (LI) is the relative reduction of CI in flies that have been exposed to a premated female compared to naïve flies with no previous social encounters. For the statistical comparison of LIs between genotypes, a randomization test with bootstrapping is used. To illustrate how the assay can be used to address the role of a gene relating to learning and memory, the pan-neuronal knockdown of Dihydroxyacetone phosphate acyltransferase (Dhap-at) was characterized here. The human ortholog of Dhap-at, glyceronephosphate O-acyltransferase (GNPT), is involved in rhizomelic chondrodysplasia punctata type 2, an autosomal-recessive syndrome characterized by severe ID. Using the courtship conditioning assay, it was determined that Dhap-at is required for long-term memory, but not for short-term memory. 
This result serves as a basis for further investigation of the underlying molecular
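The CI and LI quantities and the label-randomization test described above can be sketched as follows (an illustrative permutation version; the authors' exact bootstrapping procedure may differ in detail):

```python
import random

def courtship_index(courting_seconds, total_seconds=600):
    """CI: percentage of a 10-min observation interval spent courting."""
    return 100.0 * courting_seconds / total_seconds

def learning_index(ci_trained, ci_naive):
    """LI: relative reduction of mean CI in trained vs. naive males."""
    m_naive = sum(ci_naive) / len(ci_naive)
    m_trained = sum(ci_trained) / len(ci_trained)
    return (m_naive - m_trained) / m_naive

def randomization_p(ci_trained, ci_naive, n_iter=10000, seed=0):
    """One-sided randomization test: how often does shuffling the group
    labels produce an LI at least as large as the observed one?"""
    rng = random.Random(seed)
    observed = learning_index(ci_trained, ci_naive)
    pooled = list(ci_trained) + list(ci_naive)
    n_t = len(ci_trained)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        if learning_index(pooled[:n_t], pooled[n_t:]) >= observed:
            hits += 1
    return hits / n_iter
```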

  3. Critical infrastructure systems of systems assessment methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a "willingness to pay" avoidance approach.

  4. Comparing Structural Identification Methodologies for Fatigue Life Prediction of a Highway Bridge

    Directory of Open Access Journals (Sweden)

    Sai G. S. Pai

    2018-01-01

    Full Text Available Accurate measurement-data interpretation leads to increased understanding of structural behavior and enhanced asset-management decision making. In this paper, four data-interpretation methodologies, residual minimization, traditional Bayesian model updating, modified Bayesian model updating (with an L∞-norm-based Gaussian likelihood function, and error-domain model falsification (EDMF, a method that rejects models that have unlikely differences between predictions and measurements, are compared. In the modified Bayesian model updating methodology, a correction is used in the likelihood function to account for the effect of a finite number of measurements on posterior probability–density functions. The application of these data-interpretation methodologies for condition assessment and fatigue life prediction is illustrated on a highway steel–concrete composite bridge having four spans with a total length of 219 m. A detailed 3D finite-element plate and beam model of the bridge and weigh-in-motion data are used to obtain the time–stress response at a fatigue critical location along the bridge span. The time–stress response, presented as a histogram, is compared to measured strain responses either to update prior knowledge of model parameters using residual minimization and Bayesian methodologies or to obtain candidate model instances using the EDMF methodology. It is concluded that the EDMF and modified Bayesian model updating methodologies provide robust prediction of fatigue life compared with residual minimization and traditional Bayesian model updating in the presence of correlated non-Gaussian uncertainty. EDMF has additional advantages due to ease of understanding and applicability for practicing engineers, thus enabling incremental asset-management decision making over long service lives. Finally, parallel implementations of EDMF using grid sampling have lower computation times than implementations using adaptive sampling.
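The core of EDMF, rejecting candidate model instances whose prediction-measurement differences fall outside uncertainty-derived bounds, can be sketched as follows (a generic illustration; `predict` and the thresholds are placeholders, and in practice the bounds are computed from combined model and measurement uncertainties):

```python
def falsify(model_instances, predict, measurements, thresholds):
    """Error-domain model falsification sketch: keep candidate parameter
    sets whose residual (prediction minus measurement) lies inside the
    threshold interval at every sensor; reject the rest."""
    candidates = []
    for theta in model_instances:
        residuals = [predict(theta, i) - y for i, y in enumerate(measurements)]
        if all(lo <= r <= hi for r, (lo, hi) in zip(residuals, thresholds)):
            candidates.append(theta)
    return candidates
```

The surviving candidate instances are then all carried forward to predict fatigue life, which is what makes the approach robust to correlated non-Gaussian uncertainty.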

  5. Optimisation of Ultrasound-Assisted Extraction Conditions for Phenolic Content and Antioxidant Capacity from Euphorbia tirucalli Using Response Surface Methodology

    Science.gov (United States)

    Vuong, Quan V.; Goldsmith, Chloe D.; Dang, Trung Thanh; Nguyen, Van Tang; Bhuyan, Deep Jyoti; Sadeqzadeh, Elham; Scarlett, Christopher J.; Bowyer, Michael C.

    2014-01-01

    Euphorbia tirucalli (E. tirucalli) is now widely distributed around the world and is well known as a source of traditional medicine in many countries. This study aimed to utilise response surface methodology (RSM) to optimise ultrasonic-assisted extraction (UAE) conditions for total phenolic compounds (TPC) and antioxidant capacity from E. tirucalli leaf. The results showed that ultrasonic temperature, time and power affected TPC and antioxidant capacity; however, the effects varied. Ultrasonic power had the strongest influence on TPC, whereas ultrasonic temperature had the greatest impact on antioxidant capacity. Ultrasonic time had the least impact on both TPC and antioxidant capacity. The optimum UAE conditions were determined to be 50 °C, 90 min and 200 W. Under these conditions, the E. tirucalli leaf extract yielded 2.93 mg GAE/g FW of TPC and exhibited potent antioxidant capacity. These conditions can be utilised for further isolation and purification of phenolic compounds from E. tirucalli leaf. PMID:26785074

  6. Exhaled nitric oxide measurements in the first 2 years of life: methodological issues, clinical and epidemiological applications

    Directory of Open Access Journals (Sweden)

    de Benedictis Fernando M

    2009-07-01

    Full Text Available Abstract Fractional exhaled nitric oxide (FeNO is a useful tool to diagnose and monitor eosinophilic bronchial inflammation in asthmatic children and adults. In children younger than 2 years of age FeNO has been successfully measured both with the tidal breathing and with the single breath techniques. However, there are a number of methodological issues that need to be addressed in order to increase the reproducibility of the FeNO measurements within and between infants. Indeed, a standardized method to measure FeNO in the first 2 years of life would be extremely useful in order to meaningfully interpret FeNO values in this age group. Several factors related to the measurement conditions have been found to influence FeNO, such as expiratory flow, ambient NO and nasal contamination. Furthermore, the exposure to pre- and postnatal risk factors for respiratory morbidity has been shown to influence FeNO values. Therefore, these factors should always be assessed and their association with FeNO values in the specific study population should be evaluated and, eventually, controlled for. There is evidence consistently suggesting that FeNO is increased in infants with family history of atopy/atopic diseases and in infants with recurrent wheezing. These findings could support the hypothesis that eosinophilic bronchial inflammation is present at an early stage in those infants at increased risk of developing persistent respiratory symptoms and asthma. Furthermore, it has been shown that FeNO measurements could represent a useful tool to assess bronchial inflammation in other airways diseases, such as primary ciliary dyskinesia, bronchopulmonary dysplasia and cystic fibrosis. 
Further studies are needed in order to improve the reproducibility of the measurements, and large prospective studies are warranted in order to evaluate whether FeNO values measured in the first years of life can predict the future development of asthma or other respiratory diseases.

  7. A study of calculation methodology and experimental measurements of the kinetic parameters for source driven subcritical systems

    International Nuclear Information System (INIS)

    Lee, Seung Min

    2009-01-01

    This work presents a theoretical study of reactor kinetics focusing on the methodology of calculation and the experimental measurement of the so-called kinetic parameters. A comparison between the methodology based on Dulla's formalism and the classical method is made. The objective is to exhibit the dependence of the parameters on the subcriticality level and the perturbation. Two different slab-type systems were considered, a thermal one and a fast one, both with homogeneous media. A one-group diffusion model was used for the fast reactor and a two-group diffusion model for the thermal system, considering in both cases only one family of precursors. The solutions were obtained using the expansion method. Descriptions of the main experimental methods for measuring the kinetic parameters are also presented, in order to raise the question of the compatibility of these methods in the subcritical region. (author)

  8. Aircraft and ground vehicle friction measurements obtained under winter runway conditions

    Science.gov (United States)

    Yager, Thomas J.

    1989-01-01

    Tests with specially instrumented NASA B-737 and B-727 aircraft together with several different ground friction measuring devices have been conducted for a variety of runway surface types and wetness conditions. This effort is part of the Joint FAA/NASA Aircraft/Ground Vehicle Runway Friction Program aimed at obtaining a better understanding of aircraft ground handling performance under adverse weather conditions, and defining relationships between aircraft and ground vehicle tire friction measurements. Aircraft braking performance on dry, wet, snow-, and ice-covered runway conditions is discussed together with ground vehicle friction data obtained under similar runway conditions. For the wet, compacted snow- and ice-covered runway conditions, the relationship between ground vehicles and aircraft friction data is identified. The influence of major test parameters on friction measurements such as speed, test tire characteristics, and surface contaminant-type are discussed. The test results indicate that use of properly maintained and calibrated ground vehicles for monitoring runway friction conditions should be encouraged particularly under adverse weather conditions.

  9. Classification of heart valve condition using acoustic measurements

    Energy Technology Data Exchange (ETDEWEB)

    Clark, G. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    Prosthetic heart valves and the many great strides in valve design have been responsible for extending the life spans of many people with serious heart conditions. Even though prosthetic valves are extremely reliable, they are eventually susceptible to the long-term fatigue and structural failure effects expected of mechanical devices operating over long periods of time. The purpose of our work is to classify the condition of in vivo Bjork-Shiley Convexo-Concave (BSCC) heart valves by processing acoustic measurements of heart valve sounds. The structural failure of interest for BSCC valves is called single leg separation (SLS). SLS can occur if the outlet strut cracks and separates from the main structure of the valve. We measure acoustic opening and closing sounds (waveforms) using high-sensitivity contact microphones on the patient's thorax. For our analysis, we focus our processing and classification efforts on the opening sounds because they yield direct information about outlet strut condition with minimal distortion caused by energy radiated from the valve disc.
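The classification idea, extracting features from the opening-sound waveforms and assigning a valve condition, can be sketched with toy features and a nearest-centroid rule (the abstract does not specify the actual features or classifier, so everything below is a stand-in illustration):

```python
import math

def features(waveform):
    """Two toy features of an opening-sound waveform: RMS energy and
    zero-crossing rate (stand-ins for the spectral features a real
    valve-condition classifier would use)."""
    n = len(waveform)
    rms = math.sqrt(sum(x * x for x in waveform) / n)
    zcr = sum(1 for a, b in zip(waveform, waveform[1:]) if a * b < 0) / (n - 1)
    return (rms, zcr)

def nearest_centroid(train, labels, sample):
    """Classify a waveform by the nearest class centroid in feature space."""
    feats = [features(w) for w in train]
    centroids = {}
    for lab in set(labels):
        pts = [f for f, l in zip(feats, labels) if l == lab]
        centroids[lab] = tuple(sum(c) / len(pts) for c in zip(*pts))
    f = features(sample)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2 for a, b in zip(f, centroids[lab])))
```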

  10. A Methodology to Measure Synergy Among Energy-Efficiency Programs at the Program Participant Level

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.E.

    2003-11-14

    This paper presents a methodology designed to measure synergy among energy-efficiency programs at the program participant level (e.g., households, firms). Three different definitions of synergy are provided: strong, moderate, and weak. Data to measure synergy can be collected through simple survey questions. Straightforward mathematical techniques can be used to estimate the three types of synergy and explore relative synergistic impacts of different subsets of programs. Empirical research is needed to test the concepts and methods and to establish quantitative expectations about synergistic relationships among programs. The market for new energy-efficient motors is the context used to illustrate all the concepts and methods in this paper.
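Under the weakest, purely additive reading of synergy, the comparison is simply combined savings versus the sum of individual-program savings; the paper's survey-based strong/moderate/weak definitions are richer than this sketch, which only illustrates the arithmetic:

```python
def synergy_gain(effect_combined, effects_individual):
    """Extra participant-level effect (e.g. energy savings) from joint
    participation, beyond the sum of each program's individual effect.
    Positive values indicate synergy under this additive reading."""
    return effect_combined - sum(effects_individual)

def classify_synergy(effect_combined, effects_individual):
    """Label the relationship between a set of programs for one participant."""
    gain = synergy_gain(effect_combined, effects_individual)
    if gain > 0:
        return "synergistic"
    if gain == 0:
        return "additive"
    return "sub-additive"
```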

  11. Epithelial cell proliferative activity of Barrett's esophagus : methodology and correlation with traditional cancer risk markers

    NARCIS (Netherlands)

    Peters, FTM; Ganesh, S; Kuipers, EJ; De Jager-Krikken, A; Karrenbeld, A; Harms, Geert; Sluiter, WJ; Koudstaal, J; Klinkenberg-Knol, EC; Lamers, CBHW; Kleibeuker, JH

    Barrett's esophagus (BE) is a premalignant condition, due to chronic gastroesophageal reflux. Effective antireflux therapy may diminish cancer risk. To evaluate this option an intermediate marker is needed. We developed a methodology for measurement of epithelial cell proliferative activity of

  12. Current psychometric and methodological issues in the measurement of overgeneral autobiographical memory.

    Science.gov (United States)

    Griffith, James W; Sumner, Jennifer A; Raes, Filip; Barnhofer, Thorsten; Debeer, Elise; Hermans, Dirk

    2012-12-01

    Autobiographical memory is a multifaceted construct that is related to psychopathology and other difficulties in functioning. Across many studies, a variety of methods have been used to study autobiographical memory. The relationship between overgeneral autobiographical memory (OGM) and psychopathology has been of particular interest, and many studies of this cognitive phenomenon rely on the Autobiographical Memory Test (AMT) to assess it. In this paper, we examine several methodological approaches to studying autobiographical memory, and focus primarily on methodological and psychometric considerations in OGM research. We pay particular attention to what is known about the reliability, validity, and methodological variations of the AMT. The AMT has adequate psychometric properties, but there is great variability in methodology across studies that use it. Methodological recommendations and suggestions for future studies are presented. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Methodology for setup and data processing of mobile air quality measurements to assess the spatial variability of concentrations in urban environments

    International Nuclear Information System (INIS)

    Van Poppel, Martine; Peters, Jan; Bleux, Nico

    2013-01-01

    A case study is presented to illustrate a methodology for mobile monitoring in urban environments. A dataset of UFP, PM2.5 and BC concentrations was collected. We showed that repeated mobile measurements can give insight into the spatial variability of pollutants in different micro-environments of a city. Streets of contrasting traffic intensity showed concentrations increased by a factor of 2–3 for UFP and BC, and to a lesser extent for PM2.5. The first quartile (P25) of the mobile measurements in an urban background zone seems to be a good estimate of the urban background concentration. The local component of the pollutant concentrations was determined by background correction. The use of background correction reduced the number of runs needed to obtain representative results. The results presented are a first attempt to establish a methodology for the setup and data processing of mobile air quality measurements to assess the spatial variability of concentrations in urban environments. -- Highlights: ► Mobile measurements are used to assess the variability of air pollutants in urban environments. ► PM2.5, BC and UFP concentrations are presented for zones with different traffic characteristics. ► A methodology for background correction based on the mobile measurements is presented. ► The background concentration is estimated as the 25th percentile of the urban background data. ► The minimum number of runs for a representative estimate is reduced after background correction. -- This paper shows that the spatial variability of air pollutants in an urban environment can be assessed by a mobile monitoring methodology including background correction
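The background-correction step can be sketched as follows (the nearest-rank percentile and the zero-clipping are implementation choices of this sketch, not necessarily the paper's):

```python
def percentile(values, p):
    """Nearest-rank percentile of a list (no interpolation)."""
    s = sorted(values)
    k = max(0, min(len(s) - 1, int(round(p / 100.0 * (len(s) - 1)))))
    return s[k]

def local_component(run, background_run):
    """Background-correct one mobile run: estimate the urban background as
    the 25th percentile (P25) of the background-zone measurements and
    subtract it from each observation, clipping at zero."""
    bg = percentile(background_run, 25)
    return [max(0.0, x - bg) for x in run]
```

The corrected series isolates the traffic-related local contribution, which is why fewer repeated runs are needed for a representative spatial pattern.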

  14. Thermophysical Properties Measurement of High-Temperature Liquids Under Microgravity Conditions in Controlled Atmospheric Conditions

    Science.gov (United States)

    Watanabe, Masahito; Ozawa, Shumpei; Mizuno, Akotoshi; Hibiya, Taketoshi; Kawauchi, Hiroya; Murai, Kentaro; Takahashi, Suguru

    2012-01-01

    Microgravity conditions offer advantages for the measurement of the surface tension and viscosity of metallic liquids by the oscillating drop method with an electromagnetic levitation (EML) device. We are therefore preparing thermophysical property measurements using the Materials-Science Laboratory ElectroMagnetic-Levitator (MSL-EML) facilities on the International Space Station (ISS). Recently, it has been identified that the dependence of surface tension on oxygen partial pressure (Po2) must be considered for industrial application of surface tension values. An effect of Po2 on surface tension would also apparently change the viscosity derived from the damped-oscillation model. Therefore, surface tension and viscosity must be measured simultaneously under the same atmospheric conditions. Moreover, the effect of the electromagnetic force (EMF) on the surface oscillations must be clarified in order to obtain the ideal surface oscillation, because the EMF acts as an external force on the oscillating liquid droplets, and an excessive EMF makes the apparent viscosity values large. In our group, using the parabolic flight levitation experimental facilities (PFLEX), the effects of Po2 and of the external EMF on the surface oscillation of levitated liquid droplets were systematically investigated for precise measurements of the surface tension and viscosity of high-temperature liquids for future ISS experiments. We observed the surface oscillations of levitated liquid alloys using PFLEX during flight experiments on board a Gulfstream II (G-II) airplane operated by DAS. These observations were performed under controlled Po2 and under suitable EMF conditions. In these experiments, we obtained density, viscosity and surface tension values of liquid Cu. From these results, we discuss the consistency with previously reported data, as well as the differences in surface oscillations observed as the EMF conditions change.
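For the oscillating drop method itself, the standard Rayleigh and Lamb relations convert the measured l = 2 oscillation frequency and damping time into surface tension and viscosity. These hold for an ideal force-free spherical drop; the EML positioning field shifts and splits the frequencies in practice, which is exactly what the flight experiments probe:

```python
import math

def surface_tension_rayleigh(mass_kg, f2_hz):
    """Rayleigh relation for the l=2 surface oscillation of a levitated
    drop: sigma = 3*pi*m*f^2 / 8 (force-free sphere assumption)."""
    return 3.0 * math.pi * mass_kg * f2_hz ** 2 / 8.0

def viscosity_lamb(mass_kg, radius_m, tau_s):
    """Lamb relation: viscosity from the exponential damping time tau of
    the l=2 oscillation, eta = 3*m / (20*pi*R*tau)."""
    return 3.0 * mass_kg / (20.0 * math.pi * radius_m * tau_s)
```

An overly strong EMF adds apparent damping, which these formulas would misread as extra viscosity; hence the need to characterize the EMF conditions.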

  15. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist

    NARCIS (Netherlands)

    Terwee, C.B.; Mokkink, L.B.; Knol, D.L.; Ostelo, R.W.J.G.; Bouter, L.M.; de Vet, H.C.W.

    2012-01-01

    Background: The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a

  16. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  17. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching...... the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers.......ABSTRACT Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 1960s) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  18. Diesel ignition delay and lift-off length through different methodologies using a multi-hole injector

    International Nuclear Information System (INIS)

    Payri, Raúl; Salvador, F.J.; Manin, Julien; Viera, Alberto

    2016-01-01

    Highlights: • Lift-off length and ignition delay are measured through different methodologies. • Oxygen concentration, temperature and injection pressure sweeps are performed. • A multi-hole injector is compared with an equivalent single-hole injector. • The multi-hole injector has shorter ignition delay and lift-off length than the single-hole one. • Empirical correlations were calculated for an analytical description of the results. - Abstract: In this paper, lift-off length has been measured via both broadband luminosity and OH chemiluminescence. In addition, ignition delay has been measured via broadband chemiluminescence and Schlieren imaging. A 3-orifice injector from the Engine Combustion Network (ECN) set, referred to as Spray B, and a single-component fuel (n-dodecane) were used. Experiments were carried out in a constant-flow, constant-pressure facility that allowed engine-like thermodynamic conditions to be reproduced and enabled the study to be performed over a wide range of test conditions with a very high repetition rate. The data obtained were also compared with results from a single-orifice injector, also from the Engine Combustion Network, with orifice characteristics (90 μm outlet diameter and convergent shape) and technology analogous to those of the injector used. Results showed that there is good correlation between the ignition delay measured through both methodologies, and that oxygen concentration and injection pressure play a minor role in the ignition delay, ambient temperature and density being the parameters with the highest influence. Lift-off length measurements showed significant differences between methodologies. Minor deviation was observed between injectors with different nozzle geometry (seat inclination angle), due to temperature variations along the chamber, highlighting the importance of the temperature distribution along combustion vessels. Empirical correlations for lift-off length and ignition delay were calculated, underlining the effect of the conditions on
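An empirical ignition-delay correlation of the Arrhenius-power-law form common in the diesel spray literature can be fitted to such sweep data by linear least squares on the logarithm (the functional form and variable names below are a generic assumption, not the paper's published correlation):

```python
import numpy as np

def fit_id_correlation(rho, o2, T, id_meas):
    """Fit  ID = C * rho^a * [O2]^b * exp(Ta/T)  by least squares on
    ln(ID), where rho is ambient density, o2 the oxygen mole fraction,
    and T the ambient temperature."""
    A = np.column_stack([np.ones_like(T), np.log(rho), np.log(o2), 1.0 / T])
    coef, *_ = np.linalg.lstsq(A, np.log(id_meas), rcond=None)
    lnC, a, b, Ta = coef
    return np.exp(lnC), a, b, Ta

def predict_id(params, rho, o2, T):
    """Evaluate the fitted correlation at new conditions."""
    C, a, b, Ta = params
    return C * rho ** a * o2 ** b * np.exp(Ta / T)
```

Small fitted exponents on o2 and on injection-pressure-like terms, versus a large activation temperature Ta, would reproduce the reported dominance of ambient temperature and density.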

  19. Verification of dosimetric methodology for auditing radiotherapy quality under non-reference condition in Hubei province

    International Nuclear Information System (INIS)

    Ma Xinxing; Luo Suming; He Zhijian; Zhou Wenshan

    2014-01-01

    Objective: To verify the reliability of TLD-based quality audit for radiotherapy dosimetry of medical electron accelerators under non-reference conditions, by monitoring the dose variations of electron beams with different field sizes and a 45° wedge, and the dose variations of photon beams with different field sizes and source-skin distances. Methods: Both TLDs and finger ionization chambers were placed at a depth of 10 cm in water to measure the absorbed dose from photon beams, and at the depth of maximum dose for electron beams, under non-reference conditions. The TLDs were then mailed to the National Institute for Radiological Protection, China CDC, for further measurement. Results: Among the 70 measuring points for photon beams, 58 points showed results with a relative error of less than ±7.0% (IAEA's acceptable deviation: ±7.0%) between the TLD and finger ionization chamber measurements, and the percentage of qualified points was 82.8%. After correction by the Ps value, 62 points were qualified and the percentage rose to 88.6%. All of the measuring points for electron beams, 24 in total, presented a relative error within ±5.0% (IAEA's acceptable deviation: ±5.0%) between the TLD and finger cylindrical ionization chamber measurements. Conclusions: TLD-based quality audit is convenient for determining radiotherapy dosimetric parameters of electron beams under non-reference conditions and can improve the accuracy of the parameters measured with finger chambers. For electron beams of 5 MeV < E_0 < 10 MeV, the absorbed dose parameters measured by finger ionization chambers, combined with TLD audit, can give precise and reliable results. (authors)
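The pass/fail bookkeeping of such an audit can be sketched as follows (a minimal illustration of the relative-deviation criterion, with the chamber result taken as the reference):

```python
def audit(tld_doses, chamber_doses, tolerance=0.07):
    """Relative deviation of each TLD result from the ionization-chamber
    reference, and the percentage of points inside the acceptance band
    (e.g. +/-7% for photon beams, +/-5% for electron beams, as in the
    IAEA criteria quoted above)."""
    devs = [(t - c) / c for t, c in zip(tld_doses, chamber_doses)]
    qualified = sum(1 for d in devs if abs(d) <= tolerance)
    return devs, 100.0 * qualified / len(devs)
```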

  20. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    P. Arulmathi

    2015-01-01

    Full Text Available The distillery industry is recognized as one of the most polluting industries in India, with a large amount of annual effluent production. In the present study, the optimization of electrochemical treatment process variables is reported for treating the color and COD of distillery spent wash using Ti/Pt as an anode in batch mode. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operating variables, and chemical oxygen demand (COD) and color removal efficiency were considered as response variables for optimization using response surface methodology. The indirect electrochemical-oxidation process variables were optimized using a Box-Behnken response surface design (BBD). The results showed that the electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm2, electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L, respectively.

  1. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    International Nuclear Information System (INIS)

    Temple, S.M.; Robbins, T.R.

    1986-09-01

    This report provides an overview of the basis and methodology requirements for determining Pressurized Water Reactor (PWR) technical specification related setpoints, and focuses on development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970s to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. Descriptions of the typical setpoint methodologies are provided for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse. The description of the methods of analysis includes a discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties, based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR.

  2. Information about Musculoskeletal Conditions

    Science.gov (United States)


  3. Latest developments on safety analysis methodologies at the Juzbado plant

    International Nuclear Information System (INIS)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A.

    2010-01-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, namely the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, such as Regulator requirements, operational experience and, of course, the strong commitment of ENUSA to maintain the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, and within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and the investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated in 2009 to assess the actual operating conditions of all the systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points related to these three methodologies as well as an outline of the results obtained so far. (authors)

  4. Quadratic measurement and conditional state preparation in an optomechanical system

    DEFF Research Database (Denmark)

    A. Brawley, George; Vanner, Michael A.; Bowen, Warwick P.

    2014-01-01

    We experimentally demonstrate, for the first time, quadratic measurement of mechanical motion in an optomechanical system. We use this nonlinear measurement to conditionally prepare classical non-Gaussian states of motion of a micro-mechanical oscillator.

  5. Setting the light conditions for measuring root transparency for age-at-death estimation methods.

    Science.gov (United States)

    Adserias-Garriga, Joe; Nogué-Navarro, Laia; Zapico, Sara C; Ubelaker, Douglas H

    2018-03-01

    Age-at-death estimation is one of the main goals in forensic identification, being an essential parameter in determining the biological profile and narrowing the possibility of identification in cases involving missing persons and unidentified bodies. The study of dental tissues has long been considered a proper tool for age estimation, with several age estimation methods based on these tissues. Dental age estimation methods can be divided into three categories: tooth formation and development, post-formation changes, and histological changes. While tooth formation and growth changes are important for fetal and infant cases, once the end of dental and skeletal growth is reached, post-formation or biochemical changes can be applied. Lamendin et al. in J Forensic Sci 37:1373-1379, (1992) developed an adult age estimation method based on root transparency and periodontal recession. The regression formula demonstrated its accuracy for 40 to 70-year-old individuals. Later on, Prince and Ubelaker in J Forensic Sci 47(1):107-116, (2002) evaluated the effects of ancestry and sex and incorporated root height into the equation, developing four new regression formulas for males and females of African and European ancestry. Even though root transparency is a key element in the method, the conditions for measuring this element have not been established. The aim of the present study is to determine the light conditions, measured in lumens, that offer greater accuracy when applying the Lamendin et al. method as modified by Prince and Ubelaker. The results must also be taken into account in the application of other age estimation methodologies that use root transparency to estimate age-at-death.

  6. Methodology for the analysis of pollutant emissions from a city bus

    International Nuclear Information System (INIS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-01-01

    In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As the test circuit, a passenger transportation line in a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determine the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel–air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least. (paper)
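
    The analysis by categories described above can be sketched as follows; the idle threshold, velocities and emission values below are invented for illustration and are not from the paper:

```python
import numpy as np

# Hypothetical sketch: classify 1 Hz velocity samples into the four sequence
# categories used in the paper, then total an emission signal per category.
def classify(v, dv, fuel, v_idle=0.5):
    if v < v_idle:
        return "idle"
    if dv > 0:
        return "acceleration"
    return "deceleration (fuel)" if fuel else "deceleration (no fuel)"

velocity = np.array([0.0, 0.0, 2.0, 5.0, 9.0, 8.0, 6.0, 3.0, 0.0])  # m/s, 1 Hz
fuel_on  = np.array([1, 1, 1, 1, 1, 1, 0, 0, 1], dtype=bool)
nox      = np.array([0.1, 0.1, 0.8, 1.2, 1.5, 0.4, 0.2, 0.2, 0.1])  # g/s, invented

dv = np.diff(velocity, prepend=velocity[0])
cats = [classify(v, d, f) for v, d, f in zip(velocity, dv, fuel_on)]

totals = {}
for c, e in zip(cats, nox):
    totals[c] = totals.get(c, 0.0) + e
print(totals)  # in this toy trace, acceleration dominates the total
```

    Mean values per category (rather than totals) follow the same grouping; the point is that each 1 Hz sample is assigned to exactly one sequence type before aggregation.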

  7. Methodology for the analysis of pollutant emissions from a city bus

    Science.gov (United States)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As the test circuit, a passenger transportation line in a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determine the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least.

  8. Methodology of heat transfer and flow resistance measurement for matrices of rotating regenerative heat exchangers

    Directory of Open Access Journals (Sweden)

    Butrymowicz Dariusz

    2016-09-01

    Full Text Available The theoretical basis for the indirect measurement approach of the mean heat transfer coefficient for a packed bed, based on the modified single blow technique, is presented and discussed in the paper. The methodology of this measurement approach, dedicated to the matrix of a rotating regenerative gas heater, is discussed in detail. The testing stand, consisting of a dedicated experimental tunnel with auxiliary equipment and a measurement system, is presented. Selected experimental results are presented and discussed for selected types of matrices of regenerative air preheaters over a wide range of gas Reynolds numbers. Agreement between the theoretically predicted and measured temperature profiles was demonstrated. Exemplary dimensionless relationships between the Colburn heat transfer factor, the Darcy flow resistance factor and the Reynolds number are presented for the investigated matrices of the regenerative gas heater.
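
    For orientation, the dimensionless groups correlated in the paper follow their standard definitions; a small sketch with purely illustrative input values (the formulas, not the numbers, are the point):

```python
# Hedged sketch of the standard definitions behind the paper's dimensionless
# correlations (all numerical values below are illustrative, not from the study).

def colburn_j(nu, re, pr):
    """Colburn factor j = Nu / (Re * Pr**(1/3))."""
    return nu / (re * pr ** (1.0 / 3.0))

def darcy_f(dp, d_h, length, rho, v):
    """Darcy friction factor from a measured pressure drop dp [Pa]
    over a bed of length [m] with hydraulic diameter d_h [m]."""
    return dp * d_h / (length * 0.5 * rho * v ** 2)

j = colburn_j(nu=50.0, re=8000.0, pr=0.7)
f = darcy_f(dp=120.0, d_h=0.004, length=0.05, rho=1.1, v=6.0)
print(round(j, 5), round(f, 4))  # → 0.00704 0.4848
```

    Plotting j and f against Re for each matrix type gives exactly the kind of dimensionless relationship the abstract describes.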

  9. Development of plant condition measurement - The Jimah Model

    Science.gov (United States)

    Evans, Roy F.; Syuhaimi, Mohd; Mazli, Mohammad; Kamarudin, Nurliyana; Maniza Othman, Faiz

    2012-05-01

    The Jimah Model is an information management model. The model has been designed to facilitate analysis of machine condition by integrating diagnostic data with quantitative and qualitative information. The model treats data as a single strand of information - metaphorically a 'genome' of data. The 'Genome' is structured to be representative of plant function and identifies the condition of selected components (or genes) in each machine. To date in industry, computer-aided work processes used with traditional industrial practices have been unable to consistently deliver a standard of information suitable for holistic evaluation of machine condition and change. Significantly, the reengineered site strategies necessary for implementation of this "data genome concept" have resulted in enhanced knowledge and management of plant condition. In large plants with high initial equipment cost and subsequent high maintenance costs, accurate measurement of major component condition becomes central to whole-of-life management and replacement decisions. A case study following implementation of the model at a major power station site in Malaysia (Jimah) shows that modeling of plant condition and wear (in real time) can be made a practical reality.

  10. Development of plant condition measurement - The Jimah Model

    International Nuclear Information System (INIS)

    Evans, Roy F; Syuhaimi, Mohd; Mazli, Mohammad; Kamarudin, Nurliyana; Othman, Faiz Maniza

    2012-01-01

    The Jimah Model is an information management model. The model has been designed to facilitate analysis of machine condition by integrating diagnostic data with quantitative and qualitative information. The model treats data as a single strand of information - metaphorically a 'genome' of data. The 'Genome' is structured to be representative of plant function and identifies the condition of selected components (or genes) in each machine. To date in industry, computer-aided work processes used with traditional industrial practices have been unable to consistently deliver a standard of information suitable for holistic evaluation of machine condition and change. Significantly, the reengineered site strategies necessary for implementation of this 'data genome concept' have resulted in enhanced knowledge and management of plant condition. In large plants with high initial equipment cost and subsequent high maintenance costs, accurate measurement of major component condition becomes central to whole-of-life management and replacement decisions. A case study following implementation of the model at a major power station site in Malaysia (Jimah) shows that modeling of plant condition and wear (in real time) can be made a practical reality.

  11. Adaptability of laser diffraction measurement technique in soil physics methodology

    Science.gov (United States)

    Barna, Gyöngyi; Szabó, József; Rajkai, Kálmán; Bakacsi, Zsófia; Koós, Sándor; László, Péter; Hauk, Gabriella; Makó, András

    2016-04-01

    There are intentions all around the world to harmonize soils' particle size distribution (PSD) data obtained by laser diffractometer measurements (LDM) with those of the sedimentation techniques (pipette or hydrometer methods). Unfortunately, depending on the applied methodology (e.g. type of pre-treatment, kind of dispersant, etc.), PSDs from the sedimentation methods (due to different standards) are dissimilar and can hardly be harmonized with each other. A need therefore arose to build a database containing PSD values measured by the pipette method according to the Hungarian standard (MSZ-08. 0205: 1978) and by LDM according to a widespread and widely used procedure. In this publication the first results of the statistical analysis of the new and growing PSD database are presented: 204 soil samples measured with the pipette method and LDM (Malvern Mastersizer 2000, HydroG dispersion unit) were compared. Applying the usual size limits in the LDM, the clay fraction was highly underestimated and the silt fraction overestimated compared to the pipette method. Consequently, soil texture classes determined from the LDM measurements differ significantly from the results of the pipette method. Based on previous surveys, and in order to optimize the agreement between the two datasets, the clay/silt boundary for the LDM was changed. Comparing the PSD results of the pipette method with those of the LDM, the modified size limits gave higher similarities for the clay and silt fractions. Extending the upper size limit of the clay fraction from 0.002 to 0.0066 mm (and thus changing the lower size limit of the silt fraction accordingly) makes the pipette method and LDM more easily comparable. Higher correlations were also found between clay content and water vapor adsorption and specific surface area in the case of the modified limit. Texture classes were also found to be less dissimilar. The difference between the results of the two kinds of PSD measurement methods could be further reduced by taking other routinely analyzed soil parameters into account.
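
    The effect of moving the clay/silt boundary can be illustrated on a cumulative PSD curve; the curve below is invented, and only the 0.002 mm and 0.0066 mm limits come from the text:

```python
import numpy as np

# Illustrative sketch: compute clay/silt/sand mass fractions from a cumulative
# "% finer than" PSD curve, comparing the conventional 0.002 mm clay limit
# with the 0.0066 mm limit discussed in the study (PSD values are invented).
sizes = np.array([0.0005, 0.001, 0.002, 0.0066, 0.02, 0.05, 2.0])   # mm
cum   = np.array([5.0,    9.0,   14.0,  26.0,   55.0, 78.0, 100.0])  # % finer

def fractions(clay_limit, silt_limit=0.05):
    clay = float(np.interp(clay_limit, sizes, cum))
    silt = float(np.interp(silt_limit, sizes, cum)) - clay
    sand = 100.0 - clay - silt
    return clay, silt, sand

print(fractions(0.002))   # → (14.0, 64.0, 22.0)  conventional limit
print(fractions(0.0066))  # → (26.0, 52.0, 22.0)  modified limit
```

    Shifting the boundary moves mass from the silt fraction into the clay fraction while leaving the sand fraction unchanged, which is the direction of correction the study reports for LDM data.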

  12. Measurements of mixtures with carbon dioxide under supercritical conditions using commercial high pressure equipment

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, Luciana L.P.R. de; Rutledge, Luis Augusto Medeiros; Moreno, Eesteban L.; Hovell, Ian; Rajagopal, Krishnaswamy [Universidade Federal do Rio de Janeiro (LATCA-EQ-UFRJ), RJ (Brazil). Escola de Quimica. Lab. de Termodinamica e Cinetica Aplicada

    2012-07-01

    There is a growing interest in studying physical properties of binary and multicomponent fluid mixtures with supercritical carbon dioxide (CO{sub 2}) over an extended range of temperature and pressure. The estimation of properties such as density, viscosity, saturation pressure, compressibility, solubility and surface tension of mixtures is important in design, operation and control as well as optimization of chemical processes especially in extractions, separations, catalytic and enzymatic reactions. The phase behaviour of binary and multicomponent mixtures with supercritical CO{sub 2} is also important in the production and refining of petroleum where mixtures of paraffin, naphthene and aromatics with supercritical fluids are often encountered. Petroleum fluids can present a complex phase behaviour in the presence of CO{sub 2}, where two-phase (VLE and LLE) and three phase regions (VLLE) might occur within ranges of supercritical conditions of temperature and pressure. The objective of this study is to develop an experimental methodology for measuring the phase behaviour of mixtures containing CO{sub 2} in supercritical regions, using commercial high-pressure equipment. (author)

  13. Preliminary comparative assessment of PM10 hourly measurement results from new monitoring stations type using stochastic and exploratory methodology and models

    Science.gov (United States)

    Czechowski, Piotr Oskar; Owczarek, Tomasz; Badyda, Artur; Majewski, Grzegorz; Rogulski, Mariusz; Ogrodnik, Paweł

    2018-01-01

    The paper presents key issues from the preliminary stage of a proposed extended equivalence assessment for new portable devices: the comparability of hourly PM10 concentration series with reference station measurements using statistical methods. Technical aspects of the new portable meters are presented. Emphasis was placed on comparing the results using a stochastic and exploratory modeling methodology. The concept is based on the observation that a simple comparison of result series in the time domain is insufficient: the regularity of the series should be compared in three complementary fields of statistical modeling: time, frequency, and space. The proposal is based on modeling results for five annual measurement series from the new mobile devices and from a WIOS (Provincial Environmental Protection Inspectorate) reference station located in the city of Nowy Sacz. The obtained results indicate both the completeness of the comparison methodology and the high correspondence of the new devices' measurements with the reference station.

  14. Standardization of test conditions for gamma camera performance measurement

    International Nuclear Information System (INIS)

    Jordan, K.

    1980-01-01

    The usual way of measuring gamma camera performance is to use point sources or flood sources in air, often in combination with bar phantoms. This method mostly yields the best performance parameters for cameras, but it has nothing in common with the use of a camera in clinical practice. Particularly in the case of low-energy emitters like Tc-99m, the influence of scattered radiation on the performance of cameras is very high. It is therefore important to have test conditions for radionuclide imaging devices that approach as closely as practicable the measuring conditions in clinical applications. It is therefore good news that the International Electrotechnical Commission (IEC) has prepared a draft 'Characteristics and test conditions of radionuclide imaging devices', which is now submitted to the national committees for formal approval under the Six Months' Rule. Some essential points of this document are discussed in the paper. (orig.) [de

  15. Validation of the PROMIS® measures of self-efficacy for managing chronic conditions.

    Science.gov (United States)

    Gruber-Baldini, Ann L; Velozo, Craig; Romero, Sergio; Shulman, Lisa M

    2017-07-01

    The Patient-Reported Outcomes Measurement Information System® (PROMIS®) was designed to develop, validate, and standardize item banks to measure key domains of physical, mental, and social health in chronic conditions. This paper reports the calibration and validation testing of the PROMIS Self-Efficacy for Managing Chronic Conditions measures. PROMIS Self-Efficacy for Managing Chronic Conditions item banks comprise five domains, Self-Efficacy for Managing: Daily Activities, Symptoms, Medications and Treatments, Emotions, and Social Interactions. Banks were calibrated in 1087 subjects from two data sources: 837 patients with chronic neurologic conditions (epilepsy, multiple sclerosis, neuropathy, Parkinson disease, and stroke) and 250 subjects from an online Internet sample of adults with general chronic conditions. Scores were compared with one legacy scale, the Self-Efficacy for Managing Chronic Disease 6-Item scale (SEMCD6), and five PROMIS short forms: Global Health (Physical and Mental), Physical Function, Fatigue, Depression, and Anxiety. The sample was 57% female, mean age = 53.8 (SD = 14.7), 76% white, 21% African American, 6% Hispanic, and 76% with greater than high school education. Full-item banks were created for each domain. All measures had good internal consistency and correlated well with SEMCD6 (r = 0.56-0.75). Significant correlations were seen between the Self-Efficacy measures and other PROMIS short forms (r > 0.38). The newly developed PROMIS Self-Efficacy for Managing Chronic Conditions measures include five domains of self-efficacy that were calibrated across diverse chronic conditions and show good internal consistency and cross-sectional validity.
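
    The internal consistency reported above is typically quantified with Cronbach's alpha; a minimal sketch on simulated item responses (not PROMIS data):

```python
import numpy as np

# Illustrative sketch: Cronbach's alpha, a common internal-consistency index,
# computed on simulated responses to six correlated items.
def cronbach_alpha(items):
    """items: (n_subjects, k_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(1)
trait = rng.normal(size=200)                          # latent trait per subject
items = trait[:, None] + 0.5 * rng.normal(size=(200, 6))  # 6 noisy indicators
print(round(cronbach_alpha(items), 2))
```

    With six items loading on one latent trait and modest noise, alpha lands around 0.95; item-response-theory calibration (as used for PROMIS banks) goes further, but alpha remains the usual first check.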

  16. A general centroid determination methodology, with application to multilayer dielectric structures and thermally stimulated current measurements

    International Nuclear Information System (INIS)

    Miller, S.L.; Fleetwood, D.M.; McWhorter, P.J.; Reber, R.A. Jr.; Murray, J.R.

    1993-01-01

    A general methodology is developed to experimentally characterize the spatial distribution of occupied traps in dielectric films on a semiconductor. The effects of parasitics such as leakage, charge transport through more than one interface, and interface trap charge are quantitatively addressed. Charge transport with contributions from multiple charge species is rigorously treated. The methodology is independent of the charge transport mechanism(s), and is directly applicable to multilayer dielectric structures. The centroid capacitance, rather than the centroid itself, is introduced as the fundamental quantity that permits the generic analysis of multilayer structures. In particular, the form of many equations describing stacked dielectric structures becomes independent of the number of layers comprising the stack if they are expressed in terms of the centroid capacitance and/or the flatband voltage. The experimental methodology is illustrated with an application using thermally stimulated current (TSC) measurements. The centroid of changes (via thermal emission) in the amount of trapped charge was determined for two different samples of a triple-layer dielectric structure. A direct consequence of the TSC analyses is the rigorous proof that changes in interface trap charge can contribute, though typically not significantly, to thermally stimulated current

  17. A gamma heating calculation methodology for research reactor application

    International Nuclear Information System (INIS)

    Lee, Y.K.; David, J.C.; Carcreff, H.

    2001-01-01

    Gamma heating is an important issue in research reactor operation and fuel safety. Heat deposition in irradiation targets and the temperature distribution in the irradiation facility should be determined so as to obtain optimal irradiation conditions. This paper presents a recently developed gamma heating calculation methodology and its application to research reactors. Based on the TRIPOLI-4 Monte Carlo code under the continuous-energy option, this new calculation methodology was validated against calorimetric measurements realized within a large ex-core irradiation facility of the 70 MWth OSIRIS materials testing reactor (MTR). The contributions from prompt fission neutrons, prompt fission γ-rays, capture γ-rays and inelastic γ-rays to heat deposition were evaluated by a coupled (n, γ) transport calculation. The fission product decay γ-rays were also considered, but the activation γ-rays were neglected in this study. (author)

  18. Conditional Standard Errors of Measurement for Scale Scores.

    Science.gov (United States)

    Kolen, Michael J.; And Others

    1992-01-01

    A procedure is described for estimating the reliability and conditional standard errors of measurement of scale scores incorporating the discrete transformation of raw scores to scale scores. The method is illustrated using a strong true score model, and practical applications are described. (SLD)

  19. LDA measurements under plasma conditions

    International Nuclear Information System (INIS)

    Lesinski, J.; Mizera-Lesinska, B.; Fanton, J.C.; Boulos, M.I.

    1979-01-01

    A study was made of the application of Laser Doppler Anemometry (LDA) for the measurement of fluid and particle velocities under plasma conditions. The flow configuration is that of a dc plasma jet, called the principal jet, in which an alumina powder with a mean particle diameter of 115 μm and a standard deviation of 11.3 μm was injected using a secondary jet. The plasma jet emerged from a 7.1 mm ID nozzle, while that of the secondary jet was 2 mm in diameter. The secondary jet was introduced at the nozzle level of the plasma jet, directed at 90° to its axis. Details of the nozzle and the gas flow system are shown in Figure 2.

  20. A methodology for supporting decisions on the establishment of protective measures after severe nuclear accidents

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Kollas, J.G.

    1994-06-01

    The objective of this report is to demonstrate the use of a methodology supporting decisions on protective measures following severe nuclear accidents. A multicriteria decision analysis approach is recommended, where value tradeoffs are postponed until the very last stage of the decision process. Use of efficient frontiers is made to exclude all technically inferior solutions and present the decision maker with all nondominated solutions. A choice among these solutions implies a value trade-off among the multiple criteria. An interactive computer package has been developed where the decision maker can choose a point on the efficient frontier in the consequence space and immediately see the alternative in the decision space resulting in the chosen consequences. The methodology is demonstrated through an application on the choice among possible protective measures in contaminated areas of the former USSR after the Chernobyl accident. Two distinct cases are considered: first, a decision is to be made only on the basis of the level of soil contamination with Cs-137 and the total cost of the chosen protective policy; second, the decision is based on the geographic dimension of the contamination and the total cost. Three alternative countermeasure actions are considered for population segments living on soil contaminated at a certain level or in a specific geographic region: (a) relocation of the population; (b) improvement of the living conditions; and (c) no countermeasures at all. This is the final deliverable of the CEC-CIS Joint Study Project 2, Task 5: Decision-Aiding-System for Establishing Intervention Levels, performed under Contracts COSU-CT91-0007 and COSU-CT92-0021 with the Commission of the European Communities through CEPN
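
    The efficient-frontier idea (excluding technically inferior, i.e. dominated, solutions) can be sketched as follows, with entirely hypothetical dose/cost values for the three countermeasure types:

```python
import numpy as np

# Sketch of the "efficient frontier" filter: keep only nondominated
# alternatives when minimizing both residual dose and total cost.
# All numbers are hypothetical, for illustration only.
def nondominated(points):
    """points: (n, 2) array of (dose, cost), both to be minimized.
    Returns indices of alternatives not dominated by any other."""
    keep = []
    for i, p in enumerate(points):
        dominated = any(
            (q <= p).all() and (q < p).any()
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

alts = np.array([
    [10.0, 100.0],  # relocation: low residual dose, high cost
    [40.0,  30.0],  # improved living conditions
    [80.0,   0.0],  # no countermeasures
    [50.0,  60.0],  # dominated by "improved living conditions"
])
print(nondominated(alts))  # → [0, 1, 2]
```

    Only the surviving alternatives are shown to the decision maker; choosing among them is where the value trade-off between dose and cost finally enters.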

  1. Three-dimensional RAMA fluence methodology benchmarking

    International Nuclear Information System (INIS)

    Baker, S. P.; Carter, R. G.; Watkins, K. E.; Jones, D. B.

    2004-01-01

    This paper describes the benchmarking of the RAMA Fluence Methodology software, that has been performed in accordance with U. S. Nuclear Regulatory Commission Regulatory Guide 1.190. The RAMA Fluence Methodology has been developed by TransWare Enterprises Inc. through funding provided by the Electric Power Research Inst., Inc. (EPRI) and the Boiling Water Reactor Vessel and Internals Project (BWRVIP). The purpose of the software is to provide an accurate method for calculating neutron fluence in BWR pressure vessels and internal components. The Methodology incorporates a three-dimensional deterministic transport solution with flexible arbitrary geometry representation of reactor system components, previously available only with Monte Carlo solution techniques. Benchmarking was performed on measurements obtained from three standard benchmark problems which include the Pool Criticality Assembly (PCA), VENUS-3, and H. B. Robinson Unit 2 benchmarks, and on flux wire measurements obtained from two BWR nuclear plants. The calculated to measured (C/M) ratios range from 0.93 to 1.04 demonstrating the accuracy of the RAMA Fluence Methodology in predicting neutron flux, fluence, and dosimetry activation. (authors)

  2. Prediction of work metabolism from heart rate measurements in forest work: some practical methodological issues.

    Science.gov (United States)

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Auger, Isabelle; Leone, Mario

    2015-01-01

    Individual heart rate (HR) to workload relationships were determined using 93 submaximal step-tests administered to 26 healthy participants attending physical activities in a university training centre (laboratory study) and 41 experienced forest workers (field study). Predicted maximum aerobic capacity (MAC) was compared to measured MAC from a maximal treadmill test (laboratory study) to test the effect of two age-predicted maximum HR equations (220 - age and 207 - 0.7 × age) and two clothing insulation levels (0.4 and 0.91 clo) during the step-test. Work metabolism (WM) estimated from forest work HR was compared against concurrent work V̇O2 measurements while taking into account the HR thermal component. Results show that MAC and WM can be accurately predicted from work HR measurements and the simple regression models developed in this study (1% group mean prediction bias and up to 25% expected prediction bias for a single individual). Clothing insulation had no impact on predicted MAC or on the age-predicted maximum HR equations. Practitioner summary: This study sheds light on four practical methodological issues faced by practitioners regarding the use of HR methodology to assess WM in actual work environments. More specifically, the effects of wearing work clothes and of using two different maximum HR prediction equations on the ability of a submaximal step-test to assess MAC are examined, as well as the accuracy of using an individual's step-test HR-to-workload relationship to predict WM from HR data collected during actual work in the presence of thermal stress.
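
    The individual HR-to-workload calibration idea can be sketched as a simple linear regression; all numbers below are invented for illustration and do not come from the study:

```python
import numpy as np

# Minimal sketch of individual calibration: fit a linear HR-to-VO2 relationship
# from a submaximal step-test, then predict work metabolism from field HR.
step_hr  = np.array([85.0, 100.0, 115.0, 130.0])   # beats/min at graded loads
step_vo2 = np.array([0.9,  1.4,   1.9,   2.4])     # L/min measured at same loads

slope, intercept = np.polyfit(step_hr, step_vo2, 1)  # least-squares line

field_hr = np.array([110.0, 125.0, 140.0])           # HR logged during work
predicted_vo2 = slope * field_hr + intercept
print(predicted_vo2.round(2))  # → [1.73 2.23 2.73]
```

    In practice the field HR must first be corrected for its thermal component, as the abstract notes; otherwise heat-driven HR elevation inflates the predicted work metabolism.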

  3. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Full Text Available methodologies and models are reviewed. Three exposure/measurement methodologies are assessed. Estimation methods focus on source evaluation and attribution, sources include those outdoors and indoors as well as in occupational and in-transit environments. Fate...

  4. Optimization of fermentation conditions for 1,3-propanediol production by marine Klebsiella pneumoniae HSL4 using response surface methodology

    Science.gov (United States)

    Li, Lili; Zhou, Sheng; Ji, Huasong; Gao, Ren; Qin, Qiwei

    2014-09-01

    The industrially important organic compound 1,3-propanediol (1,3-PDO) is mainly used as a building block for the production of various polymers. In the present study, a response surface methodology protocol was followed to determine and optimize the fermentation conditions for maximum production of 1,3-PDO using the marine-derived Klebsiella pneumoniae HSL4. Four nutritional supplements together with three independent culture conditions were optimized as follows: 29.3 g/L glycerol, 8.0 g/L K2HPO4, 7.6 g/L (NH4)2SO4, 3.0 g/L KH2PO4, pH 7.1, cultivation at 35°C for 12 h. Under the optimal conditions, a maximum 1,3-PDO concentration of 14.5 g/L, a productivity of 1.21 g/(L·h) and a glycerol conversion of 0.49 g/g were obtained. In comparison with the control conditions, fermentation under the optimized conditions achieved an increase of 38.8% in 1,3-PDO concentration, 39.0% in productivity and 25.7% in glycerol conversion in flasks. This enhancement trend was further confirmed when the fermentation was conducted in a 5-L fermentor. The optimized fermentation conditions could be an important basis for developing low-cost, large-scale methods for the industrial production of 1,3-PDO in the future.
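
    The reported productivity and conversion figures can be checked directly from the stated concentrations and time, interpreting conversion as grams of 1,3-PDO per gram of glycerol supplied:

```python
# Consistency check of the abstract's figures: 14.5 g/L of 1,3-PDO in 12 h
# from a medium containing 29.3 g/L glycerol.
final_pdo = 14.5   # g/L, final 1,3-PDO concentration
time_h = 12.0      # h, cultivation time
glycerol = 29.3    # g/L, glycerol supplied

productivity = final_pdo / time_h   # g/(L*h)
conversion = final_pdo / glycerol   # g 1,3-PDO per g glycerol supplied

print(round(productivity, 2), round(conversion, 2))  # → 1.21 0.49
```

    Both values match the abstract, which suggests the 0.49 g/g conversion is computed on glycerol fed rather than glycerol actually consumed.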

  5. Comparing Classic and Interval Analytical Hierarchy Process Methodologies for Measuring Area-Level Deprivation to Analyze Health Inequalities.

    Science.gov (United States)

    Cabrera-Barona, Pablo; Ghorbanzadeh, Omid

    2018-01-16

    Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas.
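
    The AHP weighting step referred to above is commonly computed from a pairwise comparison matrix via its principal eigenvector; a minimal sketch with an invented 3-indicator matrix:

```python
import numpy as np

# Sketch of classic AHP weighting: derive indicator weights from a pairwise
# comparison matrix via its principal eigenvector. The matrix below is
# invented for illustration (A[i, j] = relative importance of i over j).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                       # normalize to sum to 1
print(weights.round(3))
```

    Interval AHP, as used in the study's alternative index, replaces the point judgments in A with interval judgments; the point-estimate eigenvector above is the baseline it is compared against.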

  6. Concurrent measurement of "real-world" stress and arousal in individuals with psychosis: assessing the feasibility and validity of a novel methodology.

    Science.gov (United States)

    Kimhy, David; Delespaul, Philippe; Ahn, Hongshik; Cai, Shengnan; Shikhman, Marina; Lieberman, Jeffrey A; Malaspina, Dolores; Sloan, Richard P

    2010-11-01

    Psychosis has been repeatedly suggested to be affected by increases in stress and arousal. However, there is a dearth of evidence supporting the temporal link between stress, arousal, and psychosis during "real-world" functioning. This paucity of evidence may stem from limitations of current research methodologies. Our aim is to test the feasibility and validity of a novel methodology designed to measure concurrent stress and arousal in individuals with psychosis during "real-world" daily functioning. Twenty patients with psychosis completed a 36-hour ambulatory assessment of stress and arousal. We used the experience sampling method with palm computers to assess stress (10 times per day, 10 AM to 10 PM) along with concurrent ambulatory measurement of cardiac autonomic regulation using a Holter monitor. The clocks of the palm computer and Holter monitor were synchronized, allowing the temporal linking of the stress and arousal data. We used power spectral analysis to determine the parasympathetic contributions to autonomic regulation and sympathovagal balance during the 5 minutes before and after each experience sample. Patients completed 79% of the experience samples (75% with valid concurrent arousal data). Momentary increases in stress were inversely correlated with concurrent parasympathetic activity (ρ = -.27). The results support the feasibility and validity of our methodology in individuals with psychosis. The methodology offers a novel way to study, in high time resolution, the concurrent "real-world" interactions between stress, arousal, and psychosis. The authors discuss the methodology's potential applications and future research directions.
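
The power-spectral step described in the abstract can be illustrated on synthetic data. The sketch below (not patient data) resamples an evenly spaced RR-interval series and estimates high-frequency (HF, 0.15-0.4 Hz) power over a 5-minute window, the band conventionally used as a marker of parasympathetic activity.

```python
import numpy as np
from scipy.signal import welch

fs = 4.0                                  # Hz, even resampling rate
t = np.arange(0, 300, 1/fs)               # 5-minute analysis window
# Synthetic RR tachogram (s): 0.8 s mean plus a 0.25 Hz respiratory rhythm
rr = 0.8 + 0.05*np.sin(2*np.pi*0.25*t)

f, pxx = welch(rr - rr.mean(), fs=fs, nperseg=256)
hf_band = (f >= 0.15) & (f <= 0.40)
hf_power = pxx[hf_band].sum() * (f[1] - f[0])   # integrate PSD over HF band
print("HF power (s^2):", hf_power)
```

With real Holter data, the RR series must first be interpolated onto the even time grid before the Welch estimate; the HF power from each 5-minute window is then paired with the concurrent experience sample.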

  7. A new method to assess Pavlovian conditioning of psychostimulant drug effects.

    Science.gov (United States)

    Damianopoulos, E N; Carey, R J

    1994-07-01

    Experimental studies of psychoactive drugs by Pavlovian drug-conditioning methods, which originally began with investigations of drug-induced responses mediated by the autonomic nervous system, have now been expanded to include drug-induced response effects expressed as modulations of spontaneous motoric behaviors. In the latter application, however, equivalent behavioral response outcomes in post-treatment tests for conditioning can occur following a psychostimulant drug treatment through drug interference effects on habituation processes, drug-induced stress effects, and/or Pavlovian conditioning of the drug-induced motoric activation effect. Current methodologies for the study of Pavlovian conditioned drug effects and/or drug sensitization cannot distinguish among these possibilities. This methodological inadequacy was addressed by a modification of the conventional paired-unpaired treatment protocol. In the new protocol, the animal is sequentially placed into two test compartments, with the drug treatment administered in conjunction with placement into the second compartment. This design permits differentiation of a Pavlovian conditioned drug response from non-conditioned drug effects through continuous measurement of the non-drug behavioral baseline in both the drug and non-drug control treatment groups, combined with multiple response measurements and post-treatment tests for conditioning at variable post-conditioning intervals. The present study details the use of the new modified Pavlovian protocol with repeated cocaine (10 mg/kg) treatment. A cocaine conditioned response at 1, 7, and 21 days post-conditioning was identified and distinguished from habituation and stress effects.

  8. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej; Pereší ni, Peter; Kostić, Dejan; Canini, Marco

    2018-01-01

    and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a

  9. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    textabstractWe survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  10. Measuring Effectiveness in Digital Game-Based Learning: A Methodological Review.

    Directory of Open Access Journals (Sweden)

    Anissa All

    2014-06-01

    Full Text Available In recent years, a growing number of studies have been conducted into the effectiveness of digital game-based learning (DGBL). Despite this growing interest, there is a lack of sound empirical evidence on the effectiveness of DGBL due to different outcome measures for assessing effectiveness, varying methods of data collection, and inconclusive or difficult-to-interpret results. This has resulted in a need for an overarching methodology for assessing the effectiveness of DGBL. The present study took a first step in this direction by mapping current methods used for assessing the effectiveness of DGBL. Results showed that, currently, comparison of results across studies, and thus looking at the effectiveness of DGBL on a more general level, is problematic due to diversity in study designs and suboptimal study designs. Variety in study design relates to three issues, namely different activities implemented in the control groups, different measures for assessing the effectiveness of DGBL, and the use of different statistical techniques for analyzing learning outcomes. Suboptimal study designs are the result of variables confounding study results. Possible confounds brought forward in this review are elements that are added to the game as part of the educational intervention (e.g., required reading or a debriefing session), instructor influences, and practice effects when using the same test pre- and post-intervention. Lastly, incomplete information on study designs impedes replication of studies and thus falsification of study results.

  11. Air-water flow measurement for ERVC conditions by LIF/PIV

    International Nuclear Information System (INIS)

    Yoon, Jong Woong; Jeong, Yong Hoon

    2016-01-01

    Critical heat flux (CHF) on the external reactor vessel wall is a safety limit that indicates the integrity of the reactor vessel during an in-vessel retention (IVR) situation. Many studies have conducted CHF experiments under IVR-ERVC (external reactor vessel cooling) conditions. However, the flow velocity field, an important factor in the CHF mechanism, has not been studied sufficiently for IVR-ERVC situations. In this study, flow measurements, including the velocity vector field and the liquid velocity under IVR-ERVC conditions, were performed. An air-water two-phase flow loop simulating IVR-ERVC conditions was set up, and the liquid velocity field was measured by the LIF/PIV technique. The experiment was conducted with and without air injection. In the air-water flow experiment, the liquid velocity outside the two-phase boundary layer became higher and the two-phase boundary layer became thinner as the mass flux increased. The velocity data obtained in this study are expected to improve CHF correlations for IVR-ERVC situations.

  12. Air-water flow measurement for ERVC conditions by LIF/PIV

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Jong Woong; Jeong, Yong Hoon [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    Critical heat flux (CHF) on the external reactor vessel wall is a safety limit that indicates the integrity of the reactor vessel during an in-vessel retention (IVR) situation. Many studies have conducted CHF experiments under IVR-ERVC (external reactor vessel cooling) conditions. However, the flow velocity field, an important factor in the CHF mechanism, has not been studied sufficiently for IVR-ERVC situations. In this study, flow measurements, including the velocity vector field and the liquid velocity under IVR-ERVC conditions, were performed. An air-water two-phase flow loop simulating IVR-ERVC conditions was set up, and the liquid velocity field was measured by the LIF/PIV technique. The experiment was conducted with and without air injection. In the air-water flow experiment, the liquid velocity outside the two-phase boundary layer became higher and the two-phase boundary layer became thinner as the mass flux increased. The velocity data obtained in this study are expected to improve CHF correlations for IVR-ERVC situations.

  13. Speciated arsenic in air: measurement methodology and risk assessment considerations.

    Science.gov (United States)

    Lewis, Ari S; Reid, Kim R; Pollock, Margaret C; Campleman, Sharan L

    2012-01-01

    Accurate measurement of arsenic (As) in air is critical to providing a more robust understanding of arsenic exposures and associated human health risks. Although there is extensive information available on total arsenic in air, less is known about the relative contribution of each arsenic species. To address this data gap, the authors conducted an in-depth review of available information on speciated arsenic in air. The evaluation included the types of species measured and their relative abundance, as well as an analysis of the limitations of current analytical methods. Despite inherent differences in the procedures, most techniques effectively separated arsenic species in the air samples. Common analytical techniques such as inductively coupled plasma mass spectrometry (ICP-MS) and/or hydride generation (HG)- or graphite furnace (GF)-atomic absorption spectrometry (AAS) were used for arsenic measurement in the extracts, and provided some of the most sensitive detection limits. The current analysis demonstrated that, despite limited comparability among studies due to differences in seasonal factors, study duration, sample collection methods, and analytical methods, research conducted to date is adequate to show that arsenic in air is mainly in the inorganic form. Reported average concentrations of As(III) and As(V) ranged up to 7.4 and 10.4 ng/m3, respectively, with As(V) being more prevalent than As(III) in most studies. Concentrations of the organic methylated arsenic compounds are negligible (in the pg/m3 range). However, because of the variability in study methods and measurement methodology, the authors were unable to determine the variation in arsenic composition as a function of source or particulate matter (PM) fraction. In this work, the authors include the implications of arsenic speciation in air on potential exposure and risks. The authors conclude that it is important to synchronize sample collection, preparation, and analytical techniques in order to generate

  14. An Updated Methodology for Enhancing Risk Monitors with Integrated Equipment Condition Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hirt, Evelyn H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coles, Garill A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bonebrake, Christopher A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ivans, William J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wootan, David W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mitchell, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-07-18

    Small modular reactors (SMRs) generally include reactors with electric output of ~350 MWe or less (this cutoff varies somewhat but is substantially less than full-size plant output of 700 MWe or more). Advanced SMRs (AdvSMRs) refer to a specific class of SMRs and are based on modularization of advanced reactor concepts. Enhancing the affordability of AdvSMRs will be critical to ensuring wider deployment, as AdvSMRs suffer from the loss of economies of scale inherent in small reactors when compared to large (greater than ~600 MWe output) reactors, and the controllable day-to-day costs of AdvSMRs will be dominated by operation and maintenance (O&M) costs. Technologies that help characterize real-time risk are important for controlling O&M costs. Risk monitors are used in current nuclear power plants to provide a point-in-time estimate of the system risk given the current plant configuration (e.g., equipment availability, operational regime, and environmental conditions). However, current risk monitors are unable to support these capability requirements as they do not take into account plant-specific normal, abnormal, and deteriorating states of active components and systems. This report documents technology developments towards enhancing risk monitors that, if integrated with supervisory plant control systems, can provide these capabilities and meet the goals of controlling O&M costs. The report describes research results on augmenting an initial methodology for enhanced risk monitors that integrate real-time information about equipment condition and probability of failure (POF) into risk monitors. Methods to propagate uncertainty through the enhanced risk monitor are evaluated. Available data to quantify the level of uncertainty and the POF of key components are examined for their relevance, and a status update of this data evaluation is described. Finally, we describe potential targets for developing new risk metrics that may be useful for studying trade-offs for economic

  15. Methodology for identifying parameters for the TRNSYS model Type 210 - wood pellet stoves and boilers

    Energy Technology Data Exchange (ETDEWEB)

    Persson, Tomas; Fiedler, Frank; Nordlander, Svante

    2006-05-15

    This report describes a method for performing measurements on boilers and stoves and for identifying, from those measurements, parameters for the boiler/stove model TRNSYS Type 210. The model can be used for detailed annual system simulations using TRNSYS. Experience from measurements on three different pellet stoves and four boilers was used to develop this methodology. Recommendations for the set-up of the measurements are given, together with the combustion theory required for data evaluation and preparation. The data evaluation showed that the uncertainties are quite large for the measured flue gas flow rate; for boilers and stoves with a high fraction of energy going to the water jacket, the calculated heat rate to the room may also have large uncertainties. A methodology for the parameter identification process and identified parameters for two different stoves and three boilers are given. Finally, the identified models are compared with measured data, showing that the model generally agrees well with measurements during both stationary and dynamic conditions.
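
The parameter-identification step can be sketched generically. The model below is a simple first-order step response, not the actual Type 210 boiler/stove model; it only illustrates how parameters (gain, time constant, initial temperature) are fitted to measured data by nonlinear least squares.

```python
import numpy as np
from scipy.optimize import curve_fit

def step_response(t, gain, tau, t0):
    """First-order heat-up: T(t) = T0 + gain * (1 - exp(-t/tau))."""
    return t0 + gain * (1 - np.exp(-t / tau))

# Synthetic "measured" outlet temperature for a one-hour step test
t = np.linspace(0, 3600, 120)
rng = np.random.default_rng(1)
measured = step_response(t, 35.0, 600.0, 20.0) + rng.normal(0, 0.3, t.size)

popt, _ = curve_fit(step_response, t, measured, p0=[30.0, 400.0, 15.0])
gain, tau, t0 = popt
print(f"gain = {gain:.1f} K, tau = {tau:.0f} s, T0 = {t0:.1f} degC")
```

The same fitting pattern extends to the multi-node thermal models used in TRNSYS: measured inputs drive the model, and the parameter vector is adjusted until simulated and measured outputs agree.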

  16. Phasor Measurement Unit under Interference Conditions

    DEFF Research Database (Denmark)

    Ghiga, Radu; Martin, Kenneth E.; Wu, Qiuwei

    2017-01-01

    interference condition scenarios. In the first scenario, noise is added to the PMU input signal. The test runs a sweep of Signalto-Noise Ratios (SNR) and the accuracy versus the noise level is obtained. The second scenario injects multiple harmonics with the input to test the influence on accuracy. The last...... scenario focuses on instrument transformer saturation which leads to a modified waveform injected in the PMU. This test goes through different levels of Current Transformer (CT) saturation and analyzes the effect of saturation on the accuracy of PMUs. The test results show PMU measurements will be degraded...

  17. Three-dimensional sensing methodology combining stereo vision and phase-measuring profilometry based on dynamic programming

    Science.gov (United States)

    Lee, Hyunki; Kim, Min Young; Moon, Jeon Il

    2017-12-01

    Phase measuring profilometry and moiré methodology have been widely applied to the three-dimensional shape measurement of target objects because of their high measuring speed and accuracy. However, these methods suffer from inherent limitations known as the correspondence problem and the 2π-ambiguity problem. Although a sensing method that combines well-known stereo vision and the phase measuring profilometry (PMP) technique has been developed to overcome these problems, its sensing speed and measurement accuracy still require definite improvement. We propose a dynamic programming-based stereo PMP method to acquire more reliable depth information in a relatively short time. The proposed method efficiently fuses information from two stereo sensors in terms of phase and intensity simultaneously, based on a newly defined cost function for dynamic programming. In addition, the important parameters are analyzed from the viewpoint of the 2π-ambiguity problem and measurement accuracy. To analyze the influence of important hardware and software parameters related to the measurement performance, and to verify its efficiency, accuracy, and sensing speed, a series of experimental tests was performed with various objects and sensor configurations.

  18. A brief review of strength and ballistic assessment methodologies in sport.

    Science.gov (United States)

    McMaster, Daniel Travis; Gill, Nicholas; Cronin, John; McGuigan, Michael

    2014-05-01

    An athletic profile should encompass the physiological, biomechanical, anthropometric and performance measures pertinent to the athlete's sport and discipline. The measurement systems and procedures used to create these profiles are constantly evolving and becoming more precise and practical. This is a review of strength and ballistic assessment methodologies used in sport, a critique of current maximum strength [one-repetition maximum (1RM) and isometric strength] and ballistic performance (bench throw and jump capabilities) assessments for the purpose of informing practitioners and evolving current assessment methodologies. The reliability of the various maximum strength and ballistic assessment methodologies was reported in the form of intra-class correlation coefficients (ICC) and coefficients of variation (%CV). Mean percent differences (Mdiff = [|Xmethod1 - Xmethod2| / (Xmethod1 + Xmethod2)] x 100) and effect sizes (ES = [Xmethod2 - Xmethod1] ÷ SDmethod1) were used to assess the magnitude and spread of methodological differences for a given performance measure of the included studies. Studies were grouped and compared according to their respective performance measure and movement pattern. The various measurement systems (e.g., force plates, position transducers, accelerometers, jump mats, optical motion sensors and jump-and-reach apparatuses) and assessment procedures (i.e., warm-up strategies, loading schemes and rest periods) currently used to assess maximum isometric squat and mid-thigh pull strength (ICC > 0.95; CV 0.91; CV ballistic (vertical jump and bench throw) capabilities (ICC > 0.82; CV ballistic performance in recreational and elite athletes alike. However, the reader needs to be cognisant of the inherent differences between measurement systems, as selection will inevitably affect the outcome measure. The strength and conditioning practitioner should also carefully consider the benefits and limitations of the different measurement
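
The two comparison statistics defined in the abstract can be written directly as code. The jump-height numbers below are hypothetical, purely to show the calculation.

```python
import numpy as np

def mean_percent_diff(x1, x2):
    """Mdiff = |mean1 - mean2| / (mean1 + mean2) * 100, per the review."""
    m1, m2 = np.mean(x1), np.mean(x2)
    return abs(m1 - m2) / (m1 + m2) * 100

def effect_size(x1, x2):
    """ES = (mean2 - mean1) / SD of method 1, per the review."""
    return (np.mean(x2) - np.mean(x1)) / np.std(x1, ddof=1)

# Hypothetical jump heights (cm): force plate vs. jump mat
force_plate = np.array([40.2, 42.1, 39.8, 41.5, 40.9])
jump_mat    = np.array([42.0, 44.3, 41.2, 43.6, 42.8])

print("Mdiff:", round(mean_percent_diff(force_plate, jump_mat), 2))
print("ES:", round(effect_size(force_plate, jump_mat), 2))
```

A non-trivial Mdiff or ES between two systems measuring the same jumps is exactly the kind of between-system bias the review cautions practitioners about.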

  19. Scanner image methodology (SIM) to measure dimensions of leaves ...

    African Journals Online (AJOL)

    A scanner image methodology was used to determine plant dimensions, such as leaf area, length and width. The values obtained using SIM were compared with those recorded by the LI-COR leaf area meter. Bias, linearity, reproducibility and repeatability (R&R) were evaluated for SIM. Different groups of leaves were ...

  20. Improving inferior vena cava filter retrieval rates with the define, measure, analyze, improve, control methodology.

    Science.gov (United States)

    Sutphin, Patrick D; Reis, Stephen P; McKune, Angie; Ravanzo, Maria; Kalva, Sanjeeva P; Pillai, Anil K

    2015-04-01

    To design a sustainable process to improve optional inferior vena cava (IVC) filter retrieval rates based on the Define, Measure, Analyze, Improve, Control (DMAIC) methodology of the Six Sigma process improvement paradigm. DMAIC, an acronym for Define, Measure, Analyze, Improve, and Control, was employed to design and implement a quality improvement project to increase IVC filter retrieval rates at a tertiary academic hospital. Retrievable IVC filters were placed in 139 patients over a 2-year period. The baseline IVC filter retrieval rate (n = 51) was reviewed through a retrospective analysis, and two strategies were devised to improve the filter retrieval rate: (a) mailing of letters to clinicians and patients for patients who had filters placed within 8 months of implementation of the project (n = 43) and (b) a prospective automated scheduling of a clinic visit at 4 weeks after filter placement for all new patients (n = 45). The effectiveness of these strategies was assessed by measuring the filter retrieval rates and estimated increase in revenue to interventional radiology. IVC filter retrieval rates increased from a baseline of 8% to 40% with the mailing of letters and to 52% with the automated scheduling of a clinic visit 4 weeks after IVC filter placement. The estimated revenue per 100 IVC filters placed increased from $2,249 to $10,518 with the mailing of letters and to $17,022 with the automated scheduling of a clinic visit. Using the DMAIC methodology, a simple and sustainable quality improvement intervention was devised that markedly improved IVC filter retrieval rates in eligible patients. Copyright © 2015 SIR. Published by Elsevier Inc. All rights reserved.

  1. In situ measurement of heavy metals in water using portable EDXRF and APDC pre-concentration methodology

    International Nuclear Information System (INIS)

    Melquiades, Fabio L.; Parreira, Paulo S.; Appoloni, Carlos R.; Silva, Wislley D.; Lopes, Fabio

    2007-01-01

    With the objective of identifying and quantifying metals in water and obtaining results at the sampling site, an Energy Dispersive X-Ray Fluorescence (EDXRF) methodology with portable equipment was employed. In this work, metal concentration results are presented for water samples from two points in the city of Londrina. The analyses were performed in situ, measuring in natura water and samples pre-concentrated on membranes. The work consisted of using a portable X-ray tube to excite the samples and a Si-PIN detector with standard data acquisition electronics to register the spectra. The samples were filtered through membranes to retain suspended particulate matter. Afterwards, the APDC precipitation methodology was applied for sample pre-concentration, with subsequent filtering through membranes. For in natura samples, total iron concentrations of 254 ± 30 mg L-1 in the Capivara River and 63 ± 9 mg L-1 at Igapo Lake were found. For the membrane measurements, the results for suspended particulate matter in the Capivara River were, in mg L-1: 31.0 ± 2.5 (Fe), 0.17 ± 0.03 (Cu) and 0.93 ± 0.08 (Pb), and for dissolved iron 0.038 ± 0.004. For Igapo Lake, only Fe was quantified: 1.66 ± 0.19 mg L-1 for suspended particulate iron and 0.79 ± 0.11 mg L-1 for dissolved iron. In 4 h of work in the field it was possible to filter 14 membranes and measure around 16 samples. The performance of the equipment was very good and the results are satisfactory for in situ measurements employing a portable instrument. (author)

  2. Ozone, spectral irradiance and aerosol measurements with the Brewer spectroradiometer

    International Nuclear Information System (INIS)

    Marenco, F.; Di Sarra, A.

    2001-01-01

    In this technical report, a detailed description of the Brewer spectroradiometer, a widespread instrument for ozone and ultraviolet radiation measurements, is given. The methodologies used to measure these quantities and to calibrate the instrument are described in detail. Finally, a new methodology, developed by ENEA to derive the aerosol optical depth from the Brewer routine total ozone measurements, is described. This methodology is based on Langley extrapolation, on the determination of the transmissivity of the Brewer neutral density filters, and on a statistically significant number of half days of measurements obtained in cloud-free conditions. Results of this method, obtained with the Brewer of the ENEA station for climate observations Roberto Sarao, located on the island of Lampedusa, are reported. These results confirm the validity of the method, thanks to independent measurements taken in 1999 with a Multifilter Rotating Shadowband Radiometer. This methodology allows researchers to obtain an aerosol climatology from ozone measurements obtained at several sites worldwide.

  3. Providing hierarchical approach for measuring supply chain performance using AHP and DEMATEL methodologies

    Directory of Open Access Journals (Sweden)

    Ali Najmi

    2010-06-01

    Full Text Available Measuring the performance of a supply chain is normally a function of various parameters. Such a problem often involves a multiple criteria decision making (MCDM) problem where different criteria need to be defined and calculated properly. During the past two decades, the analytic hierarchy process (AHP) and DEMATEL have been among the most popular MCDM approaches for prioritizing various attributes. This paper uses a new methodology, a combination of AHP and DEMATEL, to rank various parameters affecting the performance of the supply chain. DEMATEL is used to understand the relationships between the comparison metrics, and AHP is used to integrate them into a value for the overall performance.
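
The DEMATEL step mentioned above can be sketched with a hypothetical three-criterion direct-influence matrix (the supply-chain metrics themselves are not given in the record): the total-relation matrix T = N(I - N)^-1 yields the prominence (r + c) and net cause/effect (r - c) indicators.

```python
import numpy as np

# Hypothetical direct-influence matrix (0-4 scale) for three criteria
D = np.array([
    [0.0, 3.0, 2.0],
    [1.0, 0.0, 3.0],
    [2.0, 1.0, 0.0],
])

N = D / D.sum(axis=1).max()                  # normalize by largest row sum
T = N @ np.linalg.inv(np.eye(3) - N)         # total-relation matrix

r, c = T.sum(axis=1), T.sum(axis=0)
prominence = r + c                           # overall importance of each criterion
relation = r - c                             # net cause (+) or effect (-)
print("prominence:", prominence.round(2))
print("relation:  ", relation.round(2))
```

In the combined methodology, the relationships read off from T inform the structure of the hierarchy, and AHP then supplies the weights that aggregate the metrics into one performance value.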

  4. A methodology for the measure of secondary homes tourist flows at municipal level

    Directory of Open Access Journals (Sweden)

    Andrea Guizzardi

    2007-10-01

    Full Text Available The present public statistical system does not provide information concerning second-home tourist flows at the sub-regional level. This lack limits local administrations' ability to take decisions about environmental, territorial and productive development, as well as regional governments' ability to allocate public financing fairly. In this work, this information gap is overcome by proposing an indirect estimation methodology. Municipal electric power consumption is proposed as an indicator of stays in secondary homes. The indicator is connected to tourism flows by considering both measurement errors and the factors modifying local power demand. The application to the Emilia-Romagna regional case allows us to verify the results' coherence with official statistics, as well as to assess the municipalities' tourist vocation.

  5. A Case Study of Six Sigma Define-Measure-Analyze-Improve-Control (DMAIC Methodology in Garment Sector

    Directory of Open Access Journals (Sweden)

    Abdur Rahman

    2017-12-01

    Full Text Available This paper demonstrates the empirical application of the Six Sigma Define-Measure-Analyze-Improve-Control (DMAIC) methodology to reduce product defects within a garment manufacturing organization in Bangladesh. The DMAIC methodology was followed to investigate defects and root causes and to provide a solution to eliminate them. The analysis indicated that broken stitches and open seams influenced the number of defective products. Design of experiments (DOE) and analysis of variance (ANOVA) techniques were combined to statistically determine the correlation of broken stitches and open seams with defects, as well as to define the optimum values needed to eliminate the defects. Thus, a reduction of about 35% in garment defects was achieved, which helped the organization studied to reduce its defects and improve its Sigma level from 1.7 to 3.4.

  6. Quantification of Greenhouse Gas Emission Rates from strong Point Sources by Airborne IPDA-Lidar Measurements: Methodology and Experimental Results

    Science.gov (United States)

    Ehret, G.; Amediek, A.; Wirth, M.; Fix, A.; Kiemle, C.; Quatrevalet, M.

    2016-12-01

    We report on a new method, and on its first demonstration, to quantify emission rates from strong greenhouse gas (GHG) point sources using airborne Integrated Path Differential Absorption (IPDA) lidar measurements. In order to build trust in the emission rates self-reported by countries, verification against independent monitoring systems is a prerequisite for checking the reported budget. A significant fraction of the total anthropogenic emission of CO2 and CH4 originates from localized strong point sources such as large energy production sites or landfills. Neither is monitored with sufficient accuracy by the current observation system. There is debate over whether airborne remote sensing could fill this gap by inferring those emission rates from budgeting or Gaussian plume inversion approaches, whereby measurements of the GHG column abundance beneath the aircraft are used to constrain inverse models. In contrast to passive sensors, the use of an active instrument like CHARM-F for such emission verification measurements is new. CHARM-F is a new airborne IPDA lidar devised for the German research aircraft HALO for the simultaneous measurement of the column-integrated dry-air mixing ratios of CO2 and CH4, commonly denoted as XCO2 and XCH4, respectively. It has been successfully tested in a series of flights over Central Europe to assess its performance under various reflectivity conditions and in strongly varying topography such as the Alps. The analysis of a methane plume measured in the crosswind direction of a coal mine ventilation shaft revealed an instantaneous emission rate of 9.9 ± 1.7 kt CH4 yr-1. We discuss the methodology of our point-source estimation approach and give an outlook on the CoMet field experiment, scheduled in 2017, for the measurement of anthropogenic and natural GHG emissions by a combination of active and passive remote sensing instruments on research aircraft.
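
The cross-wind budgeting approach described above can be sketched with synthetic numbers (chosen only to give a rate of the same order as the reported 9.9 kt/yr; the wind speed, plume width and peak enhancement are all assumptions, not the CHARM-F retrieval): the emission rate is the mean wind speed times the cross-wind integral of the column mass enhancement.

```python
import numpy as np

u = 5.0                                  # m/s, assumed mean wind speed
y = np.linspace(-2000, 2000, 4001)       # m, cross-wind coordinate
A, sigma = 8.35e-5, 300.0                # kg/m^2 peak enhancement; m plume width
dX = A * np.exp(-y**2 / (2 * sigma**2))  # Gaussian column mass enhancement

dy = y[1] - y[0]
q = u * dX.sum() * dy                    # kg/s, cross-wind mass flux
q_kt_per_yr = q * 3.1536e7 / 1e6         # convert to kt per year
print(f"emission rate: {q_kt_per_yr:.1f} kt/yr")
```

In practice the column enhancement comes from the IPDA XCH4 retrieval along the flight track, and the dominant uncertainty is usually the wind speed within the plume layer.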

  7. Nuclear power plant simulation facility evaluation methodology

    International Nuclear Information System (INIS)

    Haas, P.M.; Carter, R.J.; Laughery, K.R. Jr.

    1985-01-01

    A methodology for the evaluation of nuclear power plant simulation facilities with regard to their acceptability for use in the US Nuclear Regulatory Commission (NRC) operator licensing exam is described. The evaluation is based primarily on simulator fidelity, but incorporates some aspects of direct operator/trainee performance measurement. The panel presentation and paper discuss data requirements, data collection, data analysis and criteria for conclusions regarding the fidelity evaluation, and summarize the proposed use of direct performance measurement. While field testing and refinement of the methodology are recommended, this initial effort provides a firm basis for NRC to fully develop the necessary methodology

  8. Enzymatic Phorbol Esters Degradation using the Germinated Jatropha Curcas Seed Lipase as Biocatalyst: Optimization Process Conditions by Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Avita Kusuma Wardhani

    2016-10-01

    Full Text Available Utilization of Jatropha curcas seed cake is limited by the presence of phorbol esters (PE), which are its main toxic compounds and are heat stable. The objective of this research was to optimize the reaction conditions of the enzymatic PE degradation of the defatted Jatropha curcas seed cake (DJSC) using the acetone-dried lipase from germinated Jatropha curcas seeds as a biocatalyst. Response Surface Methodology (RSM) using a three-factor, three-level Box-Behnken design was used to evaluate the effects of the reaction time, the ratio of buffer volume to DJSC, and the ratio of enzyme to DJSC on PE degradation. The results showed that the optimum conditions of PE degradation were 29.33 h, 51.11 : 6 (mL/g), and 30.10 : 5 (U/g cake) for the reaction time, the ratio of buffer volume to DJSC, and the ratio of enzyme to DJSC, respectively. The predicted degradation of PE was 98.96% and was not significantly different from the validated data of PE degradation. PE content was 0.035 mg/g, which is lower than the PE content of non-toxic Jatropha seeds. The results indicated that enzymatic degradation of PE might be a promising method for degradation of PE. Copyright © 2016 BCREC GROUP. All rights reserved. Received: 22nd December 2015; Revised: 1st April 2016; Accepted: 14th April 2016. How to Cite: Wardhani, A.K., Hidayat, C., Hastuti, P. (2016). Enzymatic Phorbol Esters Degradation using the Germinated Jatropha Curcas Seed Lipase as Biocatalyst: Optimization Process Conditions by Response Surface Methodology. Bulletin of Chemical Reaction Engineering & Catalysis, 11(3): 346-353 (doi:10.9767/bcrec.11.3.574.346-353). Permalink/DOI: http://doi.org/10.9767/bcrec.11.3.574.346-353

  9. Measurement properties of tools used to assess depression in adults with and without autism spectrum conditions: A systematic review.

    Science.gov (United States)

    Cassidy, S A; Bradley, L; Bowen, E; Wigham, S; Rodgers, J

    2018-01-23

    Depression is the most commonly experienced mental health condition in adults with autism spectrum conditions (ASC). However, it is unclear what tools are currently being used to assess depression in ASC, or whether tools need to be adapted for this group. This systematic review therefore aimed to identify tools used to assess depression in adults with and without ASC, and then evaluate these tools for their appropriateness and measurement properties. Medline, PsycINFO and Web of Knowledge were searched for studies of depression in: (a) adults with ASC, without co-morbid intellectual disability; and (b) adults from the general population without co-morbid conditions. Articles examining the measurement properties of these tools were then searched for using a methodological filter in PubMed, and the quality of the evidence was evaluated using the COSMIN checklist. Twelve articles were identified which utilized three tools to assess depression in adults with ASC, but only one article which assessed the measurement properties of one of these tools was identified and thus evaluated. Sixty-four articles were identified which utilized five tools to assess depression in general population adults, and fourteen articles had assessed the measurement properties of these tools. Overall, two tools were found to be robust in their measurement properties in the general population: the Beck Depression Inventory (BDI-II) and the Patient Health Questionnaire (PHQ-9). Crucially, only one study was identified from the COSMIN search, which showed weak evidence in support of the measurement properties of the BDI-II in an ASC sample. Implications for effective measurement of depression in ASC are discussed. Autism Res 2018. © 2018 The Authors Autism Research published by International Society for Autism Research and Wiley Periodicals, Inc. Depression is the most common mental health problem experienced by adults with autism. However, the current study found very limited evidence

  10. Quality control of CT units - methodology of performance I

    International Nuclear Information System (INIS)

    Prlic, I.; Radalj, Z.

    1996-01-01

    The increasing use of x-ray computed tomography (CT) systems in diagnostic imaging requires an efficient means of evaluating their performance. This paper therefore presents a Quality Control (Q/C) procedure for measuring and defining CT scanner performance using a special phantom based on the recommendations of the American Association of Physicists in Medicine (AAPM). The performance parameters measurable with the phantom represent the scanner's capability, so periodic evaluation of these parameters enables users to track the stability of the CT scanner regardless of the manufacturer, model or software option. Five important performance parameters are to be measured: noise, contrast scale, nominal tomographic section thickness, and high- and low-contrast resolution (MTF). The sixth parameter is the dose per scan and slice, which gives the patient dose for a given diagnostic procedure. The last, but not least, parameter is the final image quality as delivered by the image processing device connected to the scanner; this is the final medical information needed for good medical practice according to Quality Assurance (Q/A) procedures in diagnostic radiology. The results of the performance evaluation must be free of environmental influences (the measurements are to be made under defined conditions according to Q/A). This paper gives no detailed methodological recipe but shows one example, the system noise and linearity measurements, together with the need for and relevant results of such measurements. The rest of the methodology is to be published. (author)
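    Of the parameters listed, system noise and linearity are the simplest to compute: noise is the standard deviation of CT numbers in a uniform phantom ROI, and linearity (contrast scale) is the straight-line fit of measured versus nominal CT numbers of known phantom inserts. A sketch with simulated data (all HU values below are illustrative assumptions, not AAPM tolerances):

    ```python
    import numpy as np

    def ct_noise(roi_hu):
        """Noise: sample standard deviation of CT numbers (HU) in a uniform phantom ROI."""
        return float(np.std(roi_hu, ddof=1))

    def ct_linearity(nominal_hu, measured_hu):
        """Linearity/contrast scale: slope and intercept of measured vs. nominal CT numbers."""
        slope, intercept = np.polyfit(nominal_hu, measured_hu, 1)
        return float(slope), float(intercept)

    # Simulated uniform water ROI: mean 0 HU, sigma ~5 HU
    rng = np.random.default_rng(0)
    roi = rng.normal(0.0, 5.0, size=(64, 64))
    print(round(ct_noise(roi.ravel()), 1))

    # Inserts of known nominal CT number (air, plastics, water, bone-equivalent)
    nominal = np.array([-1000.0, -100.0, 0.0, 120.0, 1000.0])
    measured = np.array([-998.0, -101.5, 0.5, 119.0, 1002.0])
    slope, intercept = ct_linearity(nominal, measured)
    print(round(slope, 3))  # a slope near 1 and intercept near 0 indicate good linearity
    ```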

  11. The COSMIN checklist for assessing the methodological quality of studies on measurement properties of health status measurement instruments: an international Delphi study.

    Science.gov (United States)

    Mokkink, Lidwine B; Terwee, Caroline B; Patrick, Donald L; Alonso, Jordi; Stratford, Paul W; Knol, Dirk L; Bouter, Lex M; de Vet, Henrica C W

    2010-05-01

    Aim of the COSMIN study (COnsensus-based Standards for the selection of health status Measurement INstruments) was to develop a consensus-based checklist to evaluate the methodological quality of studies on measurement properties. We present the COSMIN checklist and the agreement of the panel on the items of the checklist. A four-round Delphi study was performed with international experts (psychologists, epidemiologists, statisticians and clinicians). Of the 91 invited experts, 57 agreed to participate (63%). Panel members were asked to rate their (dis)agreement with each proposal on a five-point scale. Consensus was considered to be reached when at least 67% of the panel members indicated 'agree' or 'strongly agree'. Consensus was reached on the inclusion of the following measurement properties: internal consistency, reliability, measurement error, content validity (including face validity), construct validity (including structural validity, hypotheses testing and cross-cultural validity), criterion validity, responsiveness, and interpretability. The latter was not considered a measurement property. The panel also reached consensus on how these properties should be assessed. The resulting COSMIN checklist could be useful when selecting a measurement instrument, peer-reviewing a manuscript, designing or reporting a study on measurement properties, or for educational purposes.

  12. Using a model of the performance measures in Soft Systems Methodology (SSM) to take action: a case study in health care

    NARCIS (Netherlands)

    Kotiadis, K.; Tako, A.; Rouwette, E.A.J.A.; Vasilakis, C.; Brennan, J.; Gandhi, P.; Wegstapel, H.; Sagias, F.; Webb, P.

    2013-01-01

    This paper uses a case study of a multidisciplinary colorectal cancer team in health care to explain how a model of performance measures can lead to debate and action in Soft System Methodology (SSM). This study gives a greater emphasis and role to the performance measures than currently given in

  13. Measuring domestic water use: a systematic review of methodologies that measure unmetered water use in low-income settings.

    Science.gov (United States)

    Tamason, Charlotte C; Bessias, Sophia; Villada, Adriana; Tulsiani, Suhella M; Ensink, Jeroen H J; Gurley, Emily S; Mackie Jensen, Peter Kjaer

    2016-11-01

    To present a systematic review of methods for measuring domestic water use in settings where water meters cannot be used. We systematically searched EMBASE, PubMed, Water Intelligence Online, Water Engineering and Development Center, IEEExplore, Scielo, and Science Direct databases for articles that reported methodologies for measuring water use at the household level where water metering infrastructure was absent or incomplete. A narrative review explored similarities and differences between the included studies and provides recommendations for future research on water use. A total of 21 studies were included in the review. Methods ranged from single-day to 14-consecutive-day visits, and water use recall periods ranged from 12 h to 7 days. Data were collected using questionnaires, observations or both. Many studies only collected information on water that was carried into the household, and some failed to mention whether water was used outside the home. Water use in the selected studies was found to range from 2 to 113 l per capita per day. No standardised methods for measuring unmetered water use were found, which brings into question the validity and comparability of studies that have measured unmetered water use. In future studies, it will be essential to define all components that make up water use and determine how they will be measured. A pre-study that involves observations and direct measurements during water collection periods (these will have to be determined through questioning) should be used to determine optimal methods for obtaining water use information in a survey. Day-to-day and seasonal variation should be included. A study that investigates water use recall is warranted to further develop standardised methods to measure water use; in the meantime, water use recall should be limited to 24 h or fewer. © 2016 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  14. Review of effectiveness-evaluation methodologies for safeguards and security systems

    International Nuclear Information System (INIS)

    Dowdy, E.J.; Mangan, D.L.

    1982-06-01

    We discuss the factors that influence the effectiveness of safeguards and security measures and the characteristics required of effectiveness evaluation methodologies. Within this context and from a utility standpoint, we review those effectiveness evaluation methodologies that have been developed. Our principal recommendation concerns the application and concomitant validation of existing methodologies. This recommendation derives from our conclusion that there has been a gross imbalance between the effort spent on methodology development and the application of those methodologies. Only for those safeguards measures that do not seem to be covered by existing methodologies or that seem to be inadequately covered do we suggest development. 44 references

  15. Instrumentation for localized measurements in two-phase flow conditions

    International Nuclear Information System (INIS)

    Neff, G.G.; Averill, R.H.; Shurts, S.W.

    1979-01-01

    Three types of instrumentation developed by EG and G Idaho, Inc., and its predecessor, Aerojet Nuclear Company, at the Idaho National Engineering Laboratory to investigate two-phase flow phenomena in a nuclear reactor at the Loss-of-Fluid Test (LOFT) facility are discussed: (a) a combination drag disc-turbine transducer (DTT), (b) a multibeam nuclear-hardened gamma densitometer system, and (c) a conductivity-sensitive liquid level transducer (LLT). The DTT obtains data on the complex problem of two-phase flow conditions in the LOFT primary coolant system during a loss-of-coolant experiment (LOCE). The discussion of the DTT describes how a turbine, measuring coolant velocity, and a drag disc, measuring coolant momentum flux, can together provide valuable mass flow data. The nuclear-hardened gamma densitometer is used to obtain density and flow regime information for two-phase flow in the LOFT primary coolant system during a LOCE. The LLT is used to measure water and steam conditions within the LOFT reactor core during a LOCE. The LLT design and the type of data obtained are described.
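    The drag disc-turbine combination yields mass flow because the drag disc responds to momentum flux (ρv²) while the turbine responds to velocity (v); their ratio is the mass flux ρv. A sketch of this reduction (the density, velocity and pipe area are assumed single-phase example values, not LOFT data):

    ```python
    def mass_flow_rate(momentum_flux, velocity, pipe_area):
        """
        Drag disc measures momentum flux rho*v^2 (kg/(m*s^2)); turbine measures
        velocity v (m/s). Mass flux rho*v = momentum_flux / velocity; multiplying
        by the pipe flow area gives the mass flow rate in kg/s.
        """
        if velocity == 0:
            raise ValueError("turbine velocity must be nonzero")
        return momentum_flux / velocity * pipe_area

    # Example: hot water, rho = 740 kg/m^3, v = 10 m/s, 0.05 m^2 flow area
    rho, v, area = 740.0, 10.0, 0.05
    g = rho * v * v  # momentum flux the drag disc would sense: 74000 kg/(m*s^2)
    print(round(mass_flow_rate(g, v, area), 3))  # → 370.0 (= rho*v*area, kg/s)
    ```

    In actual two-phase conditions the interpretation is harder (slip between phases, flow-regime effects), which is why the densitometer data are used alongside the DTT.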

  16. A novel methodology for online measurement of thoron using Lucas scintillation cell

    International Nuclear Information System (INIS)

    Eappen, K.P.; Sapra, B.K.; Mayya, Y.S.

    2007-01-01

    The use of the Lucas scintillation cell (LSC) technique for thoron estimation requires a modified methodology as opposed to radon estimation. While in the latter the α counting is performed after a delay period varying between a few hours and a few days, in the case of thoron the α counting has to be carried out immediately after sampling owing to the short half-life of thoron (55 s). This is best achieved with an on-line LSC sampling and counting system. However, the half-life of the thoron decay product ²¹²Pb being 10.6 h, background accumulates in the LSC during online measurements, and hence subsequent use of the LSC is erroneous until the normal background level is re-established in the cell. This problem can be circumvented by correcting for the average background counts accumulated during the counting period, which may be theoretically estimated. In this study, a methodology has been developed to estimate the true counts due to thoron. A linear regression between the counts obtained experimentally and the fractional decay in regular intervals of time is used to obtain the actual thoron concentration. The novelty of this approach is that the background of the cell is automatically estimated as the intercept of the regression line. The results obtained by this technique compare well with the two-filter method and with the thoron concentration produced by a standard thoron source. However, the LSC as such cannot be used for environmental samples because its minimum detection level is comparable with the thoron concentrations prevailing in the normal atmosphere.
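    The regression step can be sketched as follows: counts in interval i are modeled as N₀·fᵢ + B, where fᵢ is the fractional thoron decay in that interval, so the slope of counts versus fractional decay estimates the true thoron counts and the intercept estimates the per-interval background. A synthetic illustration (the counting schedule and count values are assumptions, not the paper's data):

    ```python
    import numpy as np

    T_HALF = 55.6  # thoron (220Rn) half-life, s
    lam = np.log(2) / T_HALF

    def fractional_decay(t0, t1):
        """Fraction of the initial thoron atoms decaying between times t0 and t1 (s)."""
        return np.exp(-lam * t0) - np.exp(-lam * t1)

    # Counting in successive 60 s intervals after sampling
    starts = np.arange(0, 360, 60)
    f = np.array([fractional_decay(t, t + 60) for t in starts])

    # Synthetic data: 5000 initial thoron counts plus a constant 12-count background
    true_n0, bkg = 5000.0, 12.0
    counts = true_n0 * f + bkg

    slope, intercept = np.polyfit(f, counts, 1)
    print(round(float(slope)), round(float(intercept)))  # recovers the injected 5000 and 12
    ```

    With real data the points scatter around the line, and the intercept gives the background correction without a separate background run.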

  17. Methodology for the conceptual design of solar kitchens

    International Nuclear Information System (INIS)

    Macia G, A F; Estrada V, D A; Chejne J, F; Velasquez, H I; Rengifo, R

    2005-01-01

    A detailed description of a methodology for the conceptual design of solar kitchens is presented, which enables their detailed design. The methodology is based on three main phases that were identified naturally and intuitively, given the characteristics and conditions of the project: a conceptual phase, a detail phase and an execution phase.

  18. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Asja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  19. Radiation measurements during cavities conditioning on APS RF test stand

    International Nuclear Information System (INIS)

    Grudzien, D.M.; Kustom, R.L.; Moe, H.J.; Song, J.J.

    1993-01-01

    In order to determine the shielding structure around the Advanced Photon Source (APS) synchrotron and storage ring RF stations, the X-ray radiation has been measured in the near field and far field regions of the RF cavities during the normal conditioning process. Two cavity types, a prototype 352-MHz single-cell cavity and a 352-MHz five-cell cavity, are used on the APS and are conditioned in the RF test stand. Vacuum measurements are also taken on a prototype 352-MHz single-cell cavity and a 352-MHz five-cell cavity. The data will be compared with data on the five-cell cavities from CERN

  20. A methodological frame for assessing benzene induced leukemia risk mitigation due to policy measures

    International Nuclear Information System (INIS)

    Karakitsios, Spyros P.; Sarigiannis, Dimosthenis A.; Gotti, Alberto; Kassomenos, Pavlos A.; Pilidis, Georgios A.

    2013-01-01

    The study relies on the development of a methodology for assessing the determinants that comprise the overall leukemia risk due to benzene exposure and how these are affected by outdoor and indoor air quality regulation. An integrated modeling environment was constructed comprising traffic emissions, dispersion models, human exposure models and a coupled internal dose/biology-based dose–response risk assessment model, in order to assess the benzene-imposed leukemia risk, as well as the impact of traffic fleet renewal and smoking bans on these levels. Regarding traffic fleet renewal, several “what if” scenarios were tested. The detailed full-chain methodology was applied in a South-Eastern European urban setting in Greece and a limited version of the methodology in Helsinki. The non-smoking population runs an average risk equal to 4.1 × 10⁻⁵, compared to 23.4 × 10⁻⁵ for smokers. The estimated lifetime risk for the examined occupational groups was higher than that estimated for the general public by 10–20%. Active smoking constitutes a dominant parameter for benzene-attributable leukemia risk, much stronger than any related activity, occupational or not. From the assessment of mitigation policies it was found that the associated leukemia risk in the optimum traffic fleet scenario could be reduced by up to 85% for non-smokers and up to 8% for smokers. On the contrary, the smoking ban provided smaller gains (7% for non-smokers, 1% for smokers), while for Helsinki, smoking policies were found to be more efficient than traffic fleet renewal. The methodology proposed above provides a general framework for assessing aggregated exposure and the consequent leukemia risk from benzene (incorporating mechanistic data), capturing exposure and internal dosimetry dynamics and translating changes in exposure determinants to actual changes in population risk, providing a valuable tool for risk management evaluation and consequently for policy support.

  2. Optimization of the Conditions for Extraction of Serine Protease from Kesinai Plant (Streblus asper Leaves Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Md. Zaidul Islam Sarker

    2011-11-01

    Response surface methodology (RSM) using a central composite design (CCD) was employed to optimize the conditions for extraction of serine protease from kesinai (Streblus asper) leaves. The effect of the independent variables, namely temperature (42.5–47.5 °C, X1), mixing time (2–6 min, X2), buffer content (0–80 mL, X3) and buffer pH (4.5–10.5, X4), on the specific activity, storage stability, and temperature and oxidizing-agent stability of serine protease from kesinai leaves was investigated. The study demonstrated that use of the optimum temperature, mixing time, buffer content and buffer pH conditions protected serine protease during extraction, as demonstrated by low activity loss. It was found that the interaction effect of mixing time and buffer content improved the serine protease stability, and that buffer pH had the most significant effect on the specific activity of the enzyme. The most desirable conditions, a temperature of 42.5 °C, a 4 min mixing time and 40 mL of buffer at pH 7.5, were established for serine protease extraction from kesinai leaves.

  3. Optimization of conditions for probiotic curd formulation by Enterococcus faecium MTCC 5695 with probiotic properties using response surface methodology.

    Science.gov (United States)

    Ramakrishnan, Vrinda; Goveas, Louella Concepta; Prakash, Maya; Halami, Prakash M; Narayan, Bhaskar

    2014-11-01

    Enterococcus faecium MTCC 5695, possessing potential probiotic properties as well as enterocin-producing ability, was used as the starter culture. The effect of time (12-24 h) and inoculum level (3-7% v/v) on cell growth, bacteriocin production, antioxidant property, titrable acidity and pH of curd was studied by response surface methodology (RSM). The optimized conditions were 26.48 h and 2.17% v/v inoculum, and the second-order model was validated. Co-cultivation studies revealed that the formulated product had the ability to prevent growth of foodborne pathogens that affect the keeping quality of the product during storage. The results indicated that application of E. faecium MTCC 5695 under the optimized conditions led to the formation of a highly consistent, well-set curd with bioactive and bioprotective properties. The formulated curd with potential probiotic attributes can be used as a therapeutic agent for the treatment of foodborne diseases like traveler's diarrhea and gastroenteritis, thereby helping to improve bowel health.

  4. Fusion integral experiments and analysis and the determination of design safety factors - I: Methodology

    International Nuclear Information System (INIS)

    Youssef, M.Z.; Kumar, A.; Abdou, M.A.; Oyama, Y.; Maekawa, H.

    1995-01-01

    The role of neutronics experimentation and analysis in fusion neutronics research and development programs is discussed. A new methodology was developed to arrive at estimates of design safety factors based on the experimental and analytical results from design-oriented integral experiments. In this methodology, and for a particular nuclear response, R, a normalized density function (NDF) is constructed from the prediction uncertainties, and their associated standard deviations, as found in the various integral experiments where that response, R, is measured. Important statistical parameters are derived from the NDF, such as the global mean prediction uncertainty and the possible spread around it. The method of deriving safety factors from many possible NDFs based on various calculational and measuring methods (among other variants) is also described. Associated with each safety factor is a confidence level, which designers may choose, that the calculated response, R, will not exceed (or will not fall below) the actual measured value. An illustrative example is given of how to construct the NDFs. The methodology is applied in two areas, namely the line-integrated tritium production rate and bulk shielding integral experiments. Conditions under which these factors can be derived and the validity of the method are discussed. 72 refs., 17 figs., 4 tabs
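    Under a normal-density assumption, the safety-factor construction reduces to pooling the per-experiment prediction uncertainties into a global mean and spread, then taking a one-sided quantile at the chosen confidence level. A simplified sketch (the C/E values, the inverse-variance weighting and the pooling of spread are assumptions for illustration, not the paper's exact estimator):

    ```python
    from statistics import NormalDist

    def combined_ndf(means, sigmas):
        """Pool per-experiment prediction uncertainties (C/E - 1) into a global
        mean and a spread, weighting each experiment by 1/sigma^2."""
        w = [1.0 / s**2 for s in sigmas]
        mu = sum(wi * mi for wi, mi in zip(w, means)) / sum(w)
        var = sum(wi * (mi - mu) ** 2 for wi, mi in zip(w, means)) / sum(w)
        return mu, var**0.5

    def safety_factor(mu, sigma, confidence=0.95):
        """Factor f such that, at the given confidence, measurement <= f * calculation,
        assuming the prediction uncertainty is normally distributed."""
        return 1.0 + NormalDist(mu, sigma).inv_cdf(confidence)

    # Three hypothetical integral experiments measuring the same response; values are C/E - 1
    means = [0.04, -0.02, 0.06]
    sigmas = [0.03, 0.05, 0.04]
    mu, s = combined_ndf(means, sigmas)
    print(round(safety_factor(mu, s, 0.95), 3))  # ≈ 1.08 for these assumed inputs
    ```

    Raising the confidence level moves the quantile further into the tail and so yields a larger safety factor, which is the trade-off designers choose.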

  5. A hybrid measure-correlate-predict method for long-term wind condition assessment

    International Nuclear Information System (INIS)

    Zhang, Jie; Chowdhury, Souma; Messac, Achille; Hodge, Bri-Mathias

    2014-01-01

    Highlights: • A hybrid measure-correlate-predict (MCP) methodology with greater accuracy is developed. • Three sets of performance metrics are proposed to evaluate the hybrid MCP method. • Both wind speed and direction are considered in the hybrid MCP method. • The best combination of MCP algorithms is determined. • The developed hybrid MCP method is uniquely helpful for long-term wind resource assessment. - Abstract: This paper develops a hybrid measure-correlate-predict (MCP) strategy to assess long-term wind resource variations at a farm site. The hybrid MCP method uses recorded data from multiple reference stations to estimate long-term wind conditions at a target wind plant site with greater accuracy than is possible with data from a single reference station. The weight of each reference station in the hybrid strategy is determined by the (i) distance and (ii) elevation differences between the target farm site and each reference station. The wind data is divided into sectors according to wind direction, and the MCP strategy is implemented for each wind direction sector separately. The applicability of the proposed hybrid strategy is investigated using five MCP methods: (i) linear regression; (ii) the variance ratio; (iii) the Weibull scale; (iv) artificial neural networks; and (v) support vector regression. To implement the hybrid MCP methodology, we use hourly averaged wind data recorded at five stations in the state of Minnesota between 07-01-1996 and 06-30-2004. Three sets of performance metrics are used to evaluate the hybrid MCP method. The first set of metrics analyzes the statistical performance, including the mean wind speed, wind speed variance, root mean square error, and mean absolute error. The second set of metrics evaluates the distribution of long-term wind speed; to this end, the Weibull distribution and the Multivariate and Multimodal Wind Distribution models are adopted. The third set of metrics analyzes
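    The hybrid MCP idea can be sketched with the simplest of the five listed algorithms, linear regression: fit one target-versus-reference model per station over the concurrent period, then combine the stations' long-term predictions with weights (here inverse distance only; the paper also uses elevation difference and direction sectors). All data and distances below are synthetic:

    ```python
    import numpy as np

    def mcp_linear(ref_short, target_short):
        """Fit target = a*ref + b over the concurrent (short-term) period."""
        a, b = np.polyfit(ref_short, target_short, 1)
        return a, b

    def hybrid_predict(refs_long, models, weights):
        """Weighted combination of the per-station MCP predictions of target wind speed."""
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()
        preds = np.array([a * r + b for (a, b), r in zip(models, refs_long)])
        return (w[:, None] * preds).sum(axis=0)

    rng = np.random.default_rng(7)
    target = rng.weibull(2.0, 500) * 8.0              # "true" target-site wind speed, m/s
    refs = [0.9 * target + rng.normal(0, 0.5, 500),   # two correlated reference stations
            1.1 * target + rng.normal(0, 0.8, 500)]

    # Concurrent measurement period: the first 200 hours
    models = [mcp_linear(r[:200], target[:200]) for r in refs]
    weights = [1 / 12.0, 1 / 30.0]  # inverse of assumed distances (km) to the target site
    pred = hybrid_predict([r[200:] for r in refs], models, weights)
    print(round(float(np.mean(pred)), 2))  # close to the long-term target mean speed
    ```

    The variance-ratio, Weibull-scale and machine-learning variants slot into the same structure by replacing `mcp_linear`.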

  6. Ethical and methodological issues in qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions: a critical review.

    Science.gov (United States)

    Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika

    2017-01-01

    Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness.

  7. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
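    The core combination step, folding building protection factors together with the distribution of people across locations, is a population-weighted average of dose reductions. A minimal sketch (the population fractions and protection factors are assumed illustrative values, not the report's):

    ```python
    def population_weighted_exposure(outdoor_dose, groups):
        """
        groups: list of (population_fraction, protection_factor) pairs.
        Each group's exposure is outdoor_dose / protection_factor; the return
        value is the population-averaged dose.
        """
        assert abs(sum(f for f, _ in groups) - 1.0) < 1e-9, "fractions must sum to 1"
        return outdoor_dose * sum(f / pf for f, pf in groups)

    # Assumed posture: 60% in wood-frame homes (PF 3), 30% in masonry
    # apartments (PF 10), 10% outdoors (PF 1)
    groups = [(0.6, 3.0), (0.3, 10.0), (0.1, 1.0)]
    print(round(population_weighted_exposure(100.0, groups), 1))  # → 33.0
    ```

    Repeating the calculation for different postures (e.g., unwarned vs. minimally warned) or times of day amounts to swapping in a different `groups` list, which is how mitigation strategies can be compared.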

  8. Mitigating cutaneous sensation differences during tDCS: comparing sham versus low intensity control conditions.

    Science.gov (United States)

    Brunyé, Tad T; Cantelon, Julie; Holmes, Amanda; Taylor, Holly A; Mahoney, Caroline R

    2014-01-01

    Cutaneous sensations at electrode sites during the administration of direct current brain stimulation may inadvertently influence participants' subjective experience and task performance. The present study evaluated the utility of a methodological variation that substitutes sham administration with very low intensity (0.5 mA) current delivery. We used a 4 × 1 high-definition transcranial direct current stimulation (HD-tDCS) ring-electrode system to target the left dorsolateral prefrontal cortex (Brodmann's Area 9). Four stimulation conditions were compared in a repeated-measures design: sham at 2.0 mA and 0.5 mA intensity versus active at 2.0 mA and 0.5 mA intensity. During stimulation, participants performed a cognitive interference task that activates the cingulo-frontal-parietal network and periodically provided perceived sensation ratings. We demonstrate that a relatively low intensity control condition attenuates otherwise large differences in perceived sensation between active and sham conditions. Critically, behavioral task differences were maintained between the two active conditions. A low intensity control stimulation condition may prove a viable methodological alternative to conventional sham techniques used in repeated-measures designs, though important limitations are discussed. Published by Elsevier Inc.

  9. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to provide a review related to the reproducibility of event-based and time-interval measurement, and to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on the agreement of time-interval measurement; in addition, to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed inter- and intra-judge values greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values exceeded the reference values for high reproducibility for both methodologies. Accuracy (regarding the closeness of raters' judgements to an established criterion), intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. 
A duration of 5 s for an interval appears to be

  10. Measuring method to impulse neutron scattering background in complicated ambient condition

    International Nuclear Information System (INIS)

    Tang Zhangkui; Peng Taiping; Tang Zhengyuan; Liu Hangang; Hu Mengchun; Fan Juan

    2004-01-01

    This paper introduces a measuring method and a calculation formula for the impulse neutron scattering background in complicated ambient conditions. The experiment was carried out in the lab, and the factors affecting the measurement results were analysed. (authors)

  11. Demonstration of an infiltration evaluation methodology

    International Nuclear Information System (INIS)

    Smyth, J.D.; Gee, G.W.; Kincaid, C.T.; Nichols, W.M.; Bresler, E.

    1990-07-01

    An Infiltration Evaluation Methodology (IEM) was developed for the US Nuclear Regulatory Commission (NRC) by Pacific Northwest Laboratory (PNL) to provide a consistent, well formulated approach for evaluating drainage through engineered covers at low-level radioactive waste (LLW) sites. The methodology is designed to help evaluate the ability of proposed waste site covers to minimize drainage for LLW site license applications and for sites associated with the Uranium Mill Tailings Remedial Action (UMTRA) program. The objective of this methodology is to estimate the drainage through an engineered burial site cover system. The drainage estimate can be used as an input to a broader performance assessment methodology currently under development by the NRC. The methodology is designed to simulate, at the field scale, significant factors and hydrologic conditions which determine or influence estimates of infiltration, long-term moisture content profiles, and drainage from engineered covers and barriers. The IEM developed under this study acknowledges the uncertainty inherent in soil properties and quantifies the influence of such uncertainty on the estimates of drainage in engineered cover systems at waste disposal sites. 6 refs., 1 fig

  12. From theory to 'measurement' in complex interventions: Methodological lessons from the development of an e-health normalisation instrument

    Directory of Open Access Journals (Sweden)

    Finch Tracy L

    2012-05-01

    Full Text Available Abstract Background Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. Methods A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. Results The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation ‘processes’ were significantly related to staff members’ perceptions of whether or not e-health had become ‘routine’. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes

  13. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    Science.gov (United States)

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of

  14. The Role of Condition-Specific Preference-Based Measures in Health Technology Assessment.

    Science.gov (United States)

    Rowen, Donna; Brazier, John; Ara, Roberta; Azzabi Zouraq, Ismail

    2017-12-01

    A condition-specific preference-based measure (CSPBM) is a measure of health-related quality of life (HRQOL) that is specific to a certain condition or disease and that can be used to obtain the quality adjustment weight of the quality-adjusted life-year (QALY) for use in economic models. This article provides an overview of the role and the development of CSPBMs, and presents a description of existing CSPBMs in the literature. The article also provides an overview of the psychometric properties of CSPBMs in comparison with generic preference-based measures (generic PBMs), and considers the advantages and disadvantages of CSPBMs in comparison with generic PBMs. CSPBMs typically include dimensions that are important for that condition but may not be important across all patient groups. There are a large number of CSPBMs across a wide range of conditions, and these vary from covering a wide range of dimensions to more symptomatic or uni-dimensional measures. Psychometric evidence is limited but suggests that CSPBMs offer an advantage in more accurate measurement of milder health states. The mean change and standard deviation can differ for CSPBMs and generic PBMs, and this may impact on incremental cost-effectiveness ratios. CSPBMs have a useful role in HTA where a generic PBM is not appropriate, sensitive or responsive. However, due to issues of comparability across different patient groups and interventions, their usage in health technology assessment is often limited to conditions where it is inappropriate to use a generic PBM or sensitivity analyses.
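    As a minimal illustration of how a preference-based quality adjustment weight feeds into the QALY calculation mentioned above (the utility weights and durations below are hypothetical, not taken from any particular CSPBM):

```python
def qalys(utilities_and_years):
    """QALYs = sum of (preference-based utility weight x years spent in that state).
    A utility of 1.0 is full health, 0.0 is equivalent to death."""
    return sum(u * t for u, t in utilities_and_years)

# Hypothetical trajectory: 2 years at utility 0.8, then 3 years at 0.6.
q = qalys([(0.8, 2.0), (0.6, 3.0)])  # 1.6 + 1.8 = 3.4 QALYs
```

    Whether the weights come from a generic PBM or a CSPBM changes the utility values fed in, and hence the incremental QALYs in an economic model, which is why the two kinds of measure can yield different cost-effectiveness ratios.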

  15. Methodology for assessing the probability of corrosion in concrete structures on the basis of half-cell potential and concrete resistivity measurements.

    Science.gov (United States)

    Sadowski, Lukasz

    2013-01-01

    In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The potential E_corr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.
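    A sketch of how the two measurements might be combined per grid point. The potential bands follow the widely used ASTM C876 interpretation (half-cell potentials measured against a Cu/CuSO4 reference electrode); the resistivity thresholds are a common rule of thumb. Neither set of thresholds is taken from the paper itself:

```python
def corrosion_probability(potential_mV, resistivity_kohm_cm):
    """Classify corrosion risk at one grid point from half-cell potential
    (ASTM C876-style bands, vs. Cu/CuSO4) and a resistivity heuristic.
    Thresholds are illustrative, not the paper's calibration."""
    if potential_mV > -200:
        p = "low (<10%)"
    elif potential_mV >= -350:
        p = "uncertain"
    else:
        p = "high (>90%)"
    if resistivity_kohm_cm < 10:
        r = "high rate likely"
    elif resistivity_kohm_cm <= 20:
        r = "moderate"
    else:
        r = "low rate likely"
    return p, r

grid_point = corrosion_probability(-380.0, 8.0)
```

    Evaluating this over every point of the measurement grid yields the kind of corrosion-probability map the methodology produces.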

  16. Methodology for Assessing the Probability of Corrosion in Concrete Structures on the Basis of Half-Cell Potential and Concrete Resistivity Measurements

    Directory of Open Access Journals (Sweden)

    Lukasz Sadowski

    2013-01-01

    Full Text Available In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The potential E_corr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.

  17. Nondestructive assay methodologies in nuclear forensics analysis

    International Nuclear Information System (INIS)

    Tomar, B.S.

    2016-01-01

    In the present chapter, the nondestructive assay (NDA) methodologies used for the analysis of nuclear materials as part of a nuclear forensic investigation are described. These NDA methodologies are based on (i) measurement of the passive gamma rays and neutrons emitted by the radioisotopes present in the nuclear materials, and (ii) measurement of the gamma rays and neutrons emitted after active interrogation of the nuclear materials with a source of X-rays, gamma rays or neutrons

  18. Reconstruction of photon number conditioned states using phase randomized homodyne measurements

    International Nuclear Information System (INIS)

    Chrzanowski, H M; Assad, S M; Bernu, J; Hage, B; Lam, P K; Symul, T; Lund, A P; Ralph, T C

    2013-01-01

    We experimentally demonstrate the reconstruction of a photon number conditioned state without using a photon number discriminating detector. By using only phase randomized homodyne measurements, we reconstruct up to the three photon subtracted squeezed vacuum state. The reconstructed Wigner functions of these states show regions of pronounced negativity, signifying the non-classical nature of the reconstructed states. The techniques presented allow for complete characterization of the role of a conditional measurement on an ensemble of states, and might prove useful in systems where photon counting still proves technically challenging. (paper)

  19. Measuring systems of hard to get objects: problems with analysis of measurement results

    Science.gov (United States)

    Gilewska, Grazyna

    2005-02-01

    Limited accessibility of the metrological parameters of an object is a problem in many measurements, especially for biological objects, whose parameters are very often determined by indirect measurement. A random component dominates the measurement results when access to the measured object is very limited. Every measuring process is subject to conditions that limit how far it can be adjusted (e.g. increasing the number of measurement repetitions to decrease the random limiting error). These may be time or financial limitations or, in the case of a biological object, a small sample volume, the influence of the measuring tool and the observer on the object, or fatigue effects, e.g. in a patient. Taking these difficulties into consideration, the author worked out and verified the practical application of methods for rejecting outlying observations and, subsequently, of novel methods for eliminating measured data with excess variance, so as to decrease the standard deviation of the mean with a limited amount of data and an accepted confidence level. The elaborated methods were verified using measurements of knee-joint space width obtained from radiographs. The measurements were carried out indirectly on digital images of the radiographs. The results confirmed the legitimacy of the elaborated methodology and measurement procedures. Such a methodology is especially important when standard scientific approaches do not bring the expected effects.

  20. Study on fermentation conditions of palm juice vinegar by response surface methodology and development of a kinetic model

    Directory of Open Access Journals (Sweden)

    S. Ghosh

    2012-09-01

    Full Text Available Natural vinegar is one of the fermented products with some potential from a nutraceutical standpoint. The present study is an optimization of the fermentation conditions for palm juice vinegar production from palm juice (Borassus flabellifer) wine, this biochemical process being aided by Acetobacter aceti (NCIM 2251). The physical parameters of the fermentation conditions, such as temperature, pH and time, were investigated by Response Surface Methodology (RSM) with a 2³ factorial central composite design (CCD). The optimum pH, temperature and time were 5.5, 30 °C and 72 h for the highest yield of acetic acid (68.12 g/L). The quadratic model equation had an R² value of 0.992. RSM played an important role in elucidating the basic mechanisms in a complex situation, thus providing better process control by maximizing acetic acid production with the respective physical parameters. At the optimized conditions of temperature, pH and time, and with the help of mathematical kinetic equations, the Monod specific growth rate (µ_max = 0.021 h⁻¹), the maximum logistic specific growth rate (µ′_max = 0.027 h⁻¹) and various other kinetic parameters were calculated, which helped in the validation of the experimental data. Therefore, the established kinetic models may be applied for the production of natural vinegar by fermentation of low-cost palm juice.
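    The logistic specific growth rate quoted above slots into the standard logistic growth law. Only µ′_max = 0.027 h⁻¹ comes from the abstract; the inoculum and carrying-capacity values below are hypothetical:

```python
import math

def logistic_biomass(t, x0, x_max, mu_max):
    """Logistic growth: x(t) = x_max / (1 + (x_max/x0 - 1) * exp(-mu_max * t)).
    x0 = initial biomass, x_max = carrying capacity, mu_max = max specific growth rate."""
    return x_max / (1.0 + (x_max / x0 - 1.0) * math.exp(-mu_max * t))

# Reported mu'_max = 0.027 1/h; x0 = 0.1 and x_max = 5.0 (arbitrary units) are assumed.
x_72h = logistic_biomass(72.0, x0=0.1, x_max=5.0, mu_max=0.027)
```

    The same functional form is what lets the authors validate experimental growth data against the fitted µ′_max.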

  1. Application of data mining techniques in the analysis of indoor hygrothermal conditions

    CERN Document Server

    Ramos, Nuno M M; Almeida, Ricardo M S F; Simões, Maria L; Manuel, Sofia

    2016-01-01

    The main benefit of the book is that it explores available methodologies for both conducting in-situ measurements and adequately exploring the results, based on a case study that illustrates the benefits and difficulties of concurrent methodologies. The case study corresponds to a set of 25 social housing dwellings where an extensive in situ measurement campaign was conducted. The dwellings are located in the same quarter of a city. Measurements included indoor temperature and relative humidity, with continuous log in different rooms of each dwelling, blower-door tests and complete outdoor conditions provided by a nearby weather station. The book includes a variety of scientific and engineering disciplines, such as building physics, probability and statistics and civil engineering. It presents a synthesis of the current state of knowledge for benefit of professional engineers and scientists.

  2. The equipment for low radioactivity measurements in industrial and field conditions

    International Nuclear Information System (INIS)

    Malik, R.; Owczarczyk, A.; Szpilowski, S.; Zenczykiewicz, Z.

    1992-01-01

    Equipment for low-radioactivity measurements in industrial and field conditions has been developed. Three scintillation detectors operate in a coincidence system, with their scintillation crystals separated from one another by lead shielding. The whole measuring system is housed in a lead container with a lead cover. The measuring vessel fills practically all of the free volume of the lead container, and its shape ensures the best possible measurement geometry. (author). 3 figs

  3. Local conditional entropy in measure for covers with respect to a fixed partition

    Science.gov (United States)

    Romagnoli, Pierre-Paul

    2018-05-01

    In this paper we introduce two measure theoretical notions of conditional entropy for finite measurable covers conditioned to a finite measurable partition and prove that they are equal. Using this we state a local variational principle with respect to the notion of conditional entropy defined by Misiurewicz (1976 Stud. Math. 55 176–200) for the case of open covers. This in particular extends the work done in Romagnoli (2003 Ergod. Theor. Dynam. Syst. 23 1601–10), Glasner and Weiss (2006 Handbook of Dynamical Systems vol 1B (Amsterdam: Elsevier)) and Huang et al (2006 Ergod. Theor. Dynam. Syst. 26 219–45).

  4. A Novel Approach to Measuring Muscle Mechanics in Vehicle Collision Conditions

    Directory of Open Access Journals (Sweden)

    Simon Krašna

    2017-06-01

    Full Text Available The aim of the study was to evaluate a novel approach to measuring neck muscle load and activity in vehicle collision conditions. A series of sled tests were performed on 10 healthy volunteers at three severity levels to simulate low-severity frontal impacts. Electrical activity (electromyography, EMG) and muscle mechanical tension were measured bilaterally on the upper trapezius. A novel mechanical contraction (MC) sensor was used to measure the tension on the muscle surface. The neck extensor loads were estimated based on the inverse dynamics approach. The results showed a strong linear correlation (Pearson’s coefficient = 0.821) between the estimated neck muscle load and the muscle tension measured with the MC sensor. The peak of the estimated neck muscle force was delayed by 0.2 ± 30.6 ms on average relative to the peak MC sensor signal, compared with an average delay of 61.8 ± 37.4 ms relative to the peak EMG signal. The observed differences between the EMG and MC sensor signals indicate that the MC sensor offers an additional insight into the analysis of neck muscle load and activity in impact conditions. This approach enables a more detailed assessment of the muscle-tendon complex load of a vehicle occupant in pre-impact and impact conditions.
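    The two comparisons reported above, a linear correlation between signals and a peak-time delay, can be sketched as follows. The short signal traces are made-up illustrations, not data from the sled tests:

```python
def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def peak_delay_ms(signal_a, signal_b, dt_ms):
    """Delay of signal_b's peak relative to signal_a's peak (negative = b leads a)."""
    ia = max(range(len(signal_a)), key=signal_a.__getitem__)
    ib = max(range(len(signal_b)), key=signal_b.__getitem__)
    return (ib - ia) * dt_ms

# Hypothetical estimated neck-extensor force and MC sensor traces, sampled every 10 ms.
force = [0.0, 0.2, 0.8, 1.0, 0.6, 0.3]
mc    = [0.1, 0.3, 0.9, 0.9, 0.5, 0.2]
r = pearson_r(force, mc)
delay = peak_delay_ms(force, mc, dt_ms=10.0)
```

    With real data the traces are resampled to a common time base before comparison; here both functions operate on aligned samples.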

  5. Sources of particulate matter components in the Athabasca oil sands region: investigation through a comparison of trace element measurement methodologies

    Science.gov (United States)

    Phillips-Smith, Catherine; Jeong, Cheol-Heon; Healy, Robert M.; Dabek-Zlotorzynska, Ewa; Celo, Valbona; Brook, Jeffrey R.; Evans, Greg

    2017-08-01

    The province of Alberta, Canada, is home to three oil sands regions which, combined, contain the third largest deposit of oil in the world. Of these, the Athabasca oil sands region is the largest. As part of Environment and Climate Change Canada's program in support of the Joint Canada-Alberta Implementation Plan for Oil Sands Monitoring program, concentrations of trace elements in PM2.5 (particulate matter smaller than 2.5 µm in diameter) were measured through two campaigns that involved different methodologies: a long-term filter campaign and a short-term intensive campaign. In the long-term campaign, 24 h filter samples were collected once every 6 days over a 2-year period (December 2010-November 2012) at three air monitoring stations in the regional municipality of Wood Buffalo. For the intensive campaign (August 2013), hourly measurements were made with an online instrument at one air monitoring station; daily filter samples were also collected. The hourly and 24 h filter data were analyzed individually using positive matrix factorization. Seven emission sources of PM2.5 trace elements were thereby identified: two types of upgrader emissions, soil, haul road dust, biomass burning, and two sources of mixed origin. The upgrader emissions, soil, and haul road dust sources were identified through both methodologies, and both methodologies identified a mixed source, but these exhibited more differences than similarities. The second upgrader emissions and biomass burning sources were only resolved by the hourly and filter methodologies, respectively. The similarity of the receptor modeling results from the two methodologies provided reassurance as to the identity of the sources. Overall, much of the PM2.5-related trace elements were found to be anthropogenic, or at least to be aerosolized through anthropogenic activities. These emissions may in part explain the previously reported higher levels of trace elements in snow, water, and biota samples collected
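    Positive matrix factorization, the receptor model used above, is closely related to non-negative matrix factorization; PMF additionally weights each data point by its measurement uncertainty. This tiny unweighted multiplicative-update sketch on synthetic data only illustrates the factorization idea (samples × species ≈ contributions × source profiles), not the study's EPA PMF analysis:

```python
def nmf(X, k, iters=2000, eps=1e-9):
    """Tiny multiplicative-update NMF: X (n x m) ~ W (n x k) @ H (k x m), all >= 0.
    W ~ per-sample source contributions, H ~ source profiles."""
    import random
    random.seed(0)
    n, m = len(X), len(X[0])
    W = [[random.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[random.random() + 0.1 for _ in range(m)] for _ in range(k)]

    def matmul(A, B):
        return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

    def transpose(A):
        return [list(r) for r in zip(*A)]

    for _ in range(iters):
        WH = matmul(W, H)
        WtX, WtWH = matmul(transpose(W), X), matmul(transpose(W), WH)
        H = [[H[i][j] * WtX[i][j] / (WtWH[i][j] + eps) for j in range(m)] for i in range(k)]
        WH = matmul(W, H)
        XHt, WHHt = matmul(X, transpose(H)), matmul(WH, transpose(H))
        W = [[W[i][j] * XHt[i][j] / (WHHt[i][j] + eps) for j in range(k)] for i in range(n)]
    return W, H

# Synthetic rank-1 "sample x species" concentration matrix.
X = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]]
W, H = nmf(X, k=1)
```

    In the study, k = 7 factors were retained and interpreted as the seven emission sources listed above.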

  6. Sources of particulate matter components in the Athabasca oil sands region: investigation through a comparison of trace element measurement methodologies

    Directory of Open Access Journals (Sweden)

    C. Phillips-Smith

    2017-08-01

    Full Text Available The province of Alberta, Canada, is home to three oil sands regions which, combined, contain the third largest deposit of oil in the world. Of these, the Athabasca oil sands region is the largest. As part of Environment and Climate Change Canada's program in support of the Joint Canada-Alberta Implementation Plan for Oil Sands Monitoring program, concentrations of trace elements in PM2.5 (particulate matter smaller than 2.5 µm in diameter) were measured through two campaigns that involved different methodologies: a long-term filter campaign and a short-term intensive campaign. In the long-term campaign, 24 h filter samples were collected once every 6 days over a 2-year period (December 2010–November 2012) at three air monitoring stations in the regional municipality of Wood Buffalo. For the intensive campaign (August 2013), hourly measurements were made with an online instrument at one air monitoring station; daily filter samples were also collected. The hourly and 24 h filter data were analyzed individually using positive matrix factorization. Seven emission sources of PM2.5 trace elements were thereby identified: two types of upgrader emissions, soil, haul road dust, biomass burning, and two sources of mixed origin. The upgrader emissions, soil, and haul road dust sources were identified through both methodologies, and both methodologies identified a mixed source, but these exhibited more differences than similarities. The second upgrader emissions and biomass burning sources were only resolved by the hourly and filter methodologies, respectively. The similarity of the receptor modeling results from the two methodologies provided reassurance as to the identity of the sources. Overall, much of the PM2.5-related trace elements were found to be anthropogenic, or at least to be aerosolized through anthropogenic activities. These emissions may in part explain the previously reported higher levels of trace elements in snow

  7. Efficient solution methodology for calibrating the hemodynamic model using functional Magnetic Resonance Imaging (fMRI) measurements

    KAUST Repository

    Zambri, Brian

    2015-11-05

    Our aim is to propose a numerical strategy for retrieving accurately and efficiently the biophysiological parameters as well as the external stimulus characteristics corresponding to the hemodynamic mathematical model that describes changes in blood flow and blood oxygenation during brain activation. The proposed method employs the TNM-CKF method developed in [1], but in a prediction/correction framework. We present numerical results using both real and synthetic functional Magnetic Resonance Imaging (fMRI) measurements to highlight the performance characteristics of this computational methodology. © 2015 IEEE.
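    The prediction/correction framework mentioned above follows the classic Kalman filter pattern. This scalar sketch shows only the generic predict/correct step on made-up numbers, not the authors' TNM-CKF algorithm or the hemodynamic model itself:

```python
def kalman_step(x, p, z, q, r_noise, a=1.0, h=1.0):
    """One scalar Kalman filter step.
    x, p: prior state estimate and variance; z: new measurement;
    q, r_noise: process and measurement noise variances;
    a, h: state-transition and observation coefficients."""
    # Prediction: propagate the state estimate and its uncertainty.
    x_pred = a * x
    p_pred = a * p * a + q
    # Correction: blend the prediction with the measurement via the Kalman gain.
    k = p_pred * h / (h * p_pred * h + r_noise)
    x_new = x_pred + k * (z - h * x_pred)
    p_new = (1.0 - k * h) * p_pred
    return x_new, p_new

# Hypothetical single update: prior estimate 0.0 with variance 1.0, measurement 1.0.
x_est, p_est = kalman_step(x=0.0, p=1.0, z=1.0, q=0.01, r_noise=1.0)
```

    In the fMRI calibration setting, the state vector holds the biophysiological parameters and the measurements are the fMRI time series; the same two-phase loop is iterated over the data.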

  8. Efficient solution methodology for calibrating the hemodynamic model using functional Magnetic Resonance Imaging (fMRI) measurements

    KAUST Repository

    Zambri, Brian; Djellouli, Rabia; Laleg-Kirati, Taous-Meriem

    2015-01-01

    Our aim is to propose a numerical strategy for retrieving accurately and efficiently the biophysiological parameters as well as the external stimulus characteristics corresponding to the hemodynamic mathematical model that describes changes in blood flow and blood oxygenation during brain activation. The proposed method employs the TNM-CKF method developed in [1], but in a prediction/correction framework. We present numerical results using both real and synthetic functional Magnetic Resonance Imaging (fMRI) measurements to highlight the performance characteristics of this computational methodology. © 2015 IEEE.

  9. Comparison of noise power spectrum methodologies in measurements by using various electronic portal imaging devices in radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Son, Soon Yong [Dept. of Radiological Technology, Wonkwang Health Science University, Iksan (Korea, Republic of); Choi, Kwan Woo [Dept. of Radiology, Asan Medical Center, Seoul (Korea, Republic of); Jeong, Hoi Woun [Dept. of Radiological Technology, Baekseok Culture University College, Cheonan (Korea, Republic of); Kwon, Kyung Tae [Dep. of Radiological Technology, Dongnam Health University, Suwon (Korea, Republic of); Kim, Ki Won [Dept. of Radiology, Kyung Hee University Hospital at Gang-dong, Seoul (Korea, Republic of); Lee, Young Ah; Son, Jin Hyun; Min, Jung Whan [Shingu University College, Sungnam (Korea, Republic of)

    2016-03-15

    The noise power spectrum (NPS) is one of the most general methods for measuring the noise amplitude and the quality of an image acquired from a uniform radiation field. The purpose of this study was to compare different NPS methodologies by using megavoltage X-ray energies. The NPS evaluation methods used in diagnostic radiology were applied to therapy imaging using the International Electrotechnical Commission standard (IEC 62220-1). Various radiation therapy (RT) devices, such as TrueBeam™ (Varian), BEAMVIEW PLUS (Siemens), iViewGT (Elekta) and Clinac iX (Varian), were used. In order to measure the region of interest (ROI) of the NPS, we used the following four factors: the overlapping impact, the non-overlapping impact, the flatness and the penumbra. As for the NPS results, iViewGT (Elekta) had the highest noise amplitude, compared to BEAMVIEW PLUS (Siemens), TrueBeam™ (Varian) with flattening filter, Clinac iX aS1000 (Varian) and TrueBeam™ (Varian) flattening filter free. The present study revealed that various factors could be employed to produce megavoltage imaging (MVI) of the NPS and as a baseline standard for NPS methodology control in MVI.
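    A one-dimensional sketch of the NPS computation underlying these comparisons. IEC 62220-1 prescribes averaging |FFT|² over many half-overlapping 2D ROIs of a detrended flat-field image; this pared-down version shows the idea on a single line of pixels, with synthetic sample values:

```python
import cmath
import math

def nps_1d(roi_line, pixel_spacing):
    """Simplified 1D noise power spectrum of a detrended pixel line:
    NPS(f_k) = (pixel_spacing / N) * |DFT(I - mean)|^2."""
    n = len(roi_line)
    mean = sum(roi_line) / n
    dev = [v - mean for v in roi_line]  # detrend: remove the uniform-field mean
    out = []
    for k in range(n):
        s = sum(dev[x] * cmath.exp(-2j * cmath.pi * k * x / n) for x in range(n))
        out.append(pixel_spacing / n * abs(s) ** 2)
    return out

flat = nps_1d([5.0] * 8, pixel_spacing=0.25)  # perfectly uniform field: no noise power
tone = nps_1d([math.sin(2 * math.pi * 2 * x / 8) for x in range(8)], pixel_spacing=0.25)
```

    The single-frequency `tone` input concentrates all its power at spatial-frequency bin 2 (and its mirror), which is the behaviour a correct NPS implementation should reproduce before being applied to real flat-field MVI data.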

  10. Guidelines for measuring the physical, chemical, and biological condition of wilderness ecosystems

    Science.gov (United States)

    Douglas G Fox; J. Christopher Bernabo; Betsy Hood

    1987-01-01

    Guidelines include a large number of specific measures to characterize the existing condition of wilderness resources. Measures involve the atmospheric environment, water chemistry and biology, geology and soils, and flora. Where possible, measures are coordinated with existing long-term monitoring programs. Application of the measures will allow more effective...

  11. Optimization of the production conditions of the lipase produced by Bacillus cereus from rice flour through Plackett-Burman Design (PBD) and response surface methodology (RSM).

    Science.gov (United States)

    Vasiee, Alireza; Behbahani, Behrooz Alizadeh; Yazdi, Farideh Tabatabaei; Moradi, Samira

    2016-12-01

    In this study, the screening of lipase-positive bacteria from rice flour was carried out by the Rhodamine B agar plate method. Bacillus cereus was identified by the 16S rDNA method. Screening of the appropriate variables and optimization of lipase production were performed using a Plackett-Burman design (PBD) and response surface methodology (RSM). Among the isolated bacteria, an aerobic Bacillus cereus strain was recognized as the best lipase-producing bacterium (177.3 ± 20 U/mL). Given the results, the optimal enzyme production conditions were achieved with a coriander seed extract (CSE)/yeast extract ratio of 16.9 w/w, an olive oil (OO) concentration of 2.37 g/L and an MgCl2 concentration of 24.23 mM. Under these conditions, the predicted lipase activity (LA) was 343 U/mL, which was approximately close to the experimental value (324 U/mL), a 1.83-fold increase in LA compared with the non-optimized lipase. The kinetic parameters V_max and K_m of the lipase were measured as 0.367 μM/(min·mL) and 5.3 mM, respectively. The lipase-producing Bacillus cereus was isolated and RSM was used for the optimization of enzyme production. The CSE/yeast extract ratio of 16.9 w/w, OO concentration of 2.37 g/L and MgCl2 concentration of 24.23 mM were found to be the optimal conditions of the enzyme production process. LA under the optimal enzyme production conditions was 1.83 times higher than under non-optimal conditions. Ultimately, it can be concluded that the B. cereus isolated from rice flour is a proper source of lipase. Copyright © 2016 Elsevier Ltd. All rights reserved.
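    The reported V_max and K_m define the Michaelis-Menten rate law; a quick check, using the units quoted in the abstract, that the rate at a substrate concentration equal to K_m is half of V_max:

```python
def mm_rate(s, v_max, k_m):
    """Michaelis-Menten initial rate: v = V_max * [S] / (K_m + [S])."""
    return v_max * s / (k_m + s)

# Reported parameters: V_max = 0.367 uM/(min*mL), K_m = 5.3 mM.
v_at_km = mm_rate(5.3, v_max=0.367, k_m=5.3)  # at [S] = K_m, v = V_max / 2
```

    The same function, fitted to initial-rate measurements at several substrate concentrations, is how V_max and K_m are typically estimated.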

  12. A new metric for measuring condition in large predatory sharks.

    Science.gov (United States)

    Irschick, D J; Hammerschlag, N

    2014-09-01

    A simple metric (span condition analysis; SCA) is presented for quantifying the condition of sharks based on four measurements of body girth relative to body length. Data on 104 live sharks from four species that vary in body form, behaviour and habitat use (Carcharhinus leucas, Carcharhinus limbatus, Ginglymostoma cirratum and Galeocerdo cuvier) are given. Condition shows similar levels of variability among individuals within each species. Carcharhinus leucas showed a positive relationship between condition and body size, whereas the other three species showed no relationship. There was little evidence for strong differences in condition between males and females, although more male sharks are needed for some species (e.g. G. cuvier) to verify this finding. SCA is potentially viable for other large marine or terrestrial animals that are captured live and then released. © 2014 The Fisheries Society of the British Isles.

  13. Measuring the impact of methodological research: a framework and methods to identify evidence of impact.

    Science.gov (United States)

    Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F

    2014-11-27

    Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis; however, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project, and results were pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; further detailed analysis revealed more about the potential influence of these citations. Interviews spanning 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations; and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally. We have used the information

  14. Technical Report on Preliminary Methodology for Enhancing Risk Monitors with Integrated Equipment Condition Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Ramuhalli, Pradeep; Coles, Garill A.; Coble, Jamie B.; Hirt, Evelyn H.

    2013-09-17

    Small modular reactors (SMRs) generally include reactors with electric output of ~350 MWe or less (this cutoff varies somewhat but is substantially less than the full-size plant output of 700 MWe or more). Advanced SMRs (AdvSMRs) refer to a specific class of SMRs based on modularization of advanced reactor concepts. AdvSMRs may provide a longer-term alternative to traditional light-water reactors (LWRs) and to SMRs based on the integral pressurized water reactor concepts currently being considered. Enhancing the affordability of AdvSMRs will be critical to ensuring wider deployment, as AdvSMRs suffer from the loss of economies of scale inherent in small reactors when compared to large (greater than ~600 MWe output) reactors. Some of this loss can be recovered through reduced capital costs enabled by smaller size, fewer components, modular fabrication processes, and the opportunity for modular construction. However, the controllable day-to-day costs of AdvSMRs will be dominated by operation and maintenance (O&M) costs, and technologies that help characterize real-time risk are important for controlling them. Risk monitors are used in current nuclear power plants to provide a point-in-time estimate of system risk given the current plant configuration (e.g., equipment availability, operational regime, and environmental conditions). However, current risk monitors cannot support the capability requirements listed above, as they do not take into account plant-specific normal, abnormal, and deteriorating states of active components and systems. This report documents technology developments that are a step towards enhanced risk monitors that, if integrated with supervisory plant control systems, can provide those capabilities and meet the goal of controlling O&M costs. The report describes research results from an initial methodology for enhanced risk monitors that integrates real-time information about equipment condition and probability of failure (POF) into risk monitors.

  15. Optimizing the conditions for the microwave-assisted direct liquefaction of Ulva prolifera for bio-oil production using response surface methodology

    International Nuclear Information System (INIS)

    Liu, Junhai; Zhuang, Yingbin; Li, Yan; Chen, Limei; Guo, Jingxue; Li, Demao; Ye, Naihao

    2013-01-01

    Microwave-assisted direct liquefaction (MADL) of Ulva prolifera was performed in ethylene glycol (EG) using sulfuric acid (H2SO4) as a catalyst. Response surface methodology (RSM) based on a central composite rotatable design (CCRD) was employed to optimize three independent variables (catalyst content, solvent-to-feedstock ratio and temperature) for liquefaction yield. The bio-oil was analyzed by elemental analysis, Fourier transform infrared spectroscopy (FT-IR) and gas chromatography–mass spectrometry (GC–MS). The maximum liquefaction yield was 93.17%, obtained under a microwave power of 600 W for 30 min at 165 °C with a solvent-to-feedstock ratio of 18.87:1 and 4.93% sulfuric acid. The bio-oil was mainly composed of phthalic acid esters, alkenes and fatty acid methyl esters with long chains from C16 to C20. - Highlights: • Ulva prolifera was converted to bio-oil through microwave-assisted direct liquefaction. • Response surface methodology was used to optimize the liquefaction technology. • A maximum liquefaction rate of 93.17 wt% bio-oil was obtained. • The bio-oil was composed of carboxylic acids and esters
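
    The essence of the RSM step is fitting a second-order model to the design data and locating its stationary point. A one-variable sketch with invented yield-versus-temperature numbers (the paper's three-variable CCRD is the multivariate analogue of this):

```python
# Hypothetical yields vs. liquefaction temperature (other factors fixed);
# illustrative numbers only, not the paper's data.
temps = [145.0, 155.0, 165.0, 175.0, 185.0]
yields = [82.0, 89.5, 93.2, 90.1, 83.0]

# Least-squares quadratic fit y = a*x^2 + b*x + c using centered design
# points (x = t - 165), which makes the columns orthogonal and the
# algebra exact without a linear-algebra library.
xs = [t - 165.0 for t in temps]
n = len(xs)
sx2 = sum(x * x for x in xs)
b = sum(x * y for x, y in zip(xs, yields)) / sx2
m2 = sx2 / n
d = [x * x - m2 for x in xs]
a = sum(di * y for di, y in zip(d, yields)) / sum(di * di for di in d)

# Stationary point of the fitted parabola -> candidate optimum temperature.
t_opt = 165.0 - b / (2 * a)
print(round(t_opt, 1))  # 165.5
```

    In the full three-variable case the same idea applies: fit all linear, interaction and squared terms, then solve for the stationary point of the fitted surface.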

  16. Methodological issues in systematic reviews of headache trials: adapting historical diagnostic classifications and outcome measures to present-day standards.

    Science.gov (United States)

    McCrory, Douglas C; Gray, Rebecca N; Tfelt-Hansen, Peer; Steiner, Timothy J; Taylor, Frederick R

    2005-05-01

    Recent efforts to make headache diagnostic classification and clinical trial methodology more consistent provide valuable advice to trialists generating new evidence on effectiveness of treatments for headache; however, interpreting older trials that do not conform to new standards remains problematic. Systematic reviewers seeking to utilize historical data can adapt currently recommended diagnostic classification and clinical trial methodological approaches to interpret all available data relative to current standards. In evaluating study populations, systematic reviewers can: (i) use available data to attempt to map study populations to diagnoses in the new International Classification of Headache Disorders; and (ii) stratify analyses based on the extent to which study populations are precisely specified. In evaluating outcome measures, systematic reviewers can: (i) summarize prevention studies using headache frequency, incorporating headache index in a stratified analysis if headache frequency is not available; (ii) summarize acute treatment studies using pain-free response as reported in directly measured headache improvement or headache severity outcomes; and (iii) avoid analysis of recurrence or relapse data not conforming to the sustained pain-free response definition.

  17. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take advantage of these technological developments is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are considerable, it has attracted less interest from an industrial and regulatory viewpoint. The authors expect this survey to contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, the discrete dynamic event tree (DDET) seems to be a backbone for most methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering
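
    As a baseline for the dynamic extensions discussed here, conventional static FT quantification reduces to combining independent basic-event probabilities through Boolean gate logic. A minimal sketch (the gate structure and probabilities are invented for illustration, not taken from the survey):

```python
from itertools import product

# Toy static fault tree: top event T = (A OR B) AND C, with independent
# basic events and assumed (illustrative) failure probabilities.
p = {"A": 0.01, "B": 0.02, "C": 0.001}

def top_event(a, b, c):
    return (a or b) and c

# Exact top-event probability by enumerating all basic-event states.
prob = 0.0
for a, b, c in product([True, False], repeat=3):
    weight = 1.0
    for name, state in zip("ABC", (a, b, c)):
        weight *= p[name] if state else 1 - p[name]
    if top_event(a, b, c):
        prob += weight

print(round(prob, 8))  # matches (1 - 0.99*0.98) * 0.001
```

    A dynamic PSA replaces the fixed probabilities above with time- and condition-dependent ones and branches the scenario tree as the plant state evolves, which is what the DDET-style methods surveyed here automate.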

  18. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take advantage of these technological developments is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are considerable, it has attracted less interest from an industrial and regulatory viewpoint. The authors expect this survey to contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, the discrete dynamic event tree (DDET) seems to be a backbone for most methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  19. CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL

    Energy Technology Data Exchange (ETDEWEB)

    Vinson, D.

    2010-07-11

    Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements of Title 10 of the U.S. Code of Federal Regulations, Part 71 (10 CFR 71). The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF, even with severe cladding breaches, for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U.S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10 CFR 71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and
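
    The allowable release rates referred to here come from 10 CFR 71.51 as implemented through the ANSI N14.5 containment methodology: roughly, A2 × 10⁻⁶ per hour for normal conditions of transport and A2 per week for hypothetical accident conditions. A minimal sketch of those two limits (an illustrative helper, not the report's actual analysis):

```python
def allowable_release_rate(a2_tbq, accident=False):
    """Allowable activity release rate in TBq/s (sketch).

    Based on the 10 CFR 71.51 containment criteria used by ANSI N14.5:
      normal transport:      A2 x 1e-6 per hour
      hypothetical accident: A2 per week
    a2_tbq is the A2 value of the cask contents in TBq (illustrative input).
    """
    if accident:
        return a2_tbq / (7 * 24 * 3600)   # A2 per week, in seconds
    return a2_tbq * 1e-6 / 3600           # A2 x 1e-6 per hour, in seconds

# Example: contents with A2 = 1 TBq.
normal = allowable_release_rate(1.0)
accident = allowable_release_rate(1.0, accident=True)
```

    Comparing a measured or bounding cask leak rate against these limits is the core of the containment demonstration described in the record above.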

  20. Social cognition interventions for people with schizophrenia: a systematic review focussing on methodological quality and intervention modality.

    Science.gov (United States)

    Grant, Nina; Lawrence, Megan; Preti, Antonio; Wykes, Til; Cella, Matteo

    2017-08-01

    People with a diagnosis of schizophrenia have significant social and functional difficulties. Social cognition has been found to influence these outcomes, and in recent years interventions targeting this domain have been developed. This paper reviews the existing literature on social cognition interventions for people with a diagnosis of schizophrenia, focussing on: i) comparing focussed (i.e. targeting only one social cognitive domain) and global interventions; and ii) the methodological quality of the studies. A systematic search was conducted on PubMed and PsycInfo. Studies were included if they were randomised controlled trials, participants had a diagnosis of schizophrenia or schizoaffective disorder, and the intervention targeted at least one of four social cognition domains (i.e. theory of mind, affect recognition, social perception and attribution bias). All papers were assessed for methodological quality. Information on the intervention, control condition, study methodology and the main findings of each study was extracted and critically summarised. Data from 32 studies fulfilled the inclusion criteria, considering a total of 1440 participants. Taking part in social cognition interventions produced significant improvements in theory of mind and affect recognition compared to both passive and active control conditions. Results were less clear for social perception and attributional bias. Focussed and global interventions had similar results on outcomes. Overall study methodological quality was modest. There was very limited evidence that social cognition interventions result in functional outcome improvement. The evidence considered suggests that social cognition interventions may be a valuable approach for people with a diagnosis of schizophrenia. However, evidence quality is limited by measure heterogeneity, modest study methodology and short follow-up periods. The findings point to a number of recommendations for future research, including measurement standardisation

  1. Conditioning a segmented stem profile model for two diameter measurements

    Science.gov (United States)

    Raymond L. Czaplewski; Joe P. Mcclure

    1988-01-01

    The stem profile model of Max and Burkhart (1976) is conditioned for dbh and a second upper stem measurement. This model was applied to a loblolly pine data set using diameter outside bark at 5.3m (i.e., height of 17.3 foot Girard form class) as the second upper stem measurement, and then compared to the original, unconditioned model. Variance of residuals was reduced...
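
    For context, the Max and Burkhart (1976) segmented model is commonly written as a quadratic-segment function of relative height z = h/H, with join points a1 and a2 and indicator terms that switch segments on and off. A sketch with hypothetical coefficients (the paper's conditioned, fitted values are not reproduced here):

```python
def max_burkhart_taper(h, H, D, b, a):
    """Segmented stem profile of Max and Burkhart (1976), sketched.

    (d/D)^2 = b1*(z-1) + b2*(z^2-1) + b3*(a1-z)^2*I1 + b4*(a2-z)^2*I2,
    where z = h/H and Ii = 1 when z <= ai, else 0. Returns the predicted
    diameter d at height h for a tree of total height H and dbh D.
    Coefficients b = (b1..b4) and join points a = (a1, a2) used below
    are hypothetical; real values come from fitting to taper data.
    """
    z = h / H
    b1, b2, b3, b4 = b
    a1, a2 = a
    i1 = 1.0 if z <= a1 else 0.0
    i2 = 1.0 if z <= a2 else 0.0
    ratio_sq = (b1 * (z - 1) + b2 * (z ** 2 - 1)
                + b3 * (a1 - z) ** 2 * i1 + b4 * (a2 - z) ** 2 * i2)
    return D * max(ratio_sq, 0.0) ** 0.5

# Hypothetical tree: H = 30 m, dbh D = 25 cm; diameter near breast height.
d_bh = max_burkhart_taper(1.3, 30.0, 25.0, (-3.0, 1.5, -1.7, 30.0), (0.75, 0.1))
```

    Conditioning the model on dbh plus a second upper-stem diameter, as in this record, amounts to constraining the fitted coefficients so the profile passes through both measurements.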

  2. Development of Assessment Methodology of Chemical Behavior of Volatile Iodide under Severe Accident Conditions Using EPICUR Experiments

    International Nuclear Information System (INIS)

    Oh, Jae Yong; Yun, Jong Il; Kim, Do Sam; Han Chul

    2011-01-01

    Iodine is one of the most important fission products produced in nuclear power plants. Under severe accident conditions, iodine exists as a variety of species in the containment, such as aqueous iodide, gaseous iodide and iodide aerosol. Following release of iodine from the reactor, mostly in the form of CsI aerosol, volatile iodine can be generated in the containment sump and released to the environment. In particular, volatile organic iodide can be produced from interaction between nonvolatile iodine and organic substances present in the containment. Volatile iodide could significantly affect residents in the vicinity of the nuclear power plant; the thyroid is especially vulnerable to radioiodine due to its high accumulation. Therefore, it is necessary for the Korea Institute of Nuclear Safety (KINS) to develop an evaluation model which can simulate iodine behavior in the containment following a severe accident. KINS also needs to complement its methodology for radiological consequence analysis, based on MELCOR-MACCS2 calculation, by coupling a simple iodine model which can conveniently deal with organic iodides. In the long term, such a model can contribute to the development of an accident source term, one of the urgent domestic needs. Our strategy for developing the model is as follows: 1. Review the existing methodologies; 2. Develop a simple stand-alone model; 3. Validate the model using ISTP-EPICUR (Experimental Program on Iodine Chemistry under Radiation) and OECD-BIP (Behavior of Iodine Project) experimental data. In this paper we present the context of development and validation of our model, named RAIM (Radio-active iodine chemistry model).

  3. Development of Assessment Methodology of Chemical Behavior of Volatile Iodide under Severe Accident Conditions Using EPICUR Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jae Yong; Yun, Jong Il [KAIST, Daejeon (Korea, Republic of); Kim, Do Sam; Han Chul [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2011-05-15

    Iodine is one of the most important fission products produced in nuclear power plants. Under severe accident conditions, iodine exists as a variety of species in the containment, such as aqueous iodide, gaseous iodide and iodide aerosol. Following release of iodine from the reactor, mostly in the form of CsI aerosol, volatile iodine can be generated in the containment sump and released to the environment. In particular, volatile organic iodide can be produced from interaction between nonvolatile iodine and organic substances present in the containment. Volatile iodide could significantly affect residents in the vicinity of the nuclear power plant; the thyroid is especially vulnerable to radioiodine due to its high accumulation. Therefore, it is necessary for the Korea Institute of Nuclear Safety (KINS) to develop an evaluation model which can simulate iodine behavior in the containment following a severe accident. KINS also needs to complement its methodology for radiological consequence analysis, based on MELCOR-MACCS2 calculation, by coupling a simple iodine model which can conveniently deal with organic iodides. In the long term, such a model can contribute to the development of an accident source term, one of the urgent domestic needs. Our strategy for developing the model is as follows: 1. Review the existing methodologies; 2. Develop a simple stand-alone model; 3. Validate the model using ISTP-EPICUR (Experimental Program on Iodine Chemistry under Radiation) and OECD-BIP (Behavior of Iodine Project) experimental data. In this paper we present the context of development and validation of our model, named RAIM (Radio-active iodine chemistry model).

  4. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the relative hazard (RH) ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation)

  5. Measuring intracellular redox conditions using GFP-based sensors

    DEFF Research Database (Denmark)

    Björnberg, Olof; Ostergaard, Henrik; Winther, Jakob R

    2006-01-01

    Recent years have seen the development of methods for analyzing the redox conditions in specific compartments in living cells. These methods are based on genetically encoded sensors comprising variants of Green Fluorescent Protein in which vicinal cysteine residues have been introduced at solvent-exposed positions. Several mutant forms have been identified in which formation of a disulfide bond between these cysteine residues results in changes of their fluorescence properties. The redox sensors have been characterized biochemically and found to behave differently, both spectroscopically and in terms of redox properties. As genetically encoded sensors they can be expressed in living cells and used for analysis of intracellular redox conditions; however, which parameters are measured depends on how the sensors interact with various cellular redox components. Results of both biochemical and cell

  6. Methodology for dynamic biaxial tension testing of pregnant uterine tissue.

    Science.gov (United States)

    Manoogian, Sarah; Mcnally, Craig; Calloway, Britt; Duma, Stefan

    2007-01-01

    Placental abruption accounts for 50% to 70% of fetal losses in motor vehicle crashes. Since automobile crashes are the leading cause of traumatic fetal injury mortality in the United States, research on this injury mechanism is important. Before research can adequately evaluate current and future restraint designs, a detailed model of the pregnant uterine tissues is necessary. The purpose of this study is to develop a methodology for testing the pregnant uterus in biaxial tension at a rate normally seen in a motor vehicle crash. Since the majority of previous biaxial work has established methods for quasi-static testing, this paper combines previous research and new methods in a custom-designed system to strain the tissue at a dynamic rate. Load cells and optical markers are used to calculate stress-strain curves for the perpendicular loading axes. Results for this methodology include images of a loaded tissue specimen and a finite element verification of the optical strain measurement. The biaxial test system dynamically pulls the tissue to failure with synchronous motion of four tissue grips that are rigidly coupled to the tissue specimen. The test device models in situ loading conditions of the pregnant uterus and overcomes previous limitations of biaxial testing. A non-contact method of measuring strains, combined with data reduction to resolve the stresses in two directions, provides the information necessary to develop a three-dimensional constitutive model of the material. Moreover, future research can apply this method to other soft tissues with similar in situ loading conditions.
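
    Per loading axis, the load-cell and optical-marker outputs reduce to engineering stress and strain in the usual way. A minimal sketch (the geometry and forces below are illustrative, not the study's data):

```python
def stress_strain(force_n, cross_section_mm2, marker_dist_mm, marker_dist0_mm):
    """Engineering stress (MPa) and strain for one loading axis.

    Mirrors the abstract's setup in spirit: a load cell supplies force,
    optical markers supply a non-contact gauge length. Inputs here are
    illustrative placeholder numbers.
    """
    stress = force_n / cross_section_mm2            # N/mm^2 == MPa
    strain = (marker_dist_mm - marker_dist0_mm) / marker_dist0_mm
    return stress, strain

# Example axis: 12 N over 40 mm^2, markers stretched from 10 mm to 11.2 mm.
s, e = stress_strain(12.0, 40.0, 11.2, 10.0)
print(round(s, 2), round(e, 2))  # 0.3 MPa at 0.12 strain
```

    Doing this simultaneously on both perpendicular axes yields the biaxial stress-strain curves the abstract describes.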

  7. Combining tracer flux ratio methodology with low-flying aircraft measurements to estimate dairy farm CH4 emissions

    Science.gov (United States)

    Daube, C.; Conley, S.; Faloona, I. C.; Yacovitch, T. I.; Roscioli, J. R.; Morris, M.; Curry, J.; Arndt, C.; Herndon, S. C.

    2017-12-01

    Livestock activity, enteric fermentation of feed and anaerobic digestion of waste, contributes significantly to the methane budget of the United States (EPA, 2016). Studies question the reported magnitude of these methane sources (Miller et al., 2013), calling for more detailed research on agricultural animals (Hristov, 2014). Tracer flux ratio is an attractive experimental method to bring to this problem because it does not rely on estimates of atmospheric dispersion. Collection of data occurred during one week at two dairy farms in central California (June 2016). Each farm varied in size, layout, head count, and general operation. The tracer flux ratio method involves releasing ethane on-site with a known flow rate to serve as a tracer gas. Downwind mixed enhancements in ethane (from the tracer) and methane (from the dairy) were measured, and their ratio used to infer the unknown methane emission rate from the farm. An instrumented van drove transects downwind of each farm on public roads while tracer gases were released on-site, employing the tracer flux ratio methodology to assess simultaneous methane and tracer gas plumes. Flying circles around each farm, a small instrumented aircraft made measurements to perform a mass balance evaluation of methane gas. In the course of these two different methane quantification techniques, we were able to validate yet a third method: tracer flux ratio measured via aircraft. Ground-based tracer release rates were applied to the aircraft-observed methane-to-ethane ratios, yielding whole-site methane emission rates. Never before has the tracer flux ratio method been executed with aircraft measurements. Estimates from this new application closely resemble results from the standard ground-based technique to within their respective uncertainties. Incorporating this new dimension to the tracer flux ratio methodology provides additional context for local plume dynamics and validation of both ground and flight-based data.
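
    The core arithmetic of the tracer flux ratio method follows directly from the description above: dispersion affects both gases equally, so it cancels in the enhancement ratio, leaving the ratio of emission rates. A minimal sketch with illustrative numbers (not the study's data):

```python
def tracer_flux_ratio(ch4_enh_ppb, c2h6_enh_ppb, tracer_release_kg_per_h,
                      mw_ch4=16.04, mw_c2h6=30.07):
    """Infer a CH4 emission rate (kg/h) from a co-released ethane tracer.

    Downwind, the CH4-to-ethane molar enhancement ratio equals the ratio
    of their emission rates, so atmospheric dispersion drops out.
    """
    molar_ratio = ch4_enh_ppb / c2h6_enh_ppb
    # Tracer mass flow -> molar flow, scale by the observed molar ratio,
    # then convert back to a CH4 mass flow.
    return molar_ratio * tracer_release_kg_per_h / mw_c2h6 * mw_ch4

# Example: 50 ppb CH4 enhancement vs 2 ppb ethane, 1.5 kg/h tracer release.
rate = tracer_flux_ratio(50.0, 2.0, 1.5)
print(round(rate, 2))  # kg CH4 per hour
```

    The aircraft variant described above applies the same formula, substituting flight-measured methane-to-ethane ratios for the van-transect ratios.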

  8. Tolerance limits and methodologies for IMRT measurement-based verification QA: Recommendations of AAPM Task Group No. 218.

    Science.gov (United States)

    Miften, Moyed; Olch, Arthur; Mihailidis, Dimitris; Moran, Jean; Pawlicki, Todd; Molineu, Andrea; Li, Harold; Wijesooriya, Krishni; Shi, Jie; Xia, Ping; Papanikolaou, Nikos; Low, Daniel A

    2018-04-01

    Patient-specific IMRT QA measurements are important components of processes designed to identify discrepancies between calculated and delivered radiation doses. Discrepancy tolerance limits are neither well defined nor consistently applied across centers. The AAPM TG-218 report provides a comprehensive review aimed at improving the understanding and consistency of these processes, as well as recommendations for methodologies and tolerance limits in patient-specific IMRT QA. The performance of the dose difference/distance-to-agreement (DTA) and γ dose distribution comparison metrics is investigated. Measurement methods are reviewed, followed by a discussion of the pros and cons of each. Methodologies for absolute dose verification are discussed and new IMRT QA verification tools are presented. Literature on the expected or achievable agreement between measurements and calculations for different types of planning and delivery systems is reviewed and analyzed. Tests of vendor implementations of the γ verification algorithm employing benchmark cases are presented. Operational shortcomings that can reduce the γ tool accuracy and subsequent effectiveness for IMRT QA are described. Practical considerations including spatial resolution, normalization, dose threshold, and data interpretation are discussed. Published data on IMRT QA and the clinical experience of the group members are used to develop guidelines and recommendations on tolerance and action limits for IMRT QA. Steps to check failed IMRT QA plans are outlined. Recommendations on delivery methods, data interpretation, dose normalization, the use of γ analysis routines and the choice of tolerance limits for IMRT QA are made, with a focus on detecting differences between calculated and measured doses via robust analysis methods and an in-depth understanding of IMRT verification metrics. The recommendations are intended to improve the IMRT QA process and establish consistent and comparable IMRT QA
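
    The γ metric combines the dose-difference and DTA criteria into a single quantity: a point passes when the minimum combined distance in dose/space falls at or below 1. A toy 1-D sketch of the idea (simplified: global normalization, discrete search, invented profile values; real implementations interpolate and work on 2-D/3-D grids):

```python
import math

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose,
             dose_tol=0.03, dist_tol_mm=3.0):
    """1-D gamma index for a single reference point (toy sketch).

    gamma = min over evaluated points of
        sqrt((dose_diff/dose_tol)^2 + (dist/dist_tol)^2),
    with doses expressed as fractions of the maximum reference dose
    (a simplified, global-normalization version of the TG-218 metric).
    """
    best = float("inf")
    for x, d in zip(eval_pos, eval_dose):
        dd = (d - ref_dose) / dose_tol
        dx = (x - ref_pos) / dist_tol_mm
        best = min(best, math.hypot(dd, dx))
    return best

# Toy evaluated profile (fractions of max dose); positions in mm.
eval_pos = [0.0, 1.0, 2.0, 3.0]
eval_dose = [1.00, 0.98, 0.95, 0.90]
g = gamma_1d(1.0, 1.00, eval_pos, eval_dose)
print(round(g, 3))  # <= 1 means this point passes a 3%/3 mm criterion
```

    A plan-level pass rate is then the fraction of reference points with γ ≤ 1, which is what tolerance and action limits are set against.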

  9. The professional methodological teaching performance of the professor of Physical education. Set of parameters for its measurement

    Directory of Open Access Journals (Sweden)

    Orlando Pedro Suárez Pérez

    2017-07-01

    Full Text Available This work arose from the need to address the difficulties observed among the Physical Education teachers of the municipality of San Juan y Martínez during the teaching-learning process of basketball, difficulties which threaten the quality of classes, sports results and the preparation of students for life. The objective is to propose parameters that allow measuring the professional methodological teaching performance of these teachers. The personalized approach of the research made possible the diagnosis of the 26 teachers taken as a sample, identifying the traits that distinguish their efficiency and determining their potentialities and deficiencies. During the research process, theoretical, empirical and statistical methods were used, which made it possible to corroborate the real existence of the problem and to evaluate its impact, revealing a positive transformation in pedagogical practice. The results provide a concrete and viable answer for improving the evaluation of the teaching-methodological component of the Physical Education teacher, and constitute an important guidance material for methodologists and managers concerned with cognitive, procedural and attitudinal performance, in order to build new knowledge from prior knowledge and lead a formative process with a contemporary vision, offering methodological resources to control the quality of Physical Education lessons.

  10. Code coverage measurement methodology for MMI software of safety-class I and C system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Eun Hyung; Jung, Beom Young; Choi, Seok Joo [Suresofttech, Seoul (Korea, Republic of)

    2016-10-15

    The MMI (Man-Machine Interface) software of the safety instrumentation and control systems used in nuclear power plants carries out important functions, such as displaying safety-related information, transmitting commands to other systems, and changing setpoints. As the role that the reliability of MMI software plays in operating nuclear power plants has come to be recognized, regulatory standards have been strengthened accordingly. This strengthening extends to software testing, and current regulations require the measurement of code coverage to legal standards. This paper identifies problems with the conventional method of measuring code coverage, such as its limitations and low efficiency when applied to the MMI software used in nuclear power instrumentation and control systems, and proposes a new coverage measurement method to solve them. Applying the proposed top-down approach mitigates the problems of the existing coverage measurement methods and makes it possible to achieve the desired coverage objectives. Further cases are still necessary, and the methodology should be systematized on the basis of such cases. If its efficiency and reliability are confirmed through application in many cases, the method may be used to ensure code coverage of software not only in nuclear power instrumentation and control, but in many other areas where a GUI is utilized.

  11. Improved optimum condition for recovery and measurement of 210 ...

    African Journals Online (AJOL)

    The aim of this study was to determine the optimum conditions for deposition of 210Po and to evaluate the accuracy and precision of the results for its determination in environmental samples, in order to improve the technique for measurement of polonium-210 (210Po). The optimization of five factors (volume ...

  12. Methodology to assess coastal infrastructure resilience to climate change

    Directory of Open Access Journals (Sweden)

    Roca Marta

    2016-01-01

    In order to improve the resilience of the line, several options have been considered to evaluate and reduce the impacts of climate change on the railway. This paper describes the methodological approach developed to evaluate the risks of flooding for a range of scenarios in the estuary and open-coast reaches of the line. The components used to derive present-day and future coastal conditions under climate change, including possible adaptation measures, are presented, together with the results of a hindcasting analysis used to assess the performance of the modelling system. An overview of the modelling results obtained to support the development of a long-term Resilience Strategy for asset management is also discussed.

  13. Mo(ve)ment-methodology

    DEFF Research Database (Denmark)

    Mørck, Line Lerche; Christian Celosse-Andersen, Martin

    2018-01-01

    This paper describes the theoretical basis for and development of a moment-movement research methodology, based on an integration of critical psychological practice research and critical ethnographic social practice theory. Central theoretical conceptualizations, such as human agency, life conditions and identity formation, are discussed in relation to criminological theories of gang desistance. The paper illustrates how the mo(ve)ment methodology was applied in a study of comprehensive processes of identity (re)formation and gang exit, conducted with Martin, a former gang member. One moment captures Martin's complex and ambiguous feelings of conflictual concerns, frustration, anger, and a new feeling of insecurity in his masculinity, as well as engagement and a sense of deep meaningfulness as he becomes a more reflective academic. All these conflicting feelings also give ...

  14. The application of conditioning paradigms in the measurement of pain.

    Science.gov (United States)

    Li, Jun-Xu

    2013-09-15

    Pain is a private experience that involves both sensory and emotional components. In animal studies, pain can only be inferred from responses, and the measurement of reflexive responses has therefore dominated the pain literature for nearly a century. It has been argued that although reflexive responses are important for unveiling the sensory nature of pain in organisms, pain affect is equally important but largely ignored in pain studies, primarily due to the lack of validated animal models. One strategy to begin to understand pain affect is to use conditioning principles to indirectly reveal the affective component of pain. This review critically analyzes several procedures that are thought to measure affective learning of pain. Current knowledge of these procedures, their applications, and their advantages and disadvantages in pain research are discussed. It is proposed that these procedures should be combined with traditional reflex-based pain measurements in future studies of pain, which could greatly benefit both the understanding of the neural underpinnings of pain and the preclinical assessment of novel analgesics. © 2013 Elsevier B.V. All rights reserved.

  15. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and the software development lifecycle. The ISDF methodology was built especially for innovative software development projects and was developed empirically, by trial and error, in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user, project owner and project manager's points of view. The main components of a software development methodology are identified: a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation-oriented software development methodology is emphasized by highlighting the shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application, with the ALHPA application used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component, a dedicated indicator is built, and a template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  16. Adaptation of EVIAVE methodology for monitoring and follow-up when evaluating the environmental impact of landfills

    International Nuclear Information System (INIS)

    Arrieta, Gabriela; Requena, Ignacio; Toro, Javier; Zamorano, Montserrat

    2016-01-01

    Treatment and final disposal of Municipal Solid Waste can play a significant role in the generation of negative environmental impacts. As a prevention strategy, such activities are subjected to the process of Environmental Impact Assessment (EIA). Still, the follow-up of Environmental Management Plans and mitigation measures is limited, partly due to a lack of methodological approaches. To address this, the University of Granada (Spain) developed a diagnostic methodology named EVIAVE, which quantifies, by means of indexes, the environmental impact of landfills in view of their location and conditions of exploitation. EVIAVE is applicable within the legal framework of the European Union and can be adapted to the environmental and legal conditions of other countries. This study entails its adaptation in Colombia for the follow-up and control of the EIA process for landfills. The modifications involved inclusion of the environmental elements flora and fauna, and evaluation of the environmental descriptors in agreement with the concept of vulnerability. Application of the modified EVIAVE in Colombian landfills allowed us to identify the elements affected by operating and maintenance conditions. It may be concluded that this methodology is viable and effective for the follow-up and environmental control of EIA processes for landfills, and for analyzing the associated risks, as it takes into account related environmental threats and vulnerabilities. - Highlights: • A modified methodology is used to monitor and follow up environmental impacts in landfills. • The improved methodology includes the vulnerability of flora and fauna to evaluate the environmental impact of landfills. • The methodology serves to identify and evaluate the sources of risk generated in the construction and siting of landfills. • Environmental vulnerability indicators improve the effectiveness of the control and follow-up phases of landfill management.

  17. Adaptation of EVIAVE methodology for monitoring and follow-up when evaluating the environmental impact of landfills

    Energy Technology Data Exchange (ETDEWEB)

    Arrieta, Gabriela, E-mail: tonina1903@hotmail.com [Department of Civil Engineering, University of Granada (Spain); Requena, Ignacio, E-mail: requena@decsai.ugr.es [Department of Computer Science and Artificial Intelligence, University of Granada (Spain); Toro, Javier, E-mail: jjtoroca@unal.edu.co [Universidad Nacional de Colombia — Sede Bogotá, Instituto de Estudios Ambientales (Colombia); Zamorano, Montserrat, E-mail: zamorano@ugr.es [Department of Civil Engineering, University of Granada (Spain)

    2016-01-15

    Treatment and final disposal of Municipal Solid Waste can play a significant role in the generation of negative environmental impacts. As a prevention strategy, such activities are subjected to the process of Environmental Impact Assessment (EIA). Still, the follow-up of Environmental Management Plans and mitigation measures is limited, partly due to a lack of methodological approaches. To address this, the University of Granada (Spain) developed a diagnostic methodology named EVIAVE, which quantifies, by means of indexes, the environmental impact of landfills in view of their location and conditions of exploitation. EVIAVE is applicable within the legal framework of the European Union and can be adapted to the environmental and legal conditions of other countries. This study entails its adaptation in Colombia for the follow-up and control of the EIA process for landfills. The modifications involved inclusion of the environmental elements flora and fauna, and evaluation of the environmental descriptors in agreement with the concept of vulnerability. Application of the modified EVIAVE in Colombian landfills allowed us to identify the elements affected by operating and maintenance conditions. It may be concluded that this methodology is viable and effective for the follow-up and environmental control of EIA processes for landfills, and for analyzing the associated risks, as it takes into account related environmental threats and vulnerabilities. - Highlights: • A modified methodology is used to monitor and follow up environmental impacts in landfills. • The improved methodology includes the vulnerability of flora and fauna to evaluate the environmental impact of landfills. • The methodology serves to identify and evaluate the sources of risk generated in the construction and siting of landfills. • Environmental vulnerability indicators improve the effectiveness of the control and follow-up phases of landfill management.

  18. Electrochemical noise measurements under pressurized water reactor conditions

    International Nuclear Information System (INIS)

    Van Nieuwenhove, R.

    2000-01-01

    Electrochemical potential noise measurements on sensitized stainless steel pressure tubes under pressurized water reactor (PWR) conditions were performed for the first time. Very short potential spikes, believed to be associated with crack initiation events, were detected when the sample was stressed above the yield strength; these increased in magnitude until the sample broke. Sudden increases in plastic deformation, induced by an increased tube pressure, resulted in slower, high-amplitude potential transients, often accompanied by a reduction in noise level

  19. Investigation of Seepage Meter Measurements in Steady Flow and Wave Conditions.

    Science.gov (United States)

    Russoniello, Christopher J; Michael, Holly A

    2015-01-01

    Water exchange between surface water and groundwater can modulate or generate ecologically important fluxes of solutes across the sediment-water interface. Seepage meters can directly measure fluid flux, but mechanical resistance and surface water dynamics may lead to inaccurate measurements. Tank experiments were conducted to determine effects of mechanical resistance on measurement efficiency and occurrence of directional asymmetry that could lead to erroneous net flux measurements. Seepage meter efficiency was high (average of 93%) and consistent for inflow and outflow under steady flow conditions. Wave effects on seepage meter measurements were investigated in a wave flume. Seepage meter net flux measurements averaged 0.08 cm/h, greater than the expected net-zero flux, but significantly less than theoretical wave-driven unidirectional discharge or recharge. Calculations of unidirectional flux from pressure measurements (Darcy flux) and theory matched well for a ratio of wavelength to water depth less than 5, but not when this ratio was greater. Both were higher than seepage meter measurements of unidirectional flux made with one-way valves. Discharge averaged 23% greater than recharge in both seepage meter measurements and Darcy calculations of unidirectional flux. Removal of the collection bag reduced this net discharge. The presence of a seepage meter reduced the amplitude of pressure signals at the bed and resulted in a nearly uniform pressure distribution beneath the seepage meter. These results show that seepage meters may provide accurate measurements of both discharge and recharge under steady flow conditions and illustrate the potential measurement errors associated with dynamic wave environments. © 2014, National Ground Water Association.
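    The Darcy-flux comparison mentioned above derives flux from hydraulic-head measurements rather than from the seepage meter itself. A minimal sketch of that calculation follows; the hydraulic conductivity and head values are invented for illustration and are not the paper's data.

```python
def darcy_flux(k, head_top, head_bottom, dz):
    """Vertical Darcy flux q = -K * dh/dz between two head measurements.
    With depth measured downward, positive q means upward flow (discharge)."""
    return -k * (head_top - head_bottom) / dz

# hypothetical sandy bed: K = 0.5 m/h, head 1 cm higher at 10 cm depth
q = darcy_flux(0.5, 0.0, 0.01, 0.1)   # m/h, positive -> discharge
q_cm_per_h = q * 100
```

With these numbers the upward flux is 5 cm/h, the same unit scale as the seepage meter fluxes reported in the abstract.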

  20. Measuring Instruments Control Methodology Performance for Analog Electronics Remote Labs

    Directory of Open Access Journals (Sweden)

    Unai Hernandez-Jayo

    2012-12-01

    Full Text Available This paper presents the work that has been developed in parallel to the VISIR project. The objective of this paper is to present the results of the validation processes that have been carried out to check the control methodology. This method has been developed with the aim of being independent of the instruments of the labs.

  1. Novel Methods for Optically Measuring Whitecaps Under Natural Wave Breaking Conditions in the Southern Ocean

    Science.gov (United States)

    Randolph, K. L.; Dierssen, H. M.; Cifuentes-Lorenzen, A.; Balch, W. M.; Monahan, E. C.; Zappa, C. J.; Drapeau, D.; Bowler, B.

    2016-02-01

    Breaking waves on the ocean surface mark areas of significant importance to air-sea flux estimates of gas, aerosols, and heat. Traditional methods of measuring whitecap coverage using digital photography can miss features that are small in size or do not show high enough contrast to the background. The geometry of the images collected captures the near surface, bright manifestations of the whitecap feature and miss a portion of the bubble plume that is responsible for the production of sea salt aerosols and the transfer of lower solubility gases. Here, a novel method for accurately measuring both the fractional coverage of whitecaps and the intensity and decay rate of whitecap events using above water radiometry is presented. The methodology was developed using data collected during the austral summer in the Atlantic sector of the Southern Ocean under a large range of wind (speeds of 1 to 15 m s-1) and wave (significant wave heights 2 to 8 m) conditions as part of the Southern Ocean Gas Exchange experiment. Whitecap metrics were retrieved by employing a magnitude threshold based on the interquartile range of the radiance or reflectance signal for a single channel (411 nm) after a baseline removal, determined using a moving minimum/maximum filter. Breaking intensity and decay rate metrics were produced from the integration of, and the exponential fit to, radiance or reflectance over the lifetime of the whitecap. When compared to fractional whitecap coverage measurements obtained from high resolution digital images, radiometric estimates were consistently higher because they capture more of the decaying bubble plume area that is difficult to detect with photography. Radiometrically-retrieved whitecap measurements are presented in the context of concurrently measured meteorological (e.g., wind speed) and oceanographic (e.g., wave) data. 
The optimal fit of the radiometrically estimated whitecap coverage to the instantaneous wind speed was determined using ordinary least squares regression.
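    The thresholding procedure described above (baseline removal with a moving-minimum filter, then an interquartile-range magnitude threshold) can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the authors' code; the window length and the 1.5×IQR multiplier are assumptions.

```python
import numpy as np

def whitecap_fraction(radiance, window=51, k=1.5):
    """Flag whitecap samples in a radiance time series: remove the slowly
    varying background with a moving-minimum baseline, then threshold the
    residual at Q3 + k*IQR (interquartile-range criterion)."""
    n = len(radiance)
    half = window // 2
    baseline = np.array([radiance[max(0, i - half):i + half + 1].min()
                         for i in range(n)])
    resid = radiance - baseline
    q1, q3 = np.percentile(resid, [25, 75])
    thresh = q3 + k * (q3 - q1)
    flags = resid > thresh
    return flags.mean(), flags

rng = np.random.default_rng(0)
series = 1.0 + 0.01 * rng.standard_normal(2000)   # calm-water background
series[500:520] += 0.5                            # synthetic whitecap event
frac, flags = whitecap_fraction(series)
```

The breaking-intensity and decay-rate metrics would then come from integrating, and exponentially fitting, the flagged radiance over each event's lifetime.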

  2. Methodology of high-resolution photography for mural condition database

    Science.gov (United States)

    Higuchi, R.; Suzuki, T.; Shibata, M.; Taniguchi, Y.

    2015-08-01

    Digital documentation is one of the most useful techniques for recording the condition of cultural heritage. Recently, high-resolution images have become increasingly useful because it is possible to show general views of mural paintings as well as detailed mural conditions in a single image. As mural paintings are damaged by environmental stresses, it is necessary to record the details of painting condition on high-resolution base maps. Unfortunately, the cost of high-resolution photography and the difficulty of operating its instruments and software have commonly been an impediment for researchers and conservators. However, the recent development of graphic software makes its operation simpler and less expensive. In this paper, we suggest a new approach to making digital heritage inventories without special instruments, based on our recent research project at the Üzümlü church in Cappadocia, Turkey. This method enables us to achieve a high-resolution image database at low cost, in a short time, and with limited human resources.

  3. Measurement of heat stress conditions at cow level and comparison to climate conditions at stationary locations inside a dairy barn.

    Science.gov (United States)

    Schüller, Laura K; Heuwieser, Wolfgang

    2016-08-01

    The objectives of this study were to examine heat stress conditions at cow level and to investigate the relationship to the climate conditions at 5 different stationary locations inside a dairy barn. In addition, we compared the climate conditions at cow level between primiparous and multiparous cows for a period of 1 week after regrouping. The temperature-humidity index (THI) differed significantly between all stationary loggers. The lowest THI was measured at the window logger in the experimental stall and the highest THI was measured at the central logger in the experimental stall. The THI at the mobile cow loggers was 2.33 THI points higher than at the stationary loggers. Furthermore, the mean daily THI was higher at the mobile cow loggers than at the stationary loggers on all experimental days. The THI in the experimental pen was 0.44 THI points lower when the experimental cow group was located inside the milking parlour. The THI measured at the mobile cow loggers was 1.63 THI points higher when the experimental cow group was located inside the milking parlour. However, there was no significant difference in any climate variable between primiparous and multiparous cows. These results indicate that there is a wide range of climate conditions inside a dairy barn, and that areas far from a fresh air supply in particular have an increased risk for the occurrence of heat stress conditions. Furthermore, heat stress conditions are even more pronounced at cow level: cows not only influence their climatic environment but also generate microclimates within different locations inside the barn. Climate conditions should therefore be obtained at cow level to evaluate the heat stress conditions that dairy cows are actually exposed to.
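    The abstract does not state which THI formulation was used; a commonly cited one (NRC, 1971), with air temperature T in °C and relative humidity RH in percent, is THI = (1.8T + 32) − (0.55 − 0.0055·RH)(1.8T − 26). A small sketch, offered only as an illustration of how such index values are computed:

```python
def thi(temp_c, rh):
    """Temperature-humidity index, NRC (1971) formulation.
    temp_c: air temperature in deg C; rh: relative humidity in %."""
    t_f = 1.8 * temp_c + 32.0                        # dry-bulb temp in deg F
    return t_f - (0.55 - 0.0055 * rh) * (t_f - 58.0)  # 1.8*T - 26 == t_f - 58

# 25 degC at 60% RH falls in the mild heat-stress range for dairy cows
thi_example = thi(25.0, 60.0)   # ~72.8
```

At 100% RH the humidity correction vanishes and the index reduces to the Fahrenheit temperature, a quick sanity check on the formula.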

  4. A Thermographic Measurement Approach to Assess Supercapacitor Electrical Performances

    Directory of Open Access Journals (Sweden)

    Stanislaw Galla

    2017-12-01

    Full Text Available This paper describes a proposal for the qualitative assessment of the condition of supercapacitors based on thermographic measurements. The measurement stand is presented together with the proposed test methodology. The conditions needed to minimize the influence of disturbing factors on thermal imaging measurements, which arise both from hardware limitations and from the necessity of preparing samples, are also indicated. The algorithm used to determine the basic assessment parameters is presented, and additional factors that may facilitate the analysis of the obtained results are suggested. The usefulness of the proposed methodology was tested on commercial supercapacitor samples. All tests were performed in conjunction with the classical methods based on capacitance (C and equivalent series resistance (ESR measurements, which are also presented in the paper. Selected results presenting the observed changes in both the basic parameters of the supercapacitors and the accompanying fluctuations of thermal fields are shown, along with their analysis. The observed limitations of the proposed assessment method and suggestions for its development are also described.

  5. Fiber-Optic Temperature and Pressure Sensors Applied to Radiofrequency Thermal Ablation in Liver Phantom: Methodology and Experimental Measurements

    Directory of Open Access Journals (Sweden)

    Daniele Tosi

    2015-01-01

    Full Text Available Radiofrequency thermal ablation (RFA is a procedure aimed at interventional cancer care and is applied to the treatment of small- and midsize tumors in lung, kidney, liver, and other tissues. RFA generates a selective high-temperature field in the tissue; temperature values and their persistency are directly related to the mortality rate of tumor cells. Temperature measurement in up to 3–5 points, using electrical thermocouples, is part of the present clinical practice of RFA and is the foundation of a physical model of the ablation process. Fiber-optic sensors allow extending the detection of biophysical parameters to a vast plurality of sensing points, using miniature and noninvasive technologies that do not alter the RFA pattern. This work addresses the methodology for optical measurement of temperature distribution and pressure using four different fiber-optic technologies: fiber Bragg gratings (FBGs, linearly chirped FBGs (LCFBGs, Rayleigh scattering-based distributed temperature system (DTS, and extrinsic Fabry-Perot interferometry (EFPI. For each instrument, the methodology for ex vivo sensing, as well as experimental results, is reported, leading toward the application of fiber-optic technologies in vivo. The possibility of using a fiber-optic sensor network, in conjunction with a suitable ablation device, can enable smart ablation procedures in which ablation parameters are dynamically adjusted.

  6. Energy index decomposition methodology at the plant level

    Science.gov (United States)

    Kumphai, Wisit

    Scope and method of study. The dissertation explores the use of a high-level energy intensity index as a facility-level energy performance monitoring indicator, with the goal of developing a methodology for an economically based energy performance monitoring system that incorporates production information. The performance measure closely monitors energy usage, production quantity, and product mix and determines the production efficiency as part of an ongoing process that would enable facility managers to keep track of, and in the future predict, when to perform a recommissioning process. The study focuses on the use of the index decomposition methodology and explores several high-level (industry, sector, and country level) energy utilization indexes, namely Additive Log Mean Divisia, Multiplicative Log Mean Divisia, and Additive Refined Laspeyres. One level of index decomposition is performed: the indexes are decomposed into intensity and product-mix effects. These indexes are tested on a flow shop brick manufacturing plant model in three different climates in the United States. The indexes obtained are analyzed by fitting an ARIMA model and testing for dependency between the two decomposed indexes. Findings and conclusions. The results concluded that the Additive Refined Laspeyres index decomposition methodology is suitable for use in a flow shop, non-air-conditioned production environment as an energy performance monitoring indicator. It is likely that this research can be further expanded into predicting when to perform a recommissioning process.
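    As an illustration of the kind of index decomposition the dissertation builds on, here is a minimal additive Log Mean Divisia (LMDI-I) sketch that splits a change in total energy use into activity, product-mix and intensity effects. The plant data are invented; this is not the dissertation's model, only a textbook-style example of the technique, whose defining property is that the three effects sum exactly to the total energy change.

```python
import math

def logmean(a, b):
    """Logarithmic mean L(a, b) = (a - b) / (ln a - ln b), with L(a, a) = a."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

def lmdi_additive(q0, q1, shares0, shares1, intens0, intens1):
    """Additive LMDI-I for E = sum_i Q * s_i * I_i:
    returns (activity, product-mix, intensity) effects."""
    e0 = [q0 * s * i for s, i in zip(shares0, intens0)]
    e1 = [q1 * s * i for s, i in zip(shares1, intens1)]
    w = [logmean(a, b) for a, b in zip(e1, e0)]          # log-mean weights
    act = sum(wi * math.log(q1 / q0) for wi in w)
    mix = sum(wi * math.log(s1 / s0) for wi, s0, s1 in zip(w, shares0, shares1))
    inten = sum(wi * math.log(i1 / i0) for wi, i0, i1 in zip(w, intens0, intens1))
    return act, mix, inten

# two products: output grows, mix shifts toward product 1, intensity improves
act, mix, inten = lmdi_additive(
    q0=100, q1=120,
    shares0=[0.5, 0.5], shares1=[0.6, 0.4],
    intens0=[2.0, 1.0], intens1=[1.8, 1.0],
)
total_change = act + mix + inten   # equals E1 - E0 with no residual
```

Here E0 = 150 and E1 = 177.6 energy units, so the three effects must sum to 27.6, which is what makes LMDI a "perfect" (residual-free) decomposition.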

  7. Measurement of Survival Time in Brachionus Rotifers: Synchronization of Maternal Conditions.

    Science.gov (United States)

    Kaneko, Gen; Yoshinaga, Tatsuki; Gribble, Kristin E; Welch, David M; Ushio, Hideki

    2016-07-22

    Rotifers are microscopic cosmopolitan zooplankton used as models in ecotoxicological and aging studies due to their several advantages such as short lifespan, ease of culture, and parthenogenesis that enables clonal culture. However, caution is required when measuring their survival time as it is affected by maternal age and maternal feeding conditions. Here we provide a protocol for powerful and reproducible measurement of the survival time in Brachionus rotifers following a careful synchronization of culture conditions over several generations. Empirically, poor synchronization results in early mortality and a gradual decrease in survival rate, thus resulting in weak statistical power. Indeed, under such conditions, calorie restriction (CR) failed to significantly extend the lifespan of B. plicatilis although CR-induced longevity has been demonstrated with well-synchronized rotifer samples in past and present studies. This protocol is probably useful for other invertebrate models, including the fruitfly Drosophila melanogaster and the nematode Caenorhabditis elegans, because maternal age effects have also been reported in these species.

  8. A review on human reinstatement studies: an overview and methodological challenges.

    Science.gov (United States)

    Haaker, Jan; Golkar, Armita; Hermans, Dirk; Lonsdorf, Tina B

    2014-09-01

    In human research, studies of return of fear (ROF) phenomena, and reinstatement in particular, began only a decade ago and recently are more widely used, e.g., as outcome measures for fear/extinction memory manipulations (e.g., reconsolidation). As reinstatement research in humans is still in its infancy, providing an overview of its stability and boundary conditions and summarizing methodological challenges is timely to foster fruitful future research. As a translational endeavor, clarifying the circumstances under which (experimental) reinstatement occurs may offer a first step toward understanding relapse as a clinical phenomenon and pave the way for the development of new pharmacological or behavioral ways to prevent ROF. The current state of research does not yet allow pinpointing these circumstances in detail and we hope this review will aid the research field to advance in this direction. As an introduction, we begin with a synopsis of rodent work on reinstatement and theories that have been proposed to explain the findings. The review however mainly focuses on reinstatement in humans. We first describe details and variations of the experimental setup in reinstatement studies in humans and give a general overview of results. We continue with a compilation of possible experimental boundary conditions and end with the role of individual differences and behavioral and/or pharmacological manipulations. Furthermore, we compile important methodological and design details on the published studies in humans and end with open research questions and some important methodological and design recommendations as a guide for future research. © 2014 Haaker et al.; Published by Cold Spring Harbor Laboratory Press.

  9. Experimental measurement of compressibility coefficients of synthetic sandstone in hydrostatic conditions

    International Nuclear Information System (INIS)

    Asaei, H; Moosavi, M

    2013-01-01

    The theory of poroelasticity is used to characterize the mechanical behavior of porous media in elastic conditions. Because of the complexity of porous media, the number of poroelastic coefficients is greater than in ordinary elasticity. Laboratory measurement of poroelastic coefficients requires a system that can control and measure the variables of poroelasticity. In this paper, experimental measurements of these coefficients are presented. Laboratory tests are performed using a system designed by the authors. Hydrostatic tests are performed on cylindrical samples in drained, pore pressure loading, undrained and dry conditions. Compressibilities (bulk and pore compressibility), effective stress and Skempton coefficients are measured by these tests. The samples are made synthetically, of a sand and cement composition, by a compaction process. Calibration tests are performed on the setup to identify possible errors in the system and to correct the results of the main tests. This is done by performing similar compressibility tests at each stress level on a cylindrical steel sample (5.47 mm in diameter) with a longitudinal hole along it (hollow cylinder); the steel sample is assumed to be incompressible. The results of the tests are compared with the theory of poroelasticity and the resulting graphs and their errors are analyzed. This study shows that the results of the drained and pore pressure loading tests are compatible with the poroelastic formulation, while the undrained results have errors because of extra fluid volume in the pore pressure system and calibration difficulties. (paper)
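    The hydrostatic tests above yield the coefficients from recorded volume and pressure increments. As a hedged illustration of the standard definitions only (the symbols and numbers are invented, not the authors' data reduction):

```python
def bulk_compressibility(v0, dv, dp_conf):
    """Drained bulk compressibility C_bk = -(1/V0) * dV/dP_c,
    measured at constant pore pressure (volume decreases as P_c rises)."""
    return -dv / (v0 * dp_conf)

def skempton_b(dp_pore, dp_conf):
    """Skempton coefficient B = dP_p / dP_c under undrained loading."""
    return dp_pore / dp_conf

# hypothetical increments from one hydrostatic load step
c_bk = bulk_compressibility(v0=1.0e-4, dv=-2.0e-8, dp_conf=1.0e6)  # 1/Pa
b = skempton_b(dp_pore=0.85e6, dp_conf=1.0e6)                      # dimensionless
```

In practice the measured volume change would first be corrected using the steel-sample calibration runs described in the abstract.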

  10. Methodological concerns for determining power output in the jump squat.

    Science.gov (United States)

    Cormie, Prue; Deane, Russell; McBride, Jeffrey M

    2007-05-01

    The purpose of this study was to investigate the validity of power measurement techniques during the jump squat (JS) utilizing various combinations of force plate and linear position transducer (LPT) devices. Nine men with at least 6 months of prior resistance training experience participated in this acute investigation. One repetition maximums (1RM) in the squat were determined, followed by JS testing under 2 loading conditions (30% of 1RM [JS30] and 90% of 1RM [JS90]). Three different techniques were used simultaneously in data collection: (a) 1 linear position transducer (1-LPT); (b) 1 linear position transducer and a force plate (1-LPT + FP); and (c) 2 linear position transducers and a force plate (2-LPT + FP). Vertical velocity-, force-, and power-time curves were calculated for each lift using these methodologies and were compared. Peak force and peak power were overestimated by 1-LPT in both JS30 and JS90 compared with 2-LPT + FP and 1-LPT + FP (p ≤ 0.05). Power output in the jump squat thus varies according to the measurement technique utilized. The 1-LPT methodology is not a valid means of determining power output in the jump squat. Furthermore, the 1-LPT + FP method may not accurately represent power output in free weight movements that involve a significant amount of horizontal motion.
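    For context, the 1-LPT + FP technique computes power as the product of vertical ground reaction force and bar velocity obtained by differentiating the LPT displacement. A sketch on synthetic signals (the displacement waveform, constant force, and sampling rate are all invented for illustration):

```python
import numpy as np

def power_fp_lpt(position_m, force_n, fs):
    """1-LPT + FP style power: differentiate LPT displacement to get bar
    velocity (central differences), multiply by vertical force."""
    velocity = np.gradient(position_m, 1.0 / fs)   # m/s
    return force_n * velocity                      # instantaneous power, W

fs = 1000.0                                        # Hz
t = np.arange(0, 1, 1 / fs)
pos = 0.2 * (1 - np.cos(2 * np.pi * t))            # synthetic 0.4 m bar path
force = np.full_like(t, 800.0)                     # constant 800 N, illustrative
power = power_fp_lpt(pos, force, fs)
peak_power = power.max()
```

With this signal the peak bar velocity is 0.4π m/s, so peak power is about 1005 W; numerical differentiation of noisy real LPT data would normally be preceded by filtering.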

  11. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

    2008-01-01

    In the present paper an uncertainty analysis of an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology, the model is conditioned on observation time series from two flow gauges as well as the occurrence of combined sewer overflow. The GLUE methodology is used to test different conceptual setups in order to determine if one model setup gives a better goodness of fit, conditional on the observations, than the other. Moreover, different methodological investigations of GLUE are conducted in order to test if the uncertainty analysis is unambiguous. It is shown that the GLUE methodology is very applicable in uncertainty analysis of this application of an urban drainage model, although it proved quite difficult to get good fits of the whole time series.
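The GLUE procedure summarized here (Monte Carlo parameter sampling, an informal likelihood measure, and a "behavioural" acceptance threshold) can be sketched on a toy problem. The linear stand-in model, the Nash-Sutcliffe efficiency as likelihood, and the 0.9 threshold are assumptions for illustration, not features of MOUSE or of this study:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(theta, t):
    return theta * t                    # stand-in for the drainage model

# synthetic "observations" from a true parameter of 2.0 plus noise
t = np.linspace(0.0, 1.0, 50)
obs = model(2.0, t) + rng.normal(0.0, 0.05, t.size)

# GLUE: sample parameter sets, score each against the observations
thetas = rng.uniform(0.0, 4.0, 2000)
sims = np.array([model(th, t) for th in thetas])
nse = 1.0 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()

# keep behavioural sets above the threshold and weight by likelihood
behavioural = nse > 0.9
weights = nse[behavioural] / nse[behavioural].sum()
theta_mean = (thetas[behavioural] * weights).sum()
```

Prediction bounds would follow by sorting the behavioural simulations at each time step and taking likelihood-weighted quantiles.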

  12. Stanley Milgram’s Obedience to Authority “Relationship” Condition: Some Methodological and Theoretical Implications

    Directory of Open Access Journals (Sweden)

    Nestar Russell

    2014-04-01

    Full Text Available In May 1962, social psychologist Stanley Milgram ran what was arguably the most controversial variation of his Obedience to Authority (OTA) experiments: the Relationship Condition (RC). In the RC, participants were required to bring a friend, with one becoming the teacher and the other the learner. The learners were covertly informed that the experiment was actually exploring whether their friend would obey an experimenter’s orders to hurt them, and were quickly trained in how to react to the impending “shocks”. Only 15 percent of teachers completed the RC. In an article published in 1965, Milgram discussed most of the variations on his baseline experiment but named the RC only in passing, promising a more detailed account in his forthcoming book. However, his 1974 book failed to mention the RC, and it remained unpublished until François Rochat and Andre Modigliani discovered it in Milgram’s personal archive at Yale University in 1997. Their overview of the RC’s procedure and results left a number of questions unanswered. For example, what were the etiological origins of the RC? Why did Milgram decide against publishing this experiment? And does the RC have any significant methodological or theoretical implications for the discourse on the Obedience studies? Based on documents obtained from Milgram’s personal archive, this article aims to shed new light on these questions.

  13. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full Text Available The article investigates the theoretical and methodological principles of situational analysis and argues for its necessity in modern conditions. The notion of “situational analysis” is defined: situational analysis is a continuous system study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative and search functions in the process of situational analysis is demonstrated. The principal methodological elements of situational analysis are grounded. Their substantiation will enable the analyst to develop adaptive methods that take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system; to diagnose such a situation and subject it to systematic, in-depth analysis; to identify risks and opportunities; and to make timely management decisions as required by a particular period.

  14. Dynamic-energetic balance of agricultural tractors: active systems for the measurement of the power requirements in static tests and under field conditions

    Directory of Open Access Journals (Sweden)

    Daniele Pochi

    2013-09-01

    Full Text Available Modern tractors are characterized by the introduction of devices designed to increase the operative performance of the machines, such as systems for monitoring and controlling various functions (through a massive use of electronics and hydraulics) or intended to improve the comfort of the driver (paying more attention to ergonomics, air conditioning, noise and vibration). Such devices need energy to operate, affecting the energetic balance of the tractor. In this context, suitable methodologies and instrumental systems could provide objective, accurate and reliable measurements of tractor performance under different conditions, also considering the power requirements of ancillary services and/or simulating the coupling with operating machines. Tests of tractor performance are now made using different methods, including the test codes issued by the OECD. Beyond their undoubted validity, these codes fix standard test conditions that often do not adequately represent the operative reality, so that much remains to be investigated about the actual performance of tractors. From this point of view, and with reference to fixed-point tests, a test bench was developed for the measurement of the power required by various devices, such as the transmission and air conditioning. It was used in experimental tests on a tracked tractor and on a wheeled tractor, aimed at validating the test device and at measuring the power absorption in relation to the rotational speed of the propulsion organs and to the characteristic curves, in order to quantify the power drawn by the transmission and by the air conditioning and to assess the residual power available for other tractor functions.
As to field conditions, a study is being conducted at CRA-ING, within the project PTO (Mi.P.A.A.F.), to develop a mobile test bench aimed at evaluating the power required by different operations, such as self displacement, traction, use of

  15. Development and validation of method for heterocyclic compounds in wine: optimization of HS-SPME conditions applying a response surface methodology.

    Science.gov (United States)

    Burin, Vívian Maria; Marchand, Stéphanie; de Revel, Gilles; Bordignon-Luiz, Marilde T

    2013-12-15

    Considering the importance of heterocyclic compounds for wine flavor, this study proposes a new, rapid and solvent-free method to quantify different classes of heterocyclic compounds produced by the Maillard reaction, such as furans, thiophenes, thiazols and pyrazines, in wines. The use of a central composite design and response surface methodology to determine the best conditions allows the optimum combination of analytical variables (pH, NaCl and extraction time) to be identified. The validation was carried out using several types of wine as matrices. The method showed satisfactory repeatability (2.7%), and the heterocyclic compounds were determined in the wine matrices, mainly for red wines. © 2013 Elsevier B.V. All rights reserved.

  16. Physical protection evaluation methodology program development and application

    International Nuclear Information System (INIS)

    Seo, Janghoon; Yoo, Hosik

    2015-01-01

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not a trivial task, since many interconnected factors affect overall performance. Several international projects have therefore been organized to develop a systematic physical protection evaluation methodology. INPRO (The International Project on Innovative Nuclear Reactors and Fuel Cycles) and the GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodology are among the best-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and is strong in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, a vital area analysis and a realistic threat scenario assessment are required.

  17. Physical protection evaluation methodology program development and application

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Janghoon; Yoo, Hosik [Korea Institute of Nuclear Non-proliferation and Control, Daejeon (Korea, Republic of)

    2015-10-15

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not a trivial task, since many interconnected factors affect overall performance. Several international projects have therefore been organized to develop a systematic physical protection evaluation methodology. INPRO (The International Project on Innovative Nuclear Reactors and Fuel Cycles) and the GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodology are among the best-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and is strong in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, a vital area analysis and a realistic threat scenario assessment are required.

  18. Methodological Guidelines for Advertising Research

    DEFF Research Database (Denmark)

    Rossiter, John R.; Percy, Larry

    2017-01-01

    In this article, highly experienced advertising academics and advertising research consultants John R. Rossiter and Larry Percy present and discuss what they believe to be the seven most important methodological guidelines that need to be implemented to improve the practice of advertising research. Their focus is on methodology, defined as first choosing a suitable theoretical framework to guide the research study and then identifying the advertising responses that need to be studied. Measurement of those responses is covered elsewhere in this special issue, in the article by Bergkvist and Langner. Most...

  19. Methodology for performing surveys for fixed contamination

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1994-10-01

    This report describes a methodology for performing instrument surveys for fixed contamination that can be used to support the release of material from radiological areas, including release to controlled areas and release from radiological control. The methodology, which is based on a fast scan survey and a series of statistical, fixed measurements, meets the requirements of the U.S. Department of Energy Radiological Control Manual (RadCon Manual) (DOE 1994) and DOE Order 5400.5 (DOE 1990) for surveys for fixed contamination, and requires less time than a conventional scan survey. The confidence interval associated with the new methodology conforms to the draft national standard for surveys. The methodology presented applies only to surveys for fixed contamination; surveys for removable contamination are not discussed, and the new methodology does not affect surveys for removable contamination.
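The combination of a fast scan followed by statistical fixed measurements can be illustrated with a toy release decision: a one-sided upper confidence bound on the mean of the fixed measurements is compared with a release limit. The count data, t-value and limit below are invented for illustration, not values from the report:

```python
import math
import statistics

def upper_bound(counts, t_value=1.833):
    """One-sided 95% upper confidence bound on the mean count rate.

    t_value = 1.833 is the one-sided 95% Student-t quantile for 9 d.o.f.
    (10 measurements); it would change with the number of measurements.
    """
    mean = statistics.mean(counts)
    sem = statistics.stdev(counts) / math.sqrt(len(counts))
    return mean + t_value * sem

# hypothetical fixed-measurement count rates (cpm) after a clean fast scan
fixed_counts = [210, 195, 205, 188, 199, 202, 190, 208, 196, 201]
release_limit = 250.0          # assumed limit, cpm

releasable = upper_bound(fixed_counts) < release_limit
```

The point of the bound is that the material is released only when the statistical uncertainty of the fixed measurements, not just their average, is below the limit.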

  20. Methodology for wind turbine blade geometry optimization

    Energy Technology Data Exchange (ETDEWEB)

    Perfiliev, D.

    2013-11-01

    Nowadays, the upwind three-bladed horizontal-axis wind turbine is the leading player on the market, having been found to be the best industrial compromise among the range of different turbine constructions. Current wind industry innovation is concentrated in the development of individual turbine components. The blade constitutes 20-25% of the overall turbine budget, so its optimal operation in particular local economic and wind conditions is worth investigating. The blade geometry, namely the chord, twist and airfoil-type distributions along the span, determines the output measures of blade performance. Therefore, an optimal wind blade geometry can improve the overall turbine performance. The objectives of the dissertation are focused on the development of a methodology and a specific tool for investigating possible adjustments to existing wind blade geometries. The novelty of the methodology presented in the thesis is the multiobjective perspective on wind blade geometry optimization, taking into account simultaneously the local wind conditions and the issue of aerodynamic noise emissions. This optimization objective approach has not previously been investigated for implementation in wind blade design. The possibilities of using different theories for the analysis and search procedures are investigated, and sufficient arguments are derived for the use of the proposed theories. The tool is applied to a test optimization of a particular wind turbine blade. The sensitivity analysis shows the dependence of the outputs on the provided inputs, as well as their relative and absolute divergences and instabilities. The pros and cons of the proposed technique can be seen from the practical implementation, which is documented in the results, analysis and conclusion sections. (orig.)

  1. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    Misra, M.K.; Menon, Saritha P.; Thirugnana Murthy, D.

    2013-01-01

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost for electronic systems. Many studies have shown that up to 70% of the design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences - both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion Based Verification Methodology for HDL designs, taking as case studies, the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)

  2. Isothermal and isochronal annealing methodology to study post-irradiation temperature activated phenomena

    International Nuclear Information System (INIS)

    Chabrerie, C.; Autran, J.L.; Paillet, P.; Flament, O.; Leray, J.L.; Boudenot, J.C.

    1997-01-01

    In this work, the evolution of the oxide trapped charge has been modeled, to predict post-irradiation behavior for arbitrary anneal conditions (i.e., arbitrary temperature-time profiles). Using experimental data obtained from a single isochronal anneal, the method consists of calculating the evolution of the energy distribution of the oxide trapped charge, in the framework of a thermally activated charge detrapping model. This methodology is illustrated in this paper by the prediction of experimental isothermal data from isochronal measurements. The implications of these results to hardness assurance test methods are discussed
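The thermally activated detrapping model described here can be sketched as a first-order process with an Arrhenius rate, integrated over an arbitrary temperature-time profile; predicting an isothermal anneal from isochronal data amounts to evaluating the same rate law on a different profile. The activation energy and frequency factor below are assumed for illustration, not taken from the paper:

```python
import math

KB = 8.617e-5  # Boltzmann constant, eV/K

def remaining_fraction(profile, ea_ev, k0, dt):
    """Fraction of trapped charge remaining after an arbitrary T(t) profile.

    Integrates dN/dt = -k(T) N with the Arrhenius rate
    k(T) = k0 * exp(-Ea / (kB * T)), stepping through `profile`
    (temperatures in K) with constant time step `dt` (s).
    """
    n = 1.0
    for temp in profile:
        k = k0 * math.exp(-ea_ev / (KB * temp))
        n *= math.exp(-k * dt)   # exact solution over a constant-T step
    return n

# isothermal anneal at 400 K for 1000 s, sampled every second
iso = remaining_fraction([400.0] * 1000, ea_ev=1.0, k0=1e9, dt=1.0)
```

The same function accepts a ramped profile (a list of increasing temperatures), which is how an isochronal anneal schedule would be represented.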

  3. HbA2 measurements in β-thalassemia and in other conditions

    Directory of Open Access Journals (Sweden)

    Giovanni Ivaldi

    2014-09-01

    Full Text Available Quite a few papers have been written on the significance of elevated hemoglobin A2 (HbA2) as a parameter for the diagnosis of the β-thalassemia trait, on the cutoff values to be used in diagnostics, on the significance and effects of factors reducing or elevating the expression of HbA2 and, last but not least, on the need for reliable measurement methods and precise calibrations with accurate standards. However, little has been published on the causes that elevate or reduce HbA2 levels in β- and α-thalassemia and in other conditions. For a better understanding of the value of a precise measurement of this parameter, we summarize and elucidate in this review the direct and indirect mechanisms that cause variations in HbA2 expression and that influence the value of this parameter in particular conditions. We conclude by explaining the advantages and disadvantages of relying on a precise measurement in the complete diagnostic context.

  4. Investigation of optimal conditions for production of highly crystalline nanocellulose with increased yield via novel Cr(III)-catalyzed hydrolysis: Response surface methodology.

    Science.gov (United States)

    Chen, You Wei; Lee, Hwei Voon; Abd Hamid, Sharifah Bee

    2017-12-15

    For the first time, a highly efficient Cr(NO3)3 catalysis system was proposed for optimizing the yield and crystallinity of the nanocellulose end product. A five-level, three-factor central composite design coupled with response surface methodology was employed to elucidate the interactions between three design factors, namely reaction temperature (x1), reaction time (x2) and concentration of Cr(NO3)3 (x3), over a broad range of process conditions, and to determine their effect on crystallinity index and product yield. The developed models predicted a maximum nanocellulose yield of 87% at the optimum process conditions of 70.6°C, 1.48 h, and 0.48 M Cr(NO3)3. At these conditions, the obtained nanocellulose presented a high crystallinity index (75.3%) and a spider-web-like interconnected network morphology with an average width of 31.2 ± 14.3 nm. In addition, the yielded nanocellulose showed higher thermal stability than the original cellulosic source and is expected to be widely used as a reinforcement agent in bio-nanocomposite materials. Copyright © 2017 Elsevier Ltd. All rights reserved.
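The central-composite / response-surface step described here amounts to fitting a second-order polynomial to the design points and locating its stationary point. A one-factor sketch on synthetic data (the coefficients and noise level are invented, not the paper's fitted model, which has three factors):

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic "yield" response with a true optimum at coded level x = 0.5
x = np.linspace(-2.0, 2.0, 9)                 # coded factor levels
y = 87.0 - 5.0 * (x - 0.5) ** 2 + rng.normal(0.0, 0.1, x.size)

# least-squares fit of the quadratic model y = b0 + b1*x + b2*x^2
X = np.column_stack([np.ones_like(x), x, x ** 2])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

# stationary point of the fitted surface and predicted response there
x_opt = -b1 / (2.0 * b2)
y_opt = b0 + b1 * x_opt + b2 * x_opt ** 2
```

With three factors, the design matrix gains linear, quadratic and interaction columns, and the stationary point is found by solving the gradient system instead of the scalar formula; the logic is otherwise identical.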

  5. Statistical optimization of ultraviolet irradiate conditions for vitamin D₂ synthesis in oyster mushrooms (Pleurotus ostreatus) using response surface methodology.

    Directory of Open Access Journals (Sweden)

    Wei-Jie Wu

    Full Text Available Response surface methodology (RSM) was used to determine the optimum vitamin D2 synthesis conditions in oyster mushrooms (Pleurotus ostreatus). Ultraviolet B (UV-B) was selected as the most efficient irradiation source in the preliminary experiment, and the levels of three independent variables were set: ambient temperature (25-45°C), exposure time (40-120 min), and irradiation intensity (0.6-1.2 W/m2). The statistical analysis indicated that, for the range studied, irradiation intensity was the most critical factor affecting vitamin D2 synthesis in oyster mushrooms. Under optimal conditions (ambient temperature of 28.16°C, UV-B intensity of 1.14 W/m2, and exposure time of 94.28 min), the experimental vitamin D2 content of 239.67 µg/g (dry weight) was in very good agreement with the predicted value of 245.49 µg/g, which verified the practicability of this strategy. Compared to fresh mushrooms, lyophilized mushroom powder can synthesize a remarkably higher level of vitamin D2 (498.10 µg/g) within a much shorter UV-B exposure time (10 min), and thus should receive attention from the food processing industry.

  6. Methodological Challenges in Measuring Child Maltreatment

    Science.gov (United States)

    Fallon, Barbara; Trocme, Nico; Fluke, John; MacLaurin, Bruce; Tonmyr, Lil; Yuan, Ying-Ying

    2010-01-01

    Objective: This article reviewed the different surveillance systems used to monitor the extent of reported child maltreatment in North America. Methods: Key measurement and definitional differences between the surveillance systems are detailed, along with their potential impact on the measured rate of victimization. The infrastructure…

  7. Can ensemble condition in a hall be improved and measured?

    DEFF Research Database (Denmark)

    Gade, Anders Christian

    1988-01-01

    In collaboration with the Danish Broadcasting Corporation, an extensive series of experiments has been carried out in The Danish Radio Concert Hall with the practical purpose of trying to improve the ensemble conditions on the platform for the resident symphony orchestra. First, a series of ... of the ceiling reflectors; and (c) changing the position of the orchestra on the platform. These variables were then tested in full-scale experiments in the hall, including subjective evaluation by the orchestra, in order to verify their effects under practical conditions. New objective parameters, which showed very high correlations with the subjective data, also made it possible to compare the improvements with conditions as recently measured in famous European halls. Besides providing the needed results, the experiments also shed some light on how musicians change their criteria for judging acoustic...

  8. Analysis of the environmental conditions at Gale Crater from MSL/REMS measurements

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, G.; Torre-Juarez, M. de la; Vicente-Retortillo, A.; Kemppinen, O.; Renno, N.; Lemmon, M.

    2016-07-01

    The environmental conditions at Gale Crater during the first 1160 sols of the Mars Science Laboratory (MSL) mission are assessed using measurements taken by the Rover Environmental Monitoring Station (REMS) on board the MSL Curiosity rover. REMS is a suite of sensors developed to assess the environmental conditions along the rover traverse; it measures atmospheric pressure, atmospheric and ground temperature, relative humidity, UV radiation flux and wind speed. Here we analyze the highest-confidence processed data for atmospheric pressure, atmospheric and ground temperature, and relative humidity. In addition, we estimate the daily UV irradiation at the surface of Gale Crater using dust opacity values derived from the Mastcam instrument. REMS is still in operation, but it has already provided the most comprehensive coverage of surface environmental conditions recorded by a spacecraft landed on Mars. (Author)

  9. Genetic covariance functions for live weight, condition score, and dry-matter intake measured at different lactation stages of Holstein-Friesian heifers

    NARCIS (Netherlands)

    Koenen, E.P.C.; Veerkamp, R.F.

    1998-01-01

    Genetic parameters for live weight, body condition score and dry-matter intake of dairy heifers were estimated using covariance function methodology. Data were from 469 heifers of the Langhill Dairy Cattle Research Centre and included observations during the first 25 weeks in lactation. Genetic

  10. Patient empowerment in long-term conditions: development and preliminary testing of a new measure

    Science.gov (United States)

    2013-01-01

    Background Patient empowerment is viewed by policy makers and health care practitioners as a mechanism to help patients with long-term conditions better manage their health and achieve better outcomes. However, assessing the role of empowerment depends on effective measures of empowerment. Although many measures of empowerment exist, none has been developed specifically for patients with long-term conditions in the primary care setting. This study presents preliminary data on the development and validation of such a measure. Methods We conducted two empirical studies. Study one was an interview study to understand empowerment from the perspective of patients living with long-term conditions. Qualitative analysis identified dimensions of empowerment, and the qualitative data were used to generate items relating to these dimensions. Study two was a cross-sectional postal study involving patients with different types of long-term conditions recruited from general practices, conducted to test and validate our new measure of empowerment. Factor analysis and regression were performed to test scale structure, internal consistency and construct validity. Results Sixteen predominantly elderly patients with different types of long-term conditions described empowerment in terms of 5 dimensions (identity, knowledge and understanding, personal control, personal decision-making, and enabling other patients). One hundred and ninety-seven survey responses were received, mainly from older white females with relatively low levels of formal education, the majority retired from paid work. Almost half of the sample reported cardiovascular, joint or diabetes long-term conditions. Factor analysis identified a three-factor solution (positive attitude and sense of control, knowledge and confidence in decision making, and enabling others), although the structure lacked clarity. A total empowerment score across all items showed acceptable levels of internal

  11. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    International Nuclear Information System (INIS)

    2013-01-01

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results
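The emphasis on population-representative survey design can be illustrated with a toy stratified sample in which regional dwelling means are combined with dwelling-count weights, so the national estimate reflects the population rather than the raw sample. The strata, sizes and radon concentrations below are made up for illustration:

```python
import random

random.seed(42)

# hypothetical strata: region -> (number of dwellings, true mean Bq/m3)
strata = {
    "north": (100_000, 120.0),
    "south": (300_000, 60.0),
}

def survey(n_per_stratum=500, sd=20.0):
    """Population-weighted national mean from equal-size stratum samples."""
    total = sum(size for size, _ in strata.values())
    estimate = 0.0
    for size, mean in strata.values():
        # simulate indoor-radon measurements in this stratum's dwellings
        sample = [random.gauss(mean, sd) for _ in range(n_per_stratum)]
        estimate += (size / total) * (sum(sample) / len(sample))
    return estimate

national_mean = survey()
```

Note that the unweighted average of the two stratum means would be biased toward the over-sampled smaller region; the dwelling-count weights remove that bias, which is the representativeness concern the review raises.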

  12. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-12-15

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results.

  13. Towards standardized testing methodologies for optical properties of components in concentrating solar thermal power plants

    Science.gov (United States)

    Sallaberry, Fabienne; Fernández-García, Aránzazu; Lüpfert, Eckhard; Morales, Angel; Vicente, Gema San; Sutter, Florian

    2017-06-01

    Precise knowledge of the optical properties of the components used in the solar field of concentrating solar thermal power plants is essential to ensure optimum power production. Those properties are measured and evaluated with different techniques and equipment, under laboratory conditions and/or in the field. Standards for such measurements, and international consensus on the appropriate techniques, are in preparation. The reference materials used as standards for the calibration of the equipment are under discussion. This paper summarizes current testing methodologies and guidelines for the characterization of the optical properties of solar mirrors and absorbers.

  14. Social and economic well-being in the conditions of the urban space: the evolution of methodological approaches in the historical urban studies

    Directory of Open Access Journals (Sweden)

    Ageev Ilya

    2016-01-01

    Full Text Available A city as a type of human settlement is characterized by high population density, well-developed infrastructure and comfortable living conditions. However, a city is also a source of social problems arising from high population density, limited resources and conflicts between the indigenous population and newcomers. The article analyzes the development of research on the city and assesses the scope of historical urban studies in developing solutions to contemporary problems of urban space. The methodological resources of historical urban studies allow the city to be explored fully as a set of historically interconnected spaces and social processes. Analysis of the problem field of historical urban studies at various stages of its formation made it possible to trace the evolution of ideas about the city as an object of scientific knowledge, to identify future prospects for research on the conditions of Russian urban development, and to improve the comfort of living in cities.

  15. Quantitative approach for optimizing e-beam condition of photoresist inspection and measurement

    Science.gov (United States)

    Lin, Chia-Jen; Teng, Chia-Hao; Cheng, Po-Chung; Sato, Yoshishige; Huang, Shang-Chieh; Chen, Chu-En; Maruyama, Kotaro; Yamazaki, Yuichiro

    2018-03-01

    Severe process margins in advanced technology nodes of semiconductor devices are controlled by e-beam metrology and e-beam inspection systems using scanning electron microscopy (SEM) images. With SEM, larger-area images with higher image quality are required to collect massive amounts of data for metrology and to detect defects over a large area for inspection. Although photoresist patterning is one of the critical processes in semiconductor device manufacturing, observing photoresist patterns by SEM is crucial yet troublesome, especially for large images. The charging effect of e-beam irradiation on photoresist patterns deteriorates image quality; it causes CD variation in metrology and makes it difficult to continue defect inspection over a long time across a large area. In this study, we established a quantitative approach for optimizing the e-beam condition with the "Die to Database" algorithm of NGR3500 on photoresist patterns to minimize the charging effect, and we enhanced measurement and inspection performance on photoresist patterns by using the optimized e-beam condition. NGR3500 is a geometry verification system based on a "Die to Database" algorithm which compares SEM images with design data [1]. By comparing SEM images and design data, key performance indicators (KPIs) of the SEM image such as sharpness, S/N, gray-level variation in the FOV and image shift can be retrieved. These KPIs were analyzed under different e-beam conditions, which consist of landing energy, probe current, scanning speed and scanning method, and the best e-beam condition could be achieved with maximum image quality, maximum scanning speed and minimum image shift. Through this quantitative approach to optimizing the e-beam condition, we could observe the dependency of the SEM condition on photoresist charging. By using the optimized e-beam condition, measurement could be continued on photoresist patterns for over 24 hours stably. KPIs of the SEM image proved image quality during measurement and
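    The condition-selection step described above, scoring KPIs over a grid of e-beam conditions, can be sketched as a simple grid search; the KPI model, weights and grids below are entirely hypothetical stand-ins for NGR3500 measurements:

```python
from itertools import product

# Hypothetical KPI model: returns (image_quality, image_shift) for a
# given e-beam condition; in practice these come from measurements.
def measure_kpis(landing_energy_eV, probe_current_pA, scan_speed):
    quality = 100 - 0.01 * probe_current_pA - 0.02 * abs(landing_energy_eV - 500)
    shift = 0.05 * probe_current_pA / scan_speed
    return quality, shift

conditions = product([300, 500, 800],   # landing energy (eV)
                     [10, 50, 100],     # probe current (pA)
                     [1, 2, 4])         # relative scanning speed

# Prefer high image quality and high scanning speed, penalize image shift
def score(cond):
    quality, shift = measure_kpis(*cond)
    return quality + 2 * cond[2] - 10 * shift

best = max(conditions, key=score)
print(best)
```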

  16. Modeling of the effect of freezer conditions on the principal constituent parameters of ice cream by using response surface methodology.

    Science.gov (United States)

    Inoue, K; Ochi, H; Taketsuka, M; Saito, H; Sakurai, K; Ichihashi, N; Iwatsuki, K; Kokubo, S

    2008-05-01

    A systematic analysis was carried out by using response surface methodology to create a quantitative model of the synergistic effects of conditions in a continuous freezer [mix flow rate (L/h), overrun (%), cylinder pressure (kPa), drawing temperature (°C), and dasher speed (rpm)] on the principal constituent parameters of ice cream [rate of fat destabilization (%), mean air cell diameter (μm), and mean ice crystal diameter (μm)]. A central composite face-centered design was used for this study. Thirty-one combinations of the 5 above-mentioned freezer conditions were designed (including replicates at the center point), and ice cream samples were manufactured and examined in a continuous freezer under the selected conditions. The responses were the 3 variables given above. A quadratic model was constructed, with the freezer conditions as the independent variables and the ice cream characteristics as the dependent variables. The coefficients of determination (R²) were greater than 0.9 for all 3 responses, but Q², the index used here for the capability of the model for predicting future observed values of the responses, was negative for both the mean ice crystal diameter and the mean air cell diameter. Therefore, pruned models were constructed by removing terms that had contributed little to the prediction in the original model and by refitting the regression model. It was demonstrated that these pruned models provided good fits to the data in terms of R², Q², and ANOVA. The effects of freezer conditions were expressed quantitatively in terms of the 3 responses. The drawing temperature (°C) was found to have a greater effect on ice cream characteristics than any of the other factors.
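    The quadratic model fitting behind response surface methodology can be sketched in a few lines. This illustration uses two coded factors rather than the study's five, and synthetic data; only the design-matrix structure (intercept, linear, interaction and squared terms) and the R² computation mirror the approach described:

```python
import numpy as np

# Illustrative two-factor example (the study used five freezer factors):
# x1 ~ drawing temperature, x2 ~ overrun, y ~ fat destabilization (%)
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)            # coded factor levels
x2 = rng.uniform(-1, 1, 30)
y = 20 + 5*x1 - 3*x2 + 2*x1*x2 + 1.5*x1**2 + rng.normal(0, 0.2, 30)

# Quadratic response-surface design matrix: intercept, linear,
# interaction, and pure quadratic terms
X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ beta
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
print(f"R^2 = {r2:.3f}")
```

Pruning, as described in the abstract, would amount to dropping columns of `X` whose coefficients contribute little and refitting.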

  17. Methodology for analyzing risk at nuclear facilities

    International Nuclear Information System (INIS)

    Yoo, Hosik; Lee, Nayoung; Ham, Taekyu; Seo, Janghoon

    2015-01-01

    Highlights: • A new methodology for evaluating risk at nuclear facilities was developed. • Five measures reflecting all factors relevant to assessing risk were developed. • Attributes covering NMAC and nuclear security culture are included in the analysis. • The newly developed methodology can be used to evaluate the risk of both existing facilities and future nuclear systems. - Abstract: A methodology for evaluating risks at nuclear facilities is developed in this work. A series of measures is drawn from the analysis of factors that determine risk. Five measures are created to evaluate risks at nuclear facilities: the legal and institutional framework, material control, physical protection system effectiveness, human resources, and consequences. Evaluation attributes are developed for each measure, and specific values are given in order to calculate the risk value quantitatively. Questionnaires are drawn up on whether or not a state has properly established a legal and regulatory framework (based on international standards); these questionnaires can be a useful measure for comparing the status of the physical protection regime between two countries. Analyzing an insider threat is not an easy task, and no methodology had previously been developed for this purpose. In this study, attributes that can quantitatively evaluate an insider threat, in the case of an unauthorized removal of nuclear materials, are developed by adopting the Nuclear Material Accounting & Control (NMAC) system. The effectiveness of a physical protection system, P(E), can be analyzed by calculating the probability of interruption, P(I), and the probability of neutralization, P(N). In this study, the Tool for Evaluating Security System (TESS) code developed by KINAC is used to calculate P(I) and P(N). Consequence is an important measure used to analyze risks at nuclear facilities; this measure comprises radiological, economic, and social damage. Social and
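    The abstract does not state how P(E) is formed from P(I) and P(N); a common approximation in physical protection analysis, assumed here purely for illustration, treats the two stages as independent and multiplies them:

```python
def system_effectiveness(p_interruption, p_neutralization):
    """Physical protection system effectiveness, under the common
    approximation P(E) = P(I) * P(N), with interruption and
    neutralization treated as independent stages (an assumption,
    not taken from the paper)."""
    return p_interruption * p_neutralization

# Hypothetical values for illustration (TESS would supply real ones)
p_i, p_n = 0.85, 0.90
print(f"P(E) = {system_effectiveness(p_i, p_n):.3f}")
```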

  18. Conditional stability for a single interior measurement

    International Nuclear Information System (INIS)

    Honda, Naofumi; McLaughlin, Joyce; Nakamura, Gen

    2014-01-01

    An inverse problem to identify unknown coefficients of a partial differential equation from a single interior measurement is considered. The equation considered in this paper is a strongly elliptic second-order scalar equation which can have complex coefficients in a bounded domain with C² boundary. We are given a single interior measurement, meaning that we know a given solution of the forward equation in this domain. The equation includes some model equations arising from acoustics, viscoelasticity and hydrology. We assume that the coefficients are piecewise analytic. Our major result is a local Hölder stability estimate for identifying the unknown coefficients. If the unknown coefficient is a complex coefficient in the principal part of the equation, we assume a condition, which we call the admissibility assumption, on the real and imaginary parts of the difference of the two complex coefficients. This admissibility assumption is automatically satisfied if the complex coefficients are real valued. For identifying either the real coefficient in the principal part or the coefficient of the 0th order term of the equation, the major result implies global uniqueness for the identification. (paper)

  19. Identification of complex model thermal boundary conditions based on exterior temperature measurement

    International Nuclear Information System (INIS)

    Lu Jianming; Ouyang Guangyao; Zhang Ping; Rong Bojun

    2012-01-01

    Combining the advantages of finite element software for temperature field analysis with a multivariate function optimization algorithm, a feasible method based on exterior temperature measurements was proposed to obtain the thermal boundary conditions required for temperature field analysis. The thermal boundary conditions can be obtained from only a few temperature measurement values. Taking the identification of the convection heat transfer coefficient of a high-power-density diesel engine cylinder head as an example, the calculation results show that when the temperature measurement error was less than 0.5°C, the maximum relative error was less than 2%. It is shown that the new method is feasible. (authors)
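    The identification idea can be sketched with a toy one-dimensional steady conduction model standing in for the finite element model; all temperatures, resistances and bounds below are hypothetical, and only the structure follows the paper: minimize the mismatch between predicted and measured exterior temperature over the unknown convection coefficient h.

```python
from scipy.optimize import minimize_scalar

# Toy steady 1-D forward model: gas -> interior convection (h, unknown)
# -> wall conduction -> exterior convection (h_out, known) -> ambient.
# All numbers are hypothetical.
T_gas, T_amb = 900.0, 25.0        # °C
R_wall, h_out = 0.002, 50.0       # m^2·K/W, W/(m^2·K)

def exterior_surface_temp(h):
    q = (T_gas - T_amb) / (1.0 / h + R_wall + 1.0 / h_out)  # heat flux
    return T_amb + q / h_out

# "Measured" exterior temperature generated with a known true h = 400
T_measured = exterior_surface_temp(400.0)

# Identify h by minimizing the squared mismatch (objective and bounds
# are ours, not the paper's)
res = minimize_scalar(lambda h: (exterior_surface_temp(h) - T_measured) ** 2,
                      bounds=(10.0, 2000.0), method="bounded")
print(f"identified h = {res.x:.1f} W/(m^2·K)")
```

With several measurement points and several unknown coefficients, the scalar minimization would become a multivariate least-squares problem, which is the setting the paper addresses.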

  20. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    International Nuclear Information System (INIS)

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    The Framatome ANP Realistic Large-Break LOCA methodology (FANP RLBLOCA) is an analysis approach approved by the US NRC for supporting the licensing basis of 3- and 4-loop Westinghouse PWRs and CE 2x4 PWRs. It was developed consistent with the NRC's Code Scaling, Applicability, and Uncertainty (CSAU) methodology for performing best-estimate large-break LOCA analysis. The CSAU methodology consists of three key elements, with the second and third elements addressing uncertainty identification and application. Unique to the CSAU methodology is the use of engineering judgment and the Phenomena Identification and Ranking Table (PIRT), defined in the first element, to lay the groundwork for achieving the ultimate goal of quantifying the total uncertainty in predicted measures of interest associated with the large-break LOCA. It is the PIRT that not only directs the methodology development, but also directs the methodology review. While the FANP RLBLOCA methodology was generically approved, a plant-specific application is customized in two ways, addressing how the unique plant characterization 1) is translated to code input and 2) relates to the unique methodology licensing requirements. Regarding the former, plants are required by 10 CFR 50.36 to define a technical specification limiting condition for operation based on the following criteria: 1. Installed instrumentation that is used in the control room to detect, and indicate, a significant abnormal degradation of the reactor coolant pressure boundary. 2. A process variable, design feature, or operating restriction that is an initial condition of a design basis accident or transient analysis that either assumes the failure of or presents a challenge to the integrity of a fission product barrier. 3. A structure, system, or component that is part of the primary success path and which functions or actuates to mitigate a design basis accident or transient that either assumes the failure of or presents a challenge to the integrity of a

  1. Chronic condition self-management surveillance: what is and what should be measured?

    Science.gov (United States)

    Ruiz, Sarah; Brady, Teresa J; Glasgow, Russell E; Birkel, Richard; Spafford, Michelle

    2014-06-19

    The rapid growth in chronic disease prevalence, in particular the prevalence of multiple chronic conditions, poses a significant and increasing burden on the health of Americans. Maximizing the use of proven self-management (SM) strategies is a core goal of the US Department of Health and Human Services. Yet, there is no systematic way to assess how much SM or self-management support (SMS) is occurring in the United States. The purpose of this project was to identify appropriate concepts or measures to incorporate into national SM and SMS surveillance. A multistep process was used to identify candidate concepts, assess existing measures, and select high-priority concepts for further development. A stakeholder survey, an environmental scan, subject matter expert feedback, and a stakeholder priority-setting exercise were all used to select the high-priority concepts for development. The stakeholder survey gathered feedback on 32 candidate concepts; 9 concepts were endorsed by more than 66% of respondents. The environmental scan indicated few existing measures that adequately reflected the candidate concepts, and those that were identified were generally specific to a defined condition and not gathered on a population basis. On the basis of the priority setting exercises and environmental scan, we selected 1 concept from each of 5 levels of behavioral influence for immediate development as an SM or SMS indicator. The absence of any available measures to assess SM or SMS across the population highlights the need to develop chronic condition SM surveillance that uses national surveys and other data sources to measure national progress in SM and SMS.

  2. Methodological aspects and development of techniques for neutron activation analysis of microcomponents in materials of geologic origin

    International Nuclear Information System (INIS)

    Cohen, I.M.

    1982-01-01

    Some aspects of activation analysis methodology applied to geological samples activated in nuclear reactors were studied, and techniques were developed for the determination of various elements in different types of matrices, using gamma spectrometry for the measurement of the products. Consideration of the methodological aspects includes the study of working conditions, the preparation of samples and standards, irradiation, treatment of the irradiated material, radiochemical separation and measurement. Experiments were carried out on reproducibility and errors in relation to the behaviour of the measurement equipment and of the methods of area calculation (total area, Covell and Wasson), as well as on the effects of geometry variations on the results of the measurements, the RA-3 reactor's flux variations, and the homogeneity of the samples and standards. Also studied were: the selection of the conditions of determination, including the irradiation and decay times; irradiation with thermal and epithermal neutrons; measurement with the use of absorbers; and the resolution of complex peaks. Both non-destructive and radiochemical separation techniques were developed for the analysis of 5 types of geological materials. These methods were applied to the following determinations: a) In, Cd, Mn, Ga and Co in blende; b) La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb and Lu in fluorites; c) La, Ca, Eu, Tb, Yb, Se and Th in barites and celestites; d) Cu and Zn in soils. Spectral interferences and those due to nuclear reactions were studied and evaluated by mathematical calculation. (M.E.L.) [es

  3. Experimental investigation on local parameter measurement using optical probes in two-phase flow under rolling condition

    International Nuclear Information System (INIS)

    Tian Daogui; Sun Licheng; Yan Changqi; Liu Guoqiang

    2013-01-01

    In order to obtain more local interfacial information and to further understand the intrinsic mechanism of two-phase flow under rolling conditions, a method is proposed in this paper to measure the local parameters using optical probes under rolling conditions. An experimental investigation of two-phase flow under rolling conditions was conducted using probes fabricated by the authors. It is verified that the probe method is feasible for measuring local parameters in two-phase flow under rolling conditions. The results show that the distribution of interfacial parameters in the near-wall region has a distinct periodicity due to the rolling motion. The average deviation of the void fraction measured by the probe from that obtained from the measured pressure drop is about 8%. (authors)

  4. Characterization of Melanogenesis Inhibitory Constituents of Morus alba Leaves and Optimization of Extraction Conditions Using Response Surface Methodology.

    Science.gov (United States)

    Jeong, Ji Yeon; Liu, Qing; Kim, Seon Beom; Jo, Yang Hee; Mo, Eun Jin; Yang, Hyo Hee; Song, Dae Hye; Hwang, Bang Yeon; Lee, Mi Kyeong

    2015-05-14

    Melanin is a natural pigment that plays an important role in the protection of skin; however, hyperpigmentation caused by excessive levels of melanin is associated with several problems. Therefore, melanogenesis-inhibitory natural products have been developed by the cosmetic industry as skin medications. The leaves of Morus alba (Moraceae) have been reported to inhibit melanogenesis; therefore, characterization of the melanogenesis-inhibitory constituents of M. alba leaves was attempted in this study. Twenty compounds, including eight benzofurans, 10 flavonoids, one stilbenoid and one chalcone, were isolated from M. alba leaves, and these phenolic constituents were shown to significantly inhibit tyrosinase activity and melanin content in B16F10 melanoma cells. To maximize the melanogenesis-inhibitory activity and active phenolic contents, optimized M. alba leaf extraction conditions were predicted using response surface methodology as a methanol concentration of 85.2%, an extraction temperature of 53.2 °C and an extraction time of 2 h. The tyrosinase inhibition and total phenolic content under optimal conditions were found to be 74.8% inhibition and 24.8 μg GAE/mg extract, which matched well with the predicted values of 75.0% inhibition and 23.8 μg GAE/mg extract. These results provide useful information about melanogenesis-inhibitory constituents and optimized extracts from M. alba leaves as cosmetic therapeutics to reduce skin hyperpigmentation.
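    The optimum-prediction step of response surface methodology amounts to locating the stationary point of a fitted quadratic. A sketch with hypothetical coefficients (not the paper's fitted model) in two coded variables:

```python
import numpy as np

# Hypothetical fitted quadratic for tyrosinase inhibition (%) in coded
# variables x1 ~ methanol fraction, x2 ~ temperature (illustrative only)
b0, b1, b2 = 70.0, 4.0, 2.0
b11, b22, b12 = -6.0, -3.0, 1.0

def inhibition(x):
    x1, x2 = x
    return b0 + b1*x1 + b2*x2 + b11*x1**2 + b22*x2**2 + b12*x1*x2

# Stationary point of the quadratic: solve grad y = 0, a 2x2 linear
# system with the Hessian as coefficient matrix
H = np.array([[2*b11, b12], [b12, 2*b22]])
x_opt = np.linalg.solve(H, [-b1, -b2])
print(x_opt, inhibition(x_opt))
```

Since the Hessian here is negative definite, the stationary point is a maximum; decoding `x_opt` back to physical units would give the predicted optimal extraction conditions.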

  5. Characterization of Melanogenesis Inhibitory Constituents of Morus alba Leaves and Optimization of Extraction Conditions Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Ji Yeon Jeong

    2015-05-01

    Full Text Available Melanin is a natural pigment that plays an important role in the protection of skin; however, hyperpigmentation caused by excessive levels of melanin is associated with several problems. Therefore, melanogenesis-inhibitory natural products have been developed by the cosmetic industry as skin medications. The leaves of Morus alba (Moraceae) have been reported to inhibit melanogenesis; therefore, characterization of the melanogenesis-inhibitory constituents of M. alba leaves was attempted in this study. Twenty compounds, including eight benzofurans, 10 flavonoids, one stilbenoid and one chalcone, were isolated from M. alba leaves, and these phenolic constituents were shown to significantly inhibit tyrosinase activity and melanin content in B16F10 melanoma cells. To maximize the melanogenesis-inhibitory activity and active phenolic contents, optimized M. alba leaf extraction conditions were predicted using response surface methodology as a methanol concentration of 85.2%, an extraction temperature of 53.2 °C and an extraction time of 2 h. The tyrosinase inhibition and total phenolic content under optimal conditions were found to be 74.8% inhibition and 24.8 μg GAE/mg extract, which matched well with the predicted values of 75.0% inhibition and 23.8 μg GAE/mg extract. These results provide useful information about melanogenesis-inhibitory constituents and optimized extracts from M. alba leaves as cosmetic therapeutics to reduce skin hyperpigmentation.

  6. Multi-Population Invariance with Dichotomous Measures: Combining Multi-Group and MIMIC Methodologies in Evaluating the General Aptitude Test in the Arabic Language

    Science.gov (United States)

    Sideridis, Georgios D.; Tsaousis, Ioannis; Al-harbi, Khaleel A.

    2015-01-01

    The purpose of the present study was to extend the model of measurement invariance by simultaneously estimating invariance across multiple populations in the dichotomous instrument case using multi-group confirmatory factor analytic and multiple indicator multiple causes (MIMIC) methodologies. Using the Arabic version of the General Aptitude Test…

  7. Ecosystem-atmosphere exchange of carbon in a heathland under future climatic conditions

    DEFF Research Database (Denmark)

    Selsted, Merete Bang

    on ecosystem-atmosphere exchange of carbon in a heathland under future climatic conditions, shows that extended summer drought in combination with elevated temperature will ensure permanent dryer soil conditions, which decreases carbon turnover, while elevated atmospheric CO2 concentrations will increase...... carbon turnover. In the full future climate scenario, carbon turnover is over all expected to increase and the heathland to become a source of atmospheric CO2. The methodology of static chamber CO2 flux measurements and applying the technology in a FACE (free air CO2 enrichment) facility is a challenge...... on the atmospheric CO2 concentration. Photosynthesis and respiration run in parallel during measurements of net ecosystem exchange, and these measurements should therefore be performed with care to both the atmospheric CO2 concentration and the CO2 soil-atmosphere gradient....

  8. Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    David E. Shropshire

    2009-05-01

    The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade-studies, and requires a requisite reference cost basis to support adequate analysis rigor. In this regard, the AFCI program has created a reference set of economic documentation. The documentation consists of the “Advanced Fuel Cycle (AFC) Cost Basis” report (Shropshire, et al. 2007), “AFCI Economic Analysis” report, and the “AFCI Economic Tools, Algorithms, and Methodologies Report.” Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. The application of the reference cost data in the cost and econometric systems analysis models will be supported by this report. These methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market—domestic and internationally—and impacts on AFCI facility deployment, uranium resource modeling to inform the front-end fuel cycle costs, facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities, cost tradeoffs to meet nuclear non-proliferation requirements, and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models. VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team. Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and will provide a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle. Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from

  9. Radiation protection optimization using a knowledge based methodology

    International Nuclear Information System (INIS)

    Reyes-Jimenez, J.; Tsoukalas, L.H.

    1991-01-01

    This paper presents a knowledge-based methodology for radiological planning and radiation protection optimization. The cost-benefit methodology described in International Commission on Radiological Protection Report No. 37 is employed within a knowledge-based framework for the purpose of optimizing radiation protection and planning maintenance activities [1, 2]. The methodology is demonstrated through an application to a heating, ventilation and air conditioning (HVAC) system. HVAC is used to reduce radioactivity concentration levels in selected contaminated multi-compartment models at nuclear power plants when higher than normal radiation levels are detected. The overall objective is to reduce personnel exposure resulting from airborne radioactivity when routine or maintenance access is required in contaminated areas [2]. 2 figs, 15 refs

  10. Feasibility, strategy, methodology, and analysis of probe measurements in plasma under high gas pressure

    Science.gov (United States)

    Demidov, V. I.; Koepke, M. E.; Kurlyandskaya, I. P.; Malkov, M. A.

    2018-02-01

    This paper reviews existing theories for interpreting probe measurements of electron distribution functions (EDF) at high gas pressure when collisions of electrons with atoms and/or molecules near the probe are pervasive. An explanation of whether or not the measurements are realizable and reliable, an enumeration of the most common sources of measurement error, and an outline of proper probe-experiment design elements that inherently limit or avoid error is presented. Additionally, we describe recent expanded plasma-condition compatibility for EDF measurement, including in applications of large wall probe plasma diagnostics. This summary of the authors’ experiences gained over decades of practicing and developing probe diagnostics is intended to inform, guide, suggest, and detail the advantages and disadvantages of probe application in plasma research.
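    The classical basis for probe EDF measurement, which the high-pressure theories reviewed in this paper extend with collisional corrections, is the Druyvesteyn relation: the electron energy probability function is proportional to the second derivative of the probe current with respect to probe voltage. A minimal sketch on a synthetic Maxwellian characteristic (all values illustrative, collisional corrections omitted):

```python
import numpy as np

# Synthetic retarding-region probe characteristic for a Maxwellian EDF:
# I_e ∝ exp(-(V_p - V)/T_e); numbers are illustrative, not from the paper.
T_e, V_p = 2.0, 0.0                       # electron temperature (eV), plasma potential (V)
V = np.linspace(-10.0, -1.0, 181)
I = 1e-3 * np.exp((V - V_p) / T_e)        # electron current (A)

# Druyvesteyn: EEPF ∝ d^2 I / dV^2 (numerical second derivative)
d2I = np.gradient(np.gradient(I, V), V)

# For a Maxwellian, ln(d2I/dV2) is linear in V with slope 1/T_e
slope = np.polyfit(V, np.log(d2I), 1)[0]
print(f"recovered T_e = {1.0 / slope:.2f} eV")
```

With real data, differentiation amplifies noise, which is one reason the paper stresses experiment design; smoothing or analog differentiation is typically needed before applying the relation.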

  11. Methodology of the Auditing Measures to Civil Airport Security and Protection

    Directory of Open Access Journals (Sweden)

    Ján Kolesár

    2016-10-01

    Full Text Available Airports, similarly to other companies, are certified in compliance with the International Organization for Standardization (ISO) standards for products and services (the ISO 9000 series on quality management), which coordinate the technical side of standardization at an international scale. In order for airports to meet the norms and certification requirements of ISO, they are liable to undergo strict quality audits, as a rule conducted by an independent auditing organization. The focus of the audits is primarily on airport operation economics and security. The article is an analysis of the methodology of airport security audit processes and activities. Within the framework of planning, the sequence of steps is described in line with the principles and procedures of the Security Management System (SMS) and standards established by the International Organization for Standardization (ISO). The methodology for conducting an airport security audit is developed in compliance with the national programme and international legislation standards (Annex 17) applicable to the protection of civil aviation against acts of unlawful interference.

  12. Eddy correlation measurements in wet environmental conditions

    Science.gov (United States)

    Cuenca, R. H.; Migliori, L.; O Kane, J. P.

    2003-04-01

    The lower Feale catchment is a low-lying peaty area of 200 km^2 situated in southwest Ireland that is subject to inundation by flooding. The catchment lies adjacent to the Feale River and is subject to tidal signals as well as runoff processes. Various mitigation strategies are being investigated to reduce the damage due to flooding. Part of the effort has required development of a detailed hydrologic balance for the study area which is a wet pasture environment with local field drains that are typically flooded. An eddy correlation system was installed in the summer of 2002 to measure components of the energy balance, including evapotranspiration, along with special sensors to measure other hydrologic variables particular to this study. Data collected will be essential for validation of surface flux models to be developed for this site. Data filtering is performed using a combination of software developed by the Boundary-Layer Group (BLG) at Oregon State University together with modifications made to this system for conditions at this site. This automated procedure greatly reduces the tedious inspection of individual records. The package of tests, developed by the BLG for both tower and aircraft high frequency data, checks for electronic spiking, signal dropout, unrealistic magnitudes, extreme higher moment statistics, as well as other error scenarios not covered by the instrumentation diagnostics built into the system. Critical parameter values for each potential error were developed by applying the tests to real fast response turbulent time series. Potential instrumentation problems, flux sampling problems, and unusual physical situations records are flagged for removal or further analysis. A final visual inspection step is required to minimize rejection of physically unusual but real behavior in the time series. The problems of data management, data quality control, individual instrumentation sensitivity, potential underestimation of latent and sensible heat

  13. Counting stem cells : methodological constraints

    NARCIS (Netherlands)

    Bystrykh, Leonid V.; Verovskaya, Evgenia; Zwart, Erik; Broekhuis, Mathilde; de Haan, Gerald

    The number of stem cells contributing to hematopoiesis has been a matter of debate. Many studies use retroviral tagging of stem cells to measure clonal contribution. Here we argue that methodological factors can impact such clonal analyses. Whereas early studies had low resolution, leading to

  14. A methodology for costing man-rem

    International Nuclear Information System (INIS)

    Bieber, C.

    1976-03-01

    An attempt is made to provide a methodology for costing man-rem in a way that can be applied to station conditions, based on 1974 Pickering G.S. data. Factors taken into account were social costs, exposure costs (dose accounting, training, dosimetry), temporary labour costs, and permanent replacement labour costs. A figure of $620/man-rem was derived. (LL)

  15. Numerical abilities in fish: A methodological review.

    Science.gov (United States)

    Agrillo, Christian; Miletto Petrazzini, Maria Elena; Bisazza, Angelo

    2017-08-01

    The ability to utilize numerical information can be adaptive in a number of ecological contexts including foraging, mating, parental care, and anti-predator strategies. Numerical abilities of mammals and birds have been studied both in natural conditions and in controlled laboratory conditions using a variety of approaches. During the last decade this ability was also investigated in some fish species. Here we review the main methods used to study this group, highlighting the strengths and weaknesses of each method. Fish have only been studied under laboratory conditions, and among the methods used with other species, only two have been systematically used in fish: spontaneous choice tests and discrimination learning procedures. In the former case, the choice between two options is observed in a biologically relevant situation, and the degree of preference for the larger/smaller group is taken as a measure of the capacity to discriminate the two quantities (e.g., two shoals differing in number). In discrimination learning tasks, fish are trained to select the larger or the smaller of two sets of abstract objects, typically two-dimensional geometric figures, using food or social companions as reward. Beyond methodological differences, what emerges from the literature is a substantial similarity of the numerical abilities of fish to those of the other vertebrates studied. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Environmental Testing Methodology in Biometrics

    OpenAIRE

    Fernández Saavedra, Belén; Sánchez Reíllo, Raúl; Alonso Moreno, Raúl; Miguel Hurtado, Óscar

    2010-01-01

    8-page document + 5-slide presentation.-- Contributed to: 1st International Biometric Performance Conference (IBPC 2010, NIST, Gaithersburg, MD, US, Mar 1-5, 2010). Biometrics is now used in many security systems, and these systems can be located in different environments. As many experts claim and previous works have demonstrated, environmental conditions influence biometric performance. Nevertheless, there is not a specific methodology for testing this influence at the moment...

  17. The measurement of interplanetary scintillations in conditions of strong radio interference

    International Nuclear Information System (INIS)

    Duffett-Smith, P.J.

    1980-01-01

    Observations of interplanetary scintillations (IPS) are often severely limited by interference from man-made transmissions within the receiver pass-band. A new method of measuring IPS is described which can give useful data even in conditions of bad interference. (author)

  18. Report on use of a methodology for commissioning and quality assurance of a VMAT system.

    Directory of Open Access Journals (Sweden)

    Charles Mayo

    INTRODUCTION: Results of the use of a methodology for VMAT commissioning and quality assurance, utilizing both control point tests and dosimetric measurements, are presented. METHODS AND MATERIALS: A generalizable phantom measurement approach is used to characterize the accuracy of the measurement system. Correction for the angular response of the measurement system and inclusion of couch structures are used to characterize the full range of gantry angles desirable for clinical plans. A dose-based daily QA measurement approach is defined. RESULTS: Agreement in the static vs. VMAT picket fence control point test was better than 0.5 mm. Control point tests varying gantry rotation speed, leaf speed and dose rate demonstrated agreement with predicted values better than 1%. The angular dependence of the MatriXX array varied over a range of 0.94-1.06 with respect to the calibration condition. Phantom measurements demonstrated central axis dose accuracy for un-modulated four-field box plans of 2.5% vs. 1% with and without angular correction respectively, with better results for VMAT (0.4%) than for IMRT (1.6%) plans. Daily QA results demonstrated average agreement of all three chambers within 0.4% over a 9-month period, with no false positives at a 3% threshold. DISCUSSION: The methodology described is simple in design and characterizes both the inherent limitations of the measurement system as well as the dose-based measurements that may be directly related to patient plan QA.
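A constancy check of this kind is straightforward to automate. The sketch below is a hypothetical illustration, not the authors' software: it compares each chamber's daily reading with its baseline and applies a 3% action threshold, as in the daily QA described above.

```python
def daily_qa_check(measured, baseline, tol=0.03):
    """Per-chamber relative deviation from baseline; pass only if all
    chambers are within the tolerance (default 3% action threshold)."""
    devs = [abs(m - b) / b for m, b in zip(measured, baseline)]
    return all(d <= tol for d in devs), devs

# Example with three chambers (illustrative readings):
ok, devs = daily_qa_check([1.002, 0.999, 1.001], [1.0, 1.0, 1.0])
```

A failed check would flag the machine for investigation before treatment.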

  19. Methods for measuring of fuel can deformation under radiation conditions

    International Nuclear Information System (INIS)

    Zelenchuk, A.V.; Fetisov, B.V.; Lakin, Yu.G.; Tonkov, V.Yu.

    1978-01-01

    The possibility of measuring fuel can deformation under radiation conditions by means of the acoustic method and tensoresistors is considered. The construction and operation of the in-pile facility for measuring creep of a fuel can specimen loaded by internal pressure are described. Data on the effect of neutron radiation on changes in the creep rate of zirconium fuel cans are presented. The results obtained with tensoresistors are in good agreement with those obtained by the acoustic method, which makes it possible to recommend both methods for irradiation creep investigations of fuel element cans

  20. Standardization of test conditions for gamma camera performance measurement

    International Nuclear Information System (INIS)

    Jordan, K.

    1982-02-01

    The usual way of measuring gamma camera performance is to use point sources or flood sources in air, often in combination with bar phantoms. This method has nothing in common with the use of a camera in clinical practice. Particularly in the case of low-energy emitters like Tc-99m, the influence of scattered radiation on the performance of cameras is very high. The IEC document 'Characteristics and test conditions of radionuclide imaging devices' is discussed

  1. Climate change impact on shallow groundwater conditions in Hungary: Conclusions from a regional modelling study

    Science.gov (United States)

    Kovács, Attila; Marton, Annamária; Tóth, György; Szöcs, Teodóra

    2016-04-01

    A quantitative methodology has been developed for the calculation of the groundwater table based on measured and simulated climate parameters. The aim of the study was to develop a toolset which can be used for the calculation of shallow groundwater conditions for various climate scenarios. This was done with the goal of facilitating the assessment of climate impact and the vulnerability of shallow groundwater resources. The simulated groundwater table distributions are representative of groundwater conditions at the regional scale. The introduced methodology is valid for modelling purposes at various scales and thus represents a versatile tool for the assessment of climate vulnerability of shallow groundwater bodies. The calculation modules include the following: 1. A toolset to calculate climate zonation from climate parameter grids, 2. Delineation of recharge zones (Hydrological Response Units, HRUs) based on geology, land use and slope conditions, 3. Calculation of percolation (recharge) rates using 1D analytical hydrological models, 4. Simulation of the groundwater table using numerical groundwater flow models. The applied methodology provides a quantitative link between climate conditions and shallow groundwater conditions, and thus can be used for assessing climate impacts. The climate data source applied in our calculation comprised interpolated daily climate data of the Central European CARPATCLIM database. Climate zones were determined using the Thornthwaite climate zonation scheme. Recharge zones (HRUs) were determined based on surface geology, land use and slope conditions. The HELP hydrological model was used for the calculation of the 1D water balance for hydrological response units. The MODFLOW numerical groundwater modelling code was used for the calculation of the water table.
The developed methodology was demonstrated through the simulation of regional groundwater table using spatially averaged climate data and hydrogeological properties for various time
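As an illustration of the climate-zonation step, monthly potential evapotranspiration under the Thornthwaite scheme mentioned above can be computed from mean monthly temperatures. This is a minimal sketch of the standard Thornthwaite (1948) formula, not code from the study.

```python
def heat_index(monthly_temps):
    """Annual heat index I from twelve mean monthly temperatures (deg C);
    months at or below 0 deg C contribute nothing."""
    return sum((t / 5.0) ** 1.514 for t in monthly_temps if t > 0)

def thornthwaite_pet(t_mean, I):
    """Unadjusted monthly potential evapotranspiration (mm) for a month
    with mean temperature t_mean, given the annual heat index I."""
    if t_mean <= 0:
        return 0.0
    a = 6.75e-7 * I**3 - 7.71e-5 * I**2 + 1.792e-2 * I + 0.49239
    return 16.0 * (10.0 * t_mean / I) ** a
```

In practice the monthly values are further corrected for day length and latitude before entering a water-balance model such as HELP.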

  2. The comparison of various approaches to evaluating erosion risks and designing erosion control measures

    Science.gov (United States)

    Kapicka, Jiri

    2015-04-01

    At present there is a single methodology in the Czech Republic for computing and comparing erosion risks, which also includes a method for designing erosion control measures. It is based on the Universal Soil Loss Equation (USLE) and its result, the long-term average annual soil loss (G), and is used by landscape planners. Data from the database of erosion events in the Czech Republic show that many problems and damages arise from local erosion episodes. The extent and impact of these events depend on local precipitation, the current crop growth stage and soil conditions. Such events can damage agricultural land, municipal property and water-management structures even at locations that appear satisfactory in terms of long-term average annual soil loss. An alternative way to compute and compare erosion risks is an event-based (episode) approach. This paper compares various approaches to computing erosion risks. The comparison was carried out for a site from the database of erosion events on agricultural land in the Czech Republic where two erosion events had been recorded. The study area is a simple agricultural parcel without barriers that could strongly influence water flow and sediment transport. The erosion-risk computations for all methodologies were based on laboratory analyses of soil samples taken in the study area. Results of the USLE and MUSLE methodologies and of the mathematical model Erosion 3D were compared. Differences in the spatial distribution of the locations with the highest soil erosion were compared and discussed. A further part presents differences in the design of erosion control measures based on the different methodologies. The results show the variance in erosion risks computed by the different methodologies. 
These variances can start a discussion about different approaches to computing and evaluating erosion risks in areas
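The two empirical models compared above can be sketched directly. USLE gives long-term average annual soil loss A = R·K·LS·C·P, while MUSLE replaces the rainfall erosivity factor R with an event runoff term. The factor values used below are placeholders, not the study's data.

```python
def usle_annual_soil_loss(R, K, LS, C, P):
    """Long-term average annual soil loss A = R * K * LS * C * P,
    with rainfall erosivity R, soil erodibility K, slope factor LS,
    cover factor C and support-practice factor P."""
    return R * K * LS * C * P

def musle_sediment_yield(Q, qp, K, LS, C, P):
    """Event sediment yield: the runoff volume Q (m^3) and peak flow
    qp (m^3/s) replace the rainfall erosivity factor R of USLE."""
    return 11.8 * (Q * qp) ** 0.56 * K * LS * C * P
```

The contrast between the long-term USLE average and the event-based MUSLE yield is exactly the methodological difference the paper discusses.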

  3. Improving Training in Methodology Enriches the Science of Psychology

    Science.gov (United States)

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2009-01-01

    Replies to the comment "Ramifications of increased training in quantitative methodology" by Herbert Zimiles on the current authors' original article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America". The…

  4. Measuring alterations in oscillatory brain networks in schizophrenia with resting-state MEG: State-of-the-art and methodological challenges.

    Science.gov (United States)

    Alamian, Golnoush; Hincapié, Ana-Sofía; Pascarella, Annalisa; Thiery, Thomas; Combrisson, Etienne; Saive, Anne-Lise; Martel, Véronique; Althukov, Dmitrii; Haesebaert, Frédéric; Jerbi, Karim

    2017-09-01

    Neuroimaging studies provide evidence of disturbed resting-state brain networks in Schizophrenia (SZ). However, untangling the neuronal mechanisms that subserve these baseline alterations requires measurement of their electrophysiological underpinnings. This systematic review specifically investigates the contributions of resting-state Magnetoencephalography (MEG) in elucidating abnormal neural organization in SZ patients. A systematic literature review of resting-state MEG studies in SZ was conducted. This literature is discussed in relation to findings from resting-state fMRI and EEG, as well as to task-based MEG research in the SZ population. Importantly, methodological limitations are considered and recommendations to overcome current limitations are proposed. The resting-state MEG literature in SZ points towards altered local and long-range oscillatory network dynamics in various frequency bands. Critical methodological challenges with respect to experiment design, and data collection and analysis, need to be taken into consideration. Spontaneous MEG data show that local and global neural organization is altered in SZ patients. MEG is a highly promising tool to fill in knowledge gaps about the neurophysiology of SZ. However, to reach its fullest potential, basic methodological challenges need to be overcome. MEG-based resting-state power and connectivity findings could be great assets to clinical and translational research in psychiatry, and SZ in particular. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  5. Measurement of fat in ovine milk: comparative study of results acquired by the official methodology and by electronic equipment

    Directory of Open Access Journals (Sweden)

    Luiz Gustavo de Pellegrini

    2016-06-01

    The aim of this work was to compare the official method required by Brazilian law with a photometric measurement instrument and an ultrasound spectroscopy instrument for quantifying total lipids in ovine milk, in order to determine which equipment best establishes the lipid level. The experiment took place in the Food Technology Department of the Federal University of Santa Maria, together with the dairy school plant and the sheep unit of the Animal Science Department. Twelve half-blood Lacaune Lait ewes were milked individually from the first to the tenth week of lactation. Milking was performed manually, and the analyses were carried out after refrigeration of the samples. Before the analyses, the samples were homogenized and their fat content was then evaluated, in triplicate, by three distinct methodologies: the official Brazilian methodology using the Gerber butyrometer (GB), the photometric measurement instrument Milko-Tester® (MT) and the ultrasound spectroscopy instrument Lactoscan 90® (LS). The reproducibility of the LS equipment was 100% for the analyzed samples, whereas the MT equipment showed variability in most of the analyzed samples, reproducing the results in only 22.5% of them; for the remaining samples it gave overestimated values in 50% of cases and underestimated values in 27.5%. Therefore, the results of this study allow us to conclude that analysis of ovine milk by ultrasound spectroscopy is efficient for the fat parameter when compared with the official Brazilian methodology.

  6. RAMS data collection under Arctic conditions

    International Nuclear Information System (INIS)

    Barabadi, Abbas; Tobias Gudmestad, Ove; Barabady, Javad

    2015-01-01

    Reliability, availability, maintainability and supportability (RAMS) analysis is an important step in the design and operation of production processes and technology. Historical data such as times between failures and times to repair play an important role in such analysis. The data must reflect the conditions that the equipment has experienced during its operating time. To gain a precise understanding of these conditions, all factors influencing the failure and repair processes of a production facility in an Arctic environment need to be identified and collected in the database. However, little attention has been paid to recording the effect of such influence factors in reliability, availability, maintainability and supportability databases. Hence, the aim of this paper is to discuss the challenges of the available methods of data collection and to suggest a methodology for data collection that considers the effect of environmental conditions. Application of the methodology will make the historical RAMS data of a system more applicable and useful for the design and operation of the system in different types of operational environments. - Highlights: • The challenges related to the use of available RAMS data are discussed. • It is important to collect information about operational conditions in RAMS data. • A methodology for RAMS data collection considering environmental conditions is suggested. • Information about influence factors will make the results of RAMS analysis more applicable
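A minimal sketch of the suggested idea: tag each failure/repair record with its operating condition, then summarize MTBF, MTTR and inherent availability per condition. The field names are illustrative assumptions, not the paper's data schema.

```python
def rams_summary(events):
    """events: list of dicts with 'tbf' (time between failures, h),
    'ttr' (time to repair, h) and a 'condition' tag (e.g. 'arctic').
    Returns per-condition MTBF, MTTR and inherent availability
    A = MTBF / (MTBF + MTTR)."""
    by_condition = {}
    for e in events:
        by_condition.setdefault(e["condition"], []).append(e)
    summary = {}
    for cond, recs in by_condition.items():
        mtbf = sum(r["tbf"] for r in recs) / len(recs)
        mttr = sum(r["ttr"] for r in recs) / len(recs)
        summary[cond] = {"MTBF": mtbf, "MTTR": mttr,
                         "A": mtbf / (mtbf + mttr)}
    return summary
```

Splitting the statistics by condition is what makes the historical data transferable between operating environments, which is the point the paper argues.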

  7. Radiometric ratio characterization for low-to-mid CPV modules operating in variable irradiance conditions

    Science.gov (United States)

    Vorndran, Shelby; Russo, Juan; Zhang, Deming; Gordon, Michael; Kostuk, Raymond

    2012-10-01

    In this work, a concentrating photovoltaic (CPV) design methodology is proposed which aims to maximize system efficiency for a given irradiance condition. In this technique, the acceptance angle of the system is radiometrically matched to the angular spread of the site's average irradiance conditions using a simple geometric ratio. The optical efficiency of CPV systems from flat-plate to high-concentration is plotted for all irradiance conditions. Concentrator systems are measured outdoors in various irradiance conditions to test the methodology. This modeling technique is valuable at the design stage for determining the ideal level of concentration for a CPV module. It requires only two inputs: the acceptance angle profile of the system and the site's average direct and diffuse irradiance fractions. The acceptance angle can be determined by raytracing or by testing a fabricated prototype in the lab with a solar simulator. The average irradiance conditions can be found in the Typical Meteorological Year (TMY3) database. Additionally, the information gained from this technique can be used to determine tracking tolerance, quantify power loss during an isolated weather event, and perform more sophisticated analyses such as I-V curve simulation.
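The radiometric matching idea can be sketched as follows, assuming an isotropic diffuse sky, so the diffuse irradiance arriving within a cone of half-angle θ scales as sin²θ. This is an illustrative simplification, not the authors' exact model.

```python
import math

def collected_fraction(direct_frac, diffuse_frac, half_angle_deg,
                       pointing_error_deg=0.0):
    """Fraction of total irradiance collected by a concentrator with the
    given acceptance half-angle, for site-average direct/diffuse fractions."""
    # The direct beam is collected only while the pointing error stays
    # inside the acceptance cone.
    direct = direct_frac if pointing_error_deg <= half_angle_deg else 0.0
    # Isotropic diffuse irradiance within a cone of half-angle theta
    # is proportional to sin^2(theta).
    diffuse = diffuse_frac * math.sin(math.radians(half_angle_deg)) ** 2
    return direct + diffuse
```

A flat plate (90° acceptance) collects everything, while a narrow-acceptance concentrator trades away almost all diffuse light for the ability to concentrate the beam, which is the design trade-off the ratio quantifies.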

  8. Holdup measurements under realistic conditions

    International Nuclear Information System (INIS)

    Sprinkel, J.K. Jr.; Marshall, R.; Russo, P.A.; Siebelist, R.

    1997-01-01

    This paper reviews the documentation of the precision and bias of holdup (residual nuclear material remaining in processing equipment) measurements and presents previously unreported results. Precision and bias results for holdup measurements are reported from training seminars with simulated holdup, which represent the best possible results, and compared to actual plutonium processing facility measurements. Holdup measurements for plutonium and uranium processing plants are also compared to reference values. Recommendations for measuring holdup are provided for highly enriched uranium facilities and for low enriched uranium facilities. The random error component of holdup measurements is less than the systematic error component. The most likely factor in measurement error is incorrect assumptions about the measurement, such as background, measurement geometry, or signal attenuation. Measurement precision on the order of 10% can be achieved with some difficulty. The bias of poor-quality holdup measurements can also be improved. However, for most facilities, holdup measurement errors have no significant impact on inventory difference, sigma, or safety (criticality, radiation, or environmental); therefore, it is difficult to justify the allocation of more resources to improving holdup measurements. 25 refs., 10 tabs

  9. Studies on site characterization methodologies for high level radioactive waste disposal

    International Nuclear Information System (INIS)

    Wang Ju; Guo Yonghai; Chen Weiming

    2008-01-01

    This paper presents the final achievement of the project 'Studies of Site-specific Geological Environment for High Level Waste Disposal and Performance Assessment Methodology, Part I: Studies on Site Characterization Methodologies for High Level Radioactive Waste Disposal', a 'Key Scientific and Technological Pre-Research Project for National Defense' during 2001-2005. The study area is the Beishan area, Gansu Province, NW China, the most promising site for China's underground research laboratory and high level radioactive waste repository. The boreholes BS01, BS02, BS03 and BS04, drilled in fractured granite media in Beishan, were used to conduct comprehensive studies on site characterization methodologies, including: borehole drilling methods, in situ measurement methods for hydrogeological parameters, groundwater sampling technology, hydrogeochemical logging, geo-stress measurement, acoustic borehole televiewer measurement, borehole radar measurement, fault stability evaluation methods and rock joint evaluation methods. The execution of the project has resulted in the establishment of an 'Integrated Methodological System for Site Characterization in a Granite Site for a High Level Radioactive Waste Repository' and 8 key methodologies for site characterization: borehole drilling with minimum disturbance to the rock mass, measurement of hydrogeological parameters of fractured granite mass, in situ groundwater sampling from boreholes in fractured granite mass, fracture measurement by borehole televiewer and borehole radar systems, hydrogeochemical logging, low permeability measurement, geophysical methods for rock mass evaluation, and modeling methods for rock joints. These methods are comprehensive, advanced, innovative, practical, reliable and of high accuracy. The comprehensive utilization of these methods in granite mass will help to obtain systematic parameters of

  10. Methodological assessment of skin and limb blood flows in the human forearm during thermal and baroreceptor provocations.

    Science.gov (United States)

    Brothers, R Matthew; Wingo, Jonathan E; Hubing, Kimberly A; Crandall, Craig G

    2010-09-01

    Skin blood flow responses in the human forearm, assessed by three commonly used technologies (single-point laser-Doppler flowmetry, integrated laser-Doppler flowmetry, and laser-Doppler imaging), were compared in eight subjects during normothermic baseline, acute skin-surface cooling, and whole-body heat stress (Δ internal temperature = 1.0 ± 0.2 °C). Absolute forearm blood flow (FBF) measures obtained using venous occlusion plethysmography and Doppler ultrasound were made during the aforementioned perturbations. Relative to normothermic baseline, skin blood flow decreased during normothermia + LBNP and during cooling, with no effect of device (P>0.05 for all conditions). Similarly, no differences were identified across all perturbations between FBF measures using plethysmography and Doppler ultrasound (P>0.05 for all perturbations). These data indicate that, when normalized to maximum, assessments of skin blood flow in response to vasoconstrictor and dilator perturbations are similar regardless of methodology. Likewise, FBF responses to these perturbations are similar between two commonly used methodologies of limb blood flow assessment.

  11. Insights into PRA methodologies

    International Nuclear Information System (INIS)

    Gallagher, D.; Lofgren, E.; Atefi, B.; Liner, R.; Blond, R.; Amico, P.

    1984-08-01

    Probabilistic Risk Assessments (PRAs) for six nuclear power plants were examined to gain insight into how the choice of analytical methods can affect the results of PRAs. The PRA scope considered was limited to internally initiated accident sequences through core melt. For twenty methodological topic areas, a baseline or minimal methodology was specified. The choice of methods for each topic in the six PRAs was characterized in terms of the incremental level of effort above the baseline. A higher level of effort generally reflects a higher level of detail or a higher degree of sophistication in the analytical approach to a particular topic area. The impact on results was measured in terms of how additional effort beyond the baseline level changed the relative importance and ordering of dominant accident sequences compared to what would have been observed had methods corresponding to the baseline level of effort been employed. This measure of impact is a more useful indicator of how methods affect perceptions of plant vulnerabilities than changes in core melt frequency would be. However, the change in core melt frequency was used as a secondary measure of impact for nine topics where availability of information permitted. Results are presented primarily in the form of effort-impact matrices for each of the twenty topic areas. A suggested effort-impact profile for future PRAs is presented

  12. Are methodological quality and completeness of reporting associated with citation-based measures of publication impact? A secondary analysis of a systematic review of dementia biomarker studies.

    Science.gov (United States)

    Mackinnon, Shona; Drozdowska, Bogna A; Hamilton, Michael; Noel-Storr, Anna H; McShane, Rupert; Quinn, Terry

    2018-03-22

    To determine whether methodological and reporting quality are associated with surrogate measures of publication impact in the field of dementia biomarker studies. We assessed dementia biomarker studies included in a previous systematic review in terms of methodological and reporting quality using the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) and Standards for Reporting of Diagnostic Accuracy (STARD), respectively. We extracted additional study- and journal-related data from each publication to account for factors shown to be associated with impact in previous research. We explored associations between potential determinants and measures of publication impact in univariable and stepwise multivariable linear regression analyses. We aimed to collect data on four measures of publication impact: two traditional measures, the average number of citations per year and the 5-year impact factor of the publishing journal, and two alternative measures, the Altmetric Attention Score and counts of electronic downloads. The systematic review included 142 studies. Due to limited data, Altmetric Attention Scores and electronic downloads were excluded from the analysis, leaving traditional metrics as the only analysed outcome measures. We found no relationship between QUADAS scores and traditional metrics. Citation rates were independently associated with the 5-year journal impact factor (β=0.42), and completeness of reporting (STARD) was independently associated with citation rates (β=0.45). Citation rates and 5-year journal impact factor appear to measure different dimensions of impact. Citation rates were weakly associated with completeness of reporting, while neither traditional metric was related to methodological rigour. Our results suggest that high publication usage and journal outlet are not a guarantee of quality, and readers should critically appraise all papers regardless of presumed impact. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted
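The univariable regressions reported above reduce to ordinary least squares on a single predictor. A stdlib-only sketch, illustrative rather than the study's actual analysis code:

```python
def simple_ols(x, y):
    """Univariable ordinary least squares: returns (intercept, slope)
    minimizing the sum of squared residuals of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope
```

Standardizing both variables first would turn the slope into a standardized coefficient comparable to the β values quoted in the abstract.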

  13. Assessment methodology applicable to safe decommissioning of Romanian VVR-S research reactor

    International Nuclear Information System (INIS)

    Baniu, O.; Vladescu, G.; Vidican, D.; Penescu, M.

    2002-01-01

    The paper contains the results of research activity performed by CITON specialists regarding an assessment methodology intended to be applied to the safe decommissioning of research reactors, developed taking into account the specific conditions of the Romanian VVR-S Research Reactor. The Romanian VVR-S Research Reactor is an old reactor (1957) and its Decommissioning Plan is under study. The main topics of the paper are as follows: the safety approach to decommissioning of nuclear facilities and applicable safety principles; the main steps of the proposed assessment methodology; the generic content of the Decommissioning Plan, the main decommissioning activities, and a discussion of the proposed Decommissioning Plan for the Romanian research reactor; safety risks which may occur during decommissioning activities, covering normal decommissioning operations, fault conditions, and internal and external hazards; and the typical development of a scenario, the Features, Events and Processes list, exposure pathways, and the calculation methodology. (author)

  14. International Comparisons: Issues of Methodology and Practice

    Directory of Open Access Journals (Sweden)

    Serova Irina A.

    2017-12-01

    The article discusses the methodology and organization of statistical observation of countries' levels of economic development. The theoretical basis of international comparisons is singled out, and on this basis a comparative evaluation of the inconsistency of theoretical positions and of the reasons for differences in GDP growth is carried out. Given the complexity of forming homogeneous data sets in order to obtain correct comparison results, a general scheme for the relationship between the theoretical base of international comparisons and PPP constraints is defined. The possibility of obtaining a single measurement of the indicators of national economies in the presence of sampling errors, measurement uncertainties and classification errors is considered. The emphasis is placed on combining work using the ICP and the CPI with the aim of achieving comparability of data in the territorial and temporal cross-section. Using the basic characteristics of sustainable economic growth, long-term prospects for changes in the ranking positions of countries with different income levels are determined. It is shown that the clarity and unambiguity of the theoretical provisions is the defining condition for the further process of data collection and the formation of correct analytical conclusions.

  15. Identification of voltage stability condition of a power system using measurements of bus variables

    Directory of Open Access Journals (Sweden)

    Durlav Hazarika

    2014-12-01

    Several online methods have been proposed for investigating the voltage stability condition of an interconnected power system using measurements of voltage and current phasors at a bus. For this purpose, phasor measurement units (PMUs) are used. A PMU is a device which measures the electrical waves on an electrical network, using a common time source (reference bus) for synchronisation. This study proposes a method for online monitoring of the voltage stability condition of a power system using measurements of bus variables, namely (i) real power, (ii) reactive power and (iii) bus voltage magnitude at a bus. The measurements of real power, reactive power and bus voltage magnitude could be extracted/captured from a smart energy meter. The financial cost of implementing the proposed method would be significantly lower compared with the PMU-based method.
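One common impedance-based indicator consistent with using only P, Q and |V| at a bus is to compare the apparent load impedance V²/S with an estimated Thévenin source impedance: the margin shrinks to zero at the voltage stability limit. This is a generic sketch, not necessarily the exact index proposed in the paper, and `z_thevenin` is assumed to be estimated separately from successive measurements.

```python
import math

def load_impedance(p, q, v):
    """Apparent load impedance magnitude (ohm) from bus measurements:
    real power p (W), reactive power q (var), voltage magnitude v (V)."""
    s = math.hypot(p, q)  # apparent power |S|
    return v * v / s

def impedance_margin(p, q, v, z_thevenin):
    """Normalized margin (Z_load - Z_thevenin) / Z_load; maximum power
    transfer (the stability limit) occurs when Z_load == Z_thevenin."""
    z_load = load_impedance(p, q, v)
    return (z_load - z_thevenin) / z_load
```

As loading grows, Z_load falls toward Z_thevenin and the margin approaches zero, giving an online proximity indicator.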

  16. In-core program for on line measurements of neutron, photon and nuclear heating parameters inside Jules Horowitz MTR reactor

    International Nuclear Information System (INIS)

    Lyoussi, A.; Reynard-Carette, C.

    2014-01-01

    Accurate on-line measurements of key parameters inside the experimental channels of a Material Testing Reactor are necessary to dimension the irradiation devices and consequently to conduct smart experiments on fuels and materials under suitable conditions. In particular, the quantification of nuclear heating, a relevant parameter for reaching adapted thermal conditions, has to be improved. This work focuses on an important collaborative program between CEA and Aix-Marseille University called INCORE (Instrumentation for Nuclear radiations and Calorimetry On-line in Reactor), dedicated to the development of a new measurement methodology to quantify both nuclear heating and accurate radiation flux levels (neutrons and photons). The methodology, which is based on experiments carried out under irradiation conditions with a multi-sensor device (ionization chamber, fission chamber, gamma thermometer, calorimeter, SPND, SPGD) as well as on work performed outside a nuclear/radiative environment on a reference sensor used to measure nuclear heating (calorimeter), is presented (authors)

  17. Enabling Mobile Communications for the Needy: Affordability Methodology, and Approaches to Requalify Universal Service Measures

    Directory of Open Access Journals (Sweden)

    Louis-Francois PAU

    2009-01-01

    This paper links communications and media usage to social and household economic boundaries. It highlights that in present-day society communications and media are a necessity but not always affordable, and that they furthermore open the door to addictive behaviors which raise additional financial and social risks. A simple and efficient methodology, compatible with state-of-the-art social and communications business statistics, is developed, which produces the residual communications and media affordability budget and ultimately the value-at-risk in terms of usage and tariffs. Sensitivity analysis provides precious information on bottom-up communications and media adoption on the basis of affordability. This approach differs from the regulated but often ineffective universal service obligation, which instead of catering for individual needs mostly addresses macro-measures helping geographical access coverage (e.g., in rural areas). It is proposed to requalify the universal service obligations on operators into concrete measures allowing, with unchanged funding, the needy to adopt mobile services based on their affordability constraints by bridging the gap to a standard tariff. Case data are surveyed from various countries. ICT policy recommendations are made to support widespread and socially responsible communications access.
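The residual affordability budget and the gap-bridging measure proposed above reduce to simple household arithmetic. The sketch below is an illustrative assumption about how those quantities compose, not the paper's actual model.

```python
def residual_media_budget(disposable_income, essential_spend):
    """Residual budget available for communications and media after
    essential household expenses; never negative."""
    return max(0.0, disposable_income - essential_spend)

def subsidy_needed(standard_tariff, affordable_budget):
    """Gap a requalified universal-service measure would bridge so a
    needy household can adopt the service at the standard tariff."""
    return max(0.0, standard_tariff - affordable_budget)
```

Repeating the calculation while varying income or tariff assumptions gives the kind of sensitivity analysis the paper describes.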

  18. Development of a methodology for conducting an integrated HRA/PRA --

    Energy Technology Data Exchange (ETDEWEB)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S. (Brookhaven National Lab., Upton, NY (United States)); Wreathall, J. (Wreathall (John) and Co., Dublin, OH (United States)); Cooper, S.E. (Science Applications International Corp., McLean, VA (United States))

    1993-01-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  19. Effect of measurement conditions on three-dimensional roughness values, and development of measurement standard

    International Nuclear Information System (INIS)

    Fabre, A; Brenier, B; Raynaud, S

    2011-01-01

    Friction and corrosion behaviour and the fatigue lifetime of mechanical components are influenced by their boundary and subsurface properties. Surface integrity is studied on mechanical components in order to improve their service behaviour. Roughness is one of the main geometrical properties to be qualified and quantified. Components can be obtained using a complex process: forming, machining and treatment can be combined to produce parts with complex shapes. Three-dimensional roughness is then needed to characterize these parts with complex shapes and textured surfaces. With contact or non-contact measurements (contact stylus, confocal microprobe, interferometer), three-dimensional roughness is quantified by calculating pertinent parameters defined by the international standard PR EN ISO 25178-2:2008. An analysis will identify the influence of measurement conditions on three-dimensional parameters. The purpose of this study is to analyse the variation of roughness results using contact stylus or optical apparatus. The second aim of this work is to develop a measurement standard well adapted to qualifying contact and non-contact apparatus.
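To make the areal parameters concrete, here is a minimal sketch of three height parameters defined in ISO 25178-2 (Sa, the arithmetic mean height; Sq, the root-mean-square height; Sz, the maximum height), computed on a synthetic height map. Real evaluation also involves form removal and filtering, which are omitted here.

```python
import numpy as np

def areal_roughness(z):
    """Return (Sa, Sq, Sz) for a height map, referenced to the mean plane."""
    z = np.asarray(z, dtype=float)
    z = z - z.mean()                      # reference to the mean plane
    sa = np.abs(z).mean()                 # Sa: arithmetic mean deviation
    sq = np.sqrt((z ** 2).mean())         # Sq: root-mean-square deviation
    sz = z.max() - z.min()                # Sz: maximum peak-to-valley height
    return sa, sq, sz

# Synthetic sinusoidal surface with unit amplitude
x = np.linspace(0, 2 * np.pi, 200)
surface = np.outer(np.sin(x), np.cos(x))
sa, sq, sz = areal_roughness(surface)
```

For this surface Sz is close to 2 (peak +1 to valley -1) and Sq is close to 0.5, which matches the analytic values for a product of sinusoids.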

  20. Assessing digital control system dependability using the dynamic flowgraph methodology

    International Nuclear Information System (INIS)

    Garrett, C.J.; Guarro, S.B.; Apostolakis, G.E.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a methodological approach to modeling and analyzing the behavior of software-driven embedded systems for the purpose of reliability/safety assessment and verification. The methodology has two fundamental goals: (a) to identify how certain postulated events may occur in a system and (b) to identify an appropriate testing strategy based on an analysis of system functional behavior. To achieve these goals, the methodology employs a modeling framework in which system models are developed in terms of causal relationships between physical variables and temporal characteristics of the execution of software modules. These models are then analyzed to determine how a certain state (desirable or undesirable) can be reached. This is done by developing timed fault trees, which take the form of logical combinations of static trees relating system parameters at different points in time. The prime implicants (multistate analog of minimal cut sets) of the fault trees can be used to identify and eliminate system faults resulting from unanticipated combinations of software logic errors, hardware failures, and adverse environmental conditions and to direct testing activity to more efficiently eliminate implementation errors by focusing on the neighborhood of potential failure modes arising from these combinations of system conditions

  1. Methodologies Developed for EcoCity Related Projects: New Borg El Arab, an Egyptian Case Study

    Directory of Open Access Journals (Sweden)

    Carmen Antuña-Rozado

    2016-08-01

    Full Text Available The aim of the methodologies described here is to propose measures and procedures for developing concepts and technological solutions, which are adapted to the local conditions, to build sustainable communities in developing countries and emerging economies. These methodologies are linked to the EcoCity framework outlined by VTT Technical Research Centre of Finland Ltd. for sustainable community and neighbourhood regeneration and development. The framework is the result of a long experience in numerous EcoCity related projects, mainly Nordic and European in scope, which has been reformulated in recent years to respond to the local needs in the previously mentioned countries. There is also a particular emphasis on close collaboration with local partners and major stakeholders. In order to illustrate how these methodologies can support EcoCity concept development and implementation, results from a case study in Egypt will be discussed. The referred case study relates to the transformation of New Borg El Arab (NBC), near Alexandria, into an EcoCity. The viability of the idea was explored making use of different methodologies (Roadmap, Feasibility Study, and Residents Energy Survey and Building Consumption Assessment) and considering the Residential, Commercial/Public Facilities, Industrial, Services/Utilities, and Transport sectors.

  2. Innovative Methodologies for thermal Energy Release Measurement: case of La Solfatara volcano (Italy)

    Science.gov (United States)

    Marfè, Barbara; Avino, Rosario; Belviso, Pasquale; Caliro, Stefano; Carandente, Antonio; Marotta, Enrica; Peluso, Rosario

    2015-04-01

    This work is devoted to improving knowledge of the parameters that control the heat flux anomalies associated with the diffuse degassing processes of volcanic and hydrothermal areas. The methodologies currently used to measure heat flux (i.e. CO2 flux or temperature gradient) are either poorly efficient or poorly effective, and are unable to detect short- to medium-term (days to months) variation trends in the heat flux. A new method, based on the use of thermal imaging cameras, has been applied to estimate the heat flux and its time variations. This approach allows faster heat flux measurement than the already accredited methods, improving the definition of the activity state of a volcano and allowing a better assessment of the related hazard and risk mitigation. The idea is to extrapolate the heat flux from the ground surface temperature which, in a purely conductive regime, is directly correlated to the shallow temperature gradient. We use thermal imaging cameras, at short distances (meters to hundreds of meters), to quickly obtain a map of areas with thermal anomalies and a measure of their temperature. Preliminary studies have been carried out throughout the whole of the La Solfatara crater in order to investigate a possible correlation between the surface temperature and the shallow thermal gradient. We used a FLIR SC640 thermal camera and K-type thermocouples to take the two measurements at the same time. Results suggest a good correlation between the shallow temperature gradient ΔTs and the surface temperature Ts corrected for background, and although the campaigns took place over a period of a few years, this correlation seems to be stable over time. This is an extremely motivating result for further development of a measurement method based only on the use of a small-range thermal imaging camera.
Surveys with thermal cameras may be manually done using a tripod to take thermal images of small contiguous areas and then joining
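The conductive-regime idea in this record can be sketched as follows: an empirical linear map from the background-corrected surface temperature to the shallow gradient, then Fourier's law q = -k dT/dz for the flux. The slope, intercept and soil conductivity below are illustrative, not the campaign's fitted values.

```python
import numpy as np

# Illustrative coefficients (NOT the La Solfatara calibration).
def shallow_gradient(ts_corrected, slope=12.0, intercept=0.0):
    """Empirical linear map: surface T anomaly (deg C) -> gradient (deg C/m)."""
    return slope * ts_corrected + intercept

def conductive_heat_flux(ts_corrected, k_soil=1.2):
    """Heat flux (W/m^2) via Fourier's law from the inferred gradient."""
    return k_soil * shallow_gradient(ts_corrected)

# A thermal image reduced to per-pixel temperature anomalies (deg C)
anomalies = np.array([[0.5, 2.0], [4.0, 10.0]])
flux_map = conductive_heat_flux(anomalies)   # W/m^2 per pixel
```

Applied per pixel, such a map turns a single thermal image into a heat-flux map, which is what makes the camera-based survey faster than point-wise gradient measurements.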

  3. Fluorescent nanosensors for intracellular measurements: synthesis, characterisation, calibration and measurement

    Directory of Open Access Journals (Sweden)

    Arpan Shailesh Desai

    2014-01-01

    Full Text Available Measurement of intracellular acidification is important for understanding fundamental biological pathways as well as for developing effective therapeutic strategies. Fluorescent pH nanosensors are an enabling technology for real-time monitoring of intracellular acidification. The physicochemical characteristics of nanosensors can be engineered to target specific cellular compartments and respond to external stimuli; nanosensors therefore represent a versatile approach for probing biological pathways inside cells. The fundamental components of a nanosensor are a pH-sensitive fluorophore (signal transducer) and a pH-insensitive reference fluorophore (internal standard) immobilised in an inert, non-toxic matrix. The inert matrix prevents interference of cellular components with the sensing elements as well as minimizing the potentially harmful effects of some fluorophores on cell function. Fluorescent nanosensors are synthesised using standard laboratory equipment and are detectable by non-invasive, widely accessible imaging techniques. The outcomes of studies employing this technology depend on reliable methodology for performing measurements. In particular, special consideration must be given to the conditions for sensor calibration, uptake conditions and parameters for image analysis. We describe procedures for: (1) synthesis and characterisation of polyacrylamide- and silica-based nanosensors, (2) nanosensor calibration, and (3) performing measurements using fluorescence microscopy.
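The ratiometric calibration step can be sketched with a Boltzmann-type curve: the ratio of the pH-sensitive intensity to the reference intensity varies sigmoidally around the dye's pKa, and inverting that curve recovers pH from a measured ratio. The parameter values below are illustrative only, not those of any particular fluorophore.

```python
import math

# Hypothetical calibration parameters for a ratiometric pH nanosensor.
def ratio_from_ph(ph, r_min=0.2, r_max=2.0, pka=6.5):
    """Forward calibration model: expected intensity ratio at a given pH."""
    return r_min + (r_max - r_min) / (1.0 + 10 ** (pka - ph))

def ph_from_ratio(r, r_min=0.2, r_max=2.0, pka=6.5):
    """Inverse of the calibration curve: recover pH from a measured ratio."""
    return pka - math.log10((r_max - r) / (r - r_min))

measured = ratio_from_ph(6.0)          # simulate a measurement at pH 6.0
recovered = ph_from_ratio(measured)    # invert back, should give ~6.0
```

In practice r_min, r_max and pKa are fitted from images of the sensors in buffers of known pH before any intracellular measurement.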

  4. Level-Set Methodology on Adaptive Octree Grids

    Science.gov (United States)

    Gibou, Frederic; Guittet, Arthur; Mirzadeh, Mohammad; Theillard, Maxime

    2017-11-01

    Numerical simulations of interfacial problems in fluids require a methodology capable of tracking surfaces that can undergo changes in topology and of imposing jump boundary conditions in a sharp manner. In this talk, we will discuss recent advances in the level-set framework, in particular one that is based on adaptive grids.

  5. Accounting for the temperature conditions during deep prospecting hole drilling

    Energy Technology Data Exchange (ETDEWEB)

    Shcherban, A N; Cheniak, V P; Zolotarenko, U P

    1977-01-01

    A methodology is described for calculating and controlling the temperature in inclined holes in order to establish a non-steady-state heat exchange between the medium circulating in the hole, and the construction components and rock. In order to verify the proposed methodology, the temperature of the drilling fluid is measured directly during the drilling process using a specially-designed automatic device which is lowered into the hole with the drilling string and turned on automatically at a given depth. This device makes it possible to record the drilling fluid temperature on magnetic tape, and convert the sensor signals arriving from the drilling string and the annular space. A comparison of calculation and experimental data confirmed the sufficiently high accuracy of the methods for predicting the thermal conditions in drilling deep prospecting holes.

  6. Overview of a performance assessment methodology for low-level radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Kozak, M.W.; Chu, M.S.Y.

    1991-01-01

    A performance assessment methodology has been developed for use by the US Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. This paper provides a summary and an overview of the modeling approaches selected for the methodology. The overview includes discussions of the philosophy and structure of the methodology. This performance assessment methodology is designed to provide the NRC with a tool for performing confirmatory analyses in support of license reviews related to postclosure performance. The methodology allows analyses of dose to individuals from off-site releases under normal conditions as well as on-site doses to inadvertent intruders. 24 refs., 1 tab

  7. Thrust Measurement of Dielectric Barrier Discharge (DBD) Plasma Actuators: New Anti-Thrust Hypothesis, Frequency Sweeps Methodology, Humidity and Enclosure Effects

    Science.gov (United States)

    Ashpis, David E.; Laun, Matthew C.

    2014-01-01

    We discuss thrust measurements of Dielectric Barrier Discharge (DBD) plasma actuator devices used for aerodynamic active flow control. After a review of our experience with conventional thrust measurement and the significant non-repeatability of its results, we devised a suspended-actuator test setup and now present a methodology of thrust measurements with decreased uncertainty. The methodology consists of frequency scans at constant voltages. The procedure consists of increasing the frequency step-wise from several Hz to a maximum of several kHz, followed by a frequency decrease back down to the starting frequency of several Hz. This sequence is performed first at the highest voltage of interest, then repeated at lower voltages. The data in the descending frequency direction are more consistent and are selected for reporting. Sample results show a strong dependence of thrust on humidity, which also affects the consistency and fluctuations of the measurements. We also observed negative values of thrust, or "anti-thrust", at low frequencies between 4 Hz and 64 Hz. The anti-thrust is proportional to the mean-squared voltage and is frequency independent. Departures from the parabolic anti-thrust curve are correlated with the appearance of visible plasma discharges. We propose the anti-thrust hypothesis: the measured thrust is a sum of plasma thrust and anti-thrust, and the anti-thrust exists at all frequencies and voltages. The anti-thrust depends on actuator geometry and materials and on the test installation. It enables the separation of the plasma thrust from the measured total thrust. This approach enables more meaningful comparisons between actuators at different installations and laboratories. The dependence on test installation was validated by surrounding the actuator with a grounded large-diameter metal sleeve. 
Strong dependence on humidity is also shown; the thrust significantly increased with decreasing humidity, e
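The anti-thrust decomposition described in this record can be sketched in a few lines: model the measured thrust as plasma thrust plus a frequency-independent anti-thrust proportional to the mean-squared voltage, fit the coefficient from low-frequency (plasma-free) data, then subtract. The coefficient and values below are hypothetical.

```python
# Hypothetical anti-thrust coefficient (mN per V^2); in practice fitted
# from the low-frequency data where no plasma thrust is present.
C_ANTI = -2.0e-9

def anti_thrust(v_rms, c_anti=C_ANTI):
    """Anti-thrust (mN), proportional to mean-squared voltage (V^2)."""
    return c_anti * v_rms ** 2

def plasma_thrust(measured_mN, v_rms, c_anti=C_ANTI):
    """Plasma thrust recovered by subtracting the anti-thrust component."""
    return measured_mN - anti_thrust(v_rms, c_anti)

# At low frequency (no discharge) the measurement is pure anti-thrust:
low_freq = anti_thrust(10_000.0)               # negative, as observed
# At operating frequency the plasma contribution is the remainder:
net = plasma_thrust(measured_mN=1.3, v_rms=10_000.0)
```

Because the anti-thrust term depends only on voltage and installation, subtracting it lets actuators at different laboratories be compared on plasma thrust alone, as the abstract argues.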

  8. Comparison of fungal spores concentrations measured with wideband integrated bioaerosol sensor and Hirst methodology

    Science.gov (United States)

    Fernández-Rodríguez, S.; Tormo-Molina, R.; Lemonis, N.; Clot, B.; O'Connor, D. J.; Sodeau, John R.

    2018-02-01

    The aim of this work was to provide both a comparison of traditional and novel methodologies for airborne spores detection (i.e. the Hirst Burkard trap and WIBS-4) and the first quantitative study of airborne fungal concentrations in Payerne (Western Switzerland) as well as their relation to meteorological parameters. From the traditional method -Hirst trap and microscope analysis-, sixty-three propagule types (spores, sporangia and hyphae) were identified and the average spore concentrations measured over the full period amounted to 4145 ± 263.0 spores/m3. Maximum values were reached on July 19th and on August 6th. Twenty-six spore types reached average levels above 10 spores/m3. Airborne fungal propagules in Payerne showed a clear seasonal pattern, increasing from low values in early spring to maxima in summer. Daily average concentrations above 5000 spores/m3 were almost constant in summer from mid-June onwards. Weather parameters showed a relevant role for determining the observed spore concentrations. Coniferous forest, dominant in the surroundings, may be a relevant source for airborne fungal propagules as their distribution and predominant wind directions are consistent with the origin. The comparison between the two methodologies used in this campaign showed remarkably consistent patterns throughout the campaign. A correlation coefficient of 0.9 (CI 0.76-0.96) was seen between the two over the time period for daily resolutions (Hirst trap and WIBS-4). This apparent co-linearity was seen to fall away once increased resolution was employed. However at higher resolutions upon removal of Cladosporium species from the total fungal concentrations (Hirst trap), an increased correlation coefficient was again noted between the two instruments (R = 0.81 with confidence intervals of 0.74 and 0.86).

  9. "Training plan optimized design" methodology application to IBERDROLA - Power generation

    International Nuclear Information System (INIS)

    Gil, S.; Mendizabal, J.L.

    1996-01-01

    The trend in both Europe and the United States, towards the understanding that no training plan may be considered suitable if not backed by the results of application of the S.A.T. (Systematic Approach to Training) methodology, led TECNATOM, S.A. to apply the methodology through development of an application specific to the conditions of the Spanish working system. The requirement that design of the training be coherent with the realities of the working environment is met by systematic application of the SAT methodology as part of the work analysis and job-based task analysis processes, this serving as a basis for design of the training plans

  10. Specific absorption rate determination of magnetic nanoparticles through hyperthermia measurements in non-adiabatic conditions

    Energy Technology Data Exchange (ETDEWEB)

    Coïsson, M. [INRIM, strada delle Cacce 91, 10135 Torino (Italy); Barrera, G. [INRIM, strada delle Cacce 91, 10135 Torino (Italy); University of Torino, Chemistry Department, via P. Giuria 7, 10125 Torino (Italy); Celegato, F.; Martino, L.; Vinai, F. [INRIM, strada delle Cacce 91, 10135 Torino (Italy); Martino, P. [Politronica srl, via Livorno 60, 10144 Torino (Italy); Ferraro, G. [Center for Space Human Robotics, Istituto Italiano di Tecnologia - IIT, corso Trento 21, 10129 Torino (Italy); Tiberto, P. [INRIM, strada delle Cacce 91, 10135 Torino (Italy)

    2016-10-01

    An experimental setup for magnetic hyperthermia operating in non-adiabatic conditions is described. A thermodynamic model that takes into account the heat exchanged by the sample with the surrounding environment is developed. A suitable calibration procedure is proposed that allows the experimental validation of the model. Specific absorption rate can then be accurately determined just from the measurement of the sample temperature at the equilibrium steady state. The setup and the measurement procedure represent a simplification with respect to other systems requiring calorimeters or crucial corrections for heat flow. Two families of magnetic nanoparticles, one superparamagnetic and one characterised by larger sizes and static hysteresis, have been characterised as a function of field intensity, and specific absorption rate and intrinsic loss power have been obtained. - Highlights: • Development and thermodynamic modelling of a hyperthermia setup operating in non-adiabatic conditions. • Calibration of the experimental setup and validation of the model. • Accurate measurement of specific absorption rate and intrinsic loss power in non-adiabatic conditions.
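The steady-state idea in this record admits a simple sketch: once the sample temperature stops rising, the absorbed power equals the power lost to the environment, P = hA (T_eq - T_env), where the heat-transfer term hA (W/K) comes from the calibration step, and SAR is that power per unit mass of magnetic material. All numbers below are illustrative.

```python
# Illustrative non-adiabatic steady-state SAR calculation; hA and masses
# are hypothetical, standing in for the calibrated setup described above.

def absorbed_power(t_eq, t_env, hA):
    """Power (W) dissipated by the sample at the equilibrium steady state."""
    return hA * (t_eq - t_env)

def specific_absorption_rate(t_eq, t_env, hA, mass_np_kg):
    """SAR in W per gram of magnetic nanoparticles."""
    return absorbed_power(t_eq, t_env, hA) / (mass_np_kg * 1000.0)

sar = specific_absorption_rate(t_eq=315.0, t_env=298.0, hA=0.012,
                               mass_np_kg=0.5e-3)   # W/g
```

This is why the method needs only the equilibrium temperature: the transient heating curve, and hence calorimetric corrections for heat flow, drop out of the balance.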

  11. Health economic assessment: a methodological primer.

    Science.gov (United States)

    Simoens, Steven

    2009-12-01

    This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for those readers who wish a more advanced understanding of health economic assessments.
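Two of the calculations this primer covers, discounting a multi-year cost stream and the incremental cost-effectiveness ratio (ICER), can be sketched directly; the costs, QALYs and 3% discount rate below are invented for illustration.

```python
# Illustrative health-economics arithmetic: discounting and ICER.

def discounted_total(yearly_amounts, rate=0.03):
    """Present value of a stream of yearly costs (or effects)."""
    return sum(a / (1.0 + rate) ** t for t, a in enumerate(yearly_amounts))

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost per QALY gained for the new technology."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

cost_new = discounted_total([12_000, 1_000, 1_000])   # new technology
cost_old = discounted_total([5_000, 2_000, 2_000])    # comparator
ratio = icer(cost_new, 2.10, cost_old, 1.85)          # cost per QALY gained
```

The resulting ratio would then be compared with a willingness-to-pay threshold, and sensitivity analysis would vary the inputs, as the article describes.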

  12. Health Economic Assessment: A Methodological Primer

    Directory of Open Access Journals (Sweden)

    Steven Simoens

    2009-11-01

    Full Text Available This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for those readers who wish a more advanced understanding of health economic assessments.

  13. Methodology for generating waste volume estimates

    International Nuclear Information System (INIS)

    Miller, J.Q.; Hale, T.; Miller, D.

    1991-09-01

    This document describes the methodology that will be used to calculate waste volume estimates for site characterization and remedial design/remedial action activities at each of the DOE Field Office, Oak Ridge (DOE-OR) facilities. This standardized methodology is designed to ensure consistency in waste estimating across the various sites and organizations that are involved in environmental restoration activities. The criteria and assumptions that are provided for generating these waste estimates will be implemented across all DOE-OR facilities and are subject to change based on comments received and actual waste volumes measured during future sampling and remediation activities. 7 figs., 8 tabs

  14. Influence of activated carbon characteristics on toluene and hexane adsorption: Application of surface response methodology

    Science.gov (United States)

    Izquierdo, Mª Teresa; de Yuso, Alicia Martínez; Valenciano, Raquel; Rubio, Begoña; Pino, Mª Rosa

    2013-01-01

    The objective of this study was to evaluate the toluene and hexane adsorption capacity of activated carbons prepared according to an experimental design, considering as variables the activation temperature, the impregnation ratio and the activation time. Response surface methodology was applied to optimize the adsorption capacity of the carbons with respect to the preparation conditions that determine the physicochemical characteristics of the activated carbons. The preparation methodology produced activated carbons with surface areas and micropore volumes as high as 1128 m2/g and 0.52 cm3/g, respectively. The activated carbons also exhibit mesoporosity, with the percentage of microporosity ranging from 64.6% to 89.1%. The surface chemistry was characterized by TPD, FTIR and acid-base titration; the values of surface groups obtained differ between techniques because of the limitations of each one, but similar trends were obtained for the activated carbons studied. The exhaustive characterization of the activated carbons shows that the measured surface area does not explain the adsorption capacity for either toluene or n-hexane. On the other hand, the surface chemistry does not explain the adsorption results either. A compromise between physical and chemical characteristics can be obtained from the appropriate activation conditions, and response surface methodology gives the optimal activated carbon that maximizes adsorption capacity. Low activation temperature and intermediate impregnation ratio lead to high toluene and n-hexane adsorption capacities depending on the activation time, which is a determining factor in maximizing toluene adsorption.
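The response-surface step can be sketched in one factor: fit a second-order polynomial to (factor, response) design points and locate the stationary point. The study used three factors (temperature, impregnation ratio, activation time); the single-factor data below are synthetic, for illustration only.

```python
import numpy as np

# Synthetic design data: activation temperature vs. adsorption response.
temp = np.array([400.0, 450.0, 500.0, 550.0, 600.0])      # factor levels
uptake = np.array([180.0, 230.0, 250.0, 235.0, 190.0])    # response

# Least-squares fit of uptake = b0 + b1*T + b2*T^2 (second-order model)
b2, b1, b0 = np.polyfit(temp, uptake, deg=2)

t_opt = -b1 / (2.0 * b2)                 # stationary point of the quadratic
best = np.polyval([b2, b1, b0], t_opt)   # predicted optimum response
```

With b2 negative the stationary point is a maximum; the full study does the same with a three-factor quadratic surface and cross terms.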

  15. Research methodology used in studies of child disaster mental health interventions for posttraumatic stress.

    Science.gov (United States)

    Pfefferbaum, Betty; Newman, Elana; Nelson, Summer D; Liles, Brandi D; Tett, Robert P; Varma, Vandana; Nitiéma, Pascal

    2014-01-01

    In the last decade, the development of community-based and clinical interventions to assist children and adolescents after a disaster has become an international priority. Clinicians and researchers have begun to scientifically evaluate these interventions despite challenging conditions. The objective of this study was to conduct a systematic review of the research methodology used in studies of child disaster mental health interventions for posttraumatic stress. This scientifically rigorous analysis used standards for methodological rigor of psychosocial treatments for posttraumatic stress disorder (PTSD) to examine 29 intervention studies. This analysis revealed that further refinement of methodology is needed to determine if certain intervention approaches are superior to other approaches and if they provide benefit beyond natural recovery. Most studies (93.1%) clearly described the interventions being tested or used manuals to guide application and most (89.7%) used standardized instruments to measure outcomes, and many used random assignment (69.0%) and provided assessor training (65.5%). Fewer studies used blinded assessment (44.8%) or measured treatment adherence (48.3%), and sample size in most studies (82.8%) was not adequate to detect small effects generally expected when comparing two active interventions. Moreover, it is unclear what constitutes meaningful change in relation to treatment especially for the numerous interventions administered to children in the general population. Overall, the results are inconclusive about which children, what settings, and what approaches are most likely to be beneficial. © 2014.

  16. Experimental measurements of the solubility of technetium under near-field conditions

    International Nuclear Information System (INIS)

    Pilkington, N.J.; Wilkins, J.D.

    1988-05-01

    The solubility of technetium in contact with hydrated technetium dioxide under near-field conditions has been measured experimentally. The values obtained were changed little by a change in pH or in the filtration method used. The presence of organic degradation products increased slightly the solution concentration of technetium. (author)

  17. Experimental methodology for obtaining sound absorption coefficients

    Directory of Open Access Journals (Sweden)

    Carlos A. Macía M

    2011-07-01

    Full Text Available Objective: the authors propose a new methodology for estimating sound absorption coefficients using genetic algorithms. Methodology: sound waves are generated and conducted along a rectangular silencer. The waves are then attenuated by the absorbing material covering the silencer’s walls. The attenuated sound pressure level is used in a genetic algorithm-based search to find the parameters of the proposed attenuation expressions that include geometric factors, the wavelength and the absorption coefficient. Results: a variety of adjusted mathematical models were found that make it possible to estimate the absorption coefficients based on the characteristics of a rectangular silencer used for measuring the attenuation of the noise that passes through it. Conclusions: this methodology makes it possible to obtain the absorption coefficients of new materials in a cheap and simple manner. Although these coefficients might be slightly different from those obtained through other methodologies, they provide solutions within the engineering accuracy ranges that are used for designing noise control systems.
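A genetic-algorithm search of the kind this record describes can be sketched minimally: evolve candidate absorption coefficients so that a toy attenuation model matches a measured attenuation. The attenuation expression, geometric factor and "measurement" below are invented stand-ins for the paper's silencer model.

```python
import random

random.seed(1)

def attenuation_model(alpha, length_m=1.0, geom_factor=30.0):
    """Toy attenuation expression (dB); the paper's includes geometry
    and wavelength terms omitted here."""
    return geom_factor * alpha * length_m

MEASURED_DB = 13.5                        # pretend measured attenuation

def fitness(alpha):
    """Higher is better: negative model-vs-measurement error."""
    return -abs(attenuation_model(alpha) - MEASURED_DB)

def genetic_search(generations=60, pop_size=30, mut=0.05):
    pop = [random.random() for _ in range(pop_size)]      # alpha in [0, 1]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                    # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b) + random.gauss(0.0, mut)  # crossover + mutation
            children.append(min(1.0, max(0.0, child)))
        pop = parents + children
    return max(pop, key=fitness)

alpha_best = genetic_search()     # should approach 13.5 / 30 = 0.45
```

The real method fits several adjusted attenuation expressions at once; the loop above shows only the search mechanism.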

  18. A reverse engineering methodology for nickel alloy turbine blades with internal features

    DEFF Research Database (Denmark)

    Gameros, A.; De Chiffre, Leonardo; Siller, H.R.

    2015-01-01

    The scope of this work is to present a reverse engineering (RE) methodology for freeform surfaces, based on a case study of a turbine blade made of Inconel, including the reconstruction of its internal cooling system. The methodology uses an optical scanner and X-ray computed tomography (CT) equipment. Traceability of the measurements was obtained through the use of a Modular Freeform Gage (MFG). An uncertainty budget is presented for both measuring technologies, and results show that the RE methodology presented is promising when comparing uncertainty values against common industrial tolerances.

  19. Health economic evaluation: important principles and methodology.

    Science.gov (United States)

    Rudmik, Luke; Drummond, Michael

    2013-06-01

    To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.

  20. Methodology for ranking restoration options

    International Nuclear Information System (INIS)

    Hedemann Jensen, Per

    1999-04-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128) co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated with radioactive materials as a result of the operation of these installations. The areas considered for remedial measures include contaminated land areas, rivers and sediments in rivers, lakes, and sea areas. Five contaminated European sites have been studied. Various remedial measures have been envisaged with respect to the optimisation of the protection of the populations being exposed to the radionuclides at the sites. Cost-benefit analysis and multi-attribute utility analysis have been applied for optimisation. Health, economic and social attributes have been included and weighting factors for the different attributes have been determined by the use of scaling constants. (au)
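The multi-attribute utility ranking this record mentions can be sketched as a weighted additive score: each restoration option is scored on health, economic and social attributes (normalised to [0, 1]) and the scaling constants weight the attributes. The options, scores and weights below are invented for illustration, not RESTRAT's.

```python
# Hypothetical scaling constants (attribute weights summing to 1).
WEIGHTS = {"health": 0.5, "economic": 0.3, "social": 0.2}

# Hypothetical restoration options with normalised attribute scores.
OPTIONS = {
    "topsoil removal": {"health": 0.9, "economic": 0.3, "social": 0.6},
    "deep ploughing":  {"health": 0.6, "economic": 0.7, "social": 0.7},
    "no action":       {"health": 0.1, "economic": 1.0, "social": 0.4},
}

def utility(scores):
    """Additive multi-attribute utility using the scaling constants above."""
    return sum(WEIGHTS[k] * v for k, v in scores.items())

ranking = sorted(OPTIONS, key=lambda name: utility(OPTIONS[name]),
                 reverse=True)
```

In the project the scaling constants themselves are elicited from stakeholders, which is where most of the judgement in the ranking lives.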

  1. VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.

    Science.gov (United States)

    Little, Todd D; Wang, Eugene W; Gorrall, Britt K

    2017-06-01

    This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.

  2. Methodology for Evaluating Safety System Operability using Virtual Parameter Network

    International Nuclear Information System (INIS)

    Park, Sukyoung; Heo, Gyunyoung; Kim, Jung Taek; Kim, Tae Wan

    2014-01-01

KAERI (Korea Atomic Energy Research Institute) and UTK (University of Tennessee Knoxville) are working on an I-NERI project to address this problem. Together with KAERI, this research proposes a methodology that provides an alternative signal when the reliability of some instrumentation cannot be guaranteed. The proposed methodology assumes that several instruments operate normally as long as power is supplied, because instrumentation survivability itself is not considered. The concept of a Virtual Parameter Network (VPN) is therefore used to identify the associations between plant parameters. This paper extends a paper submitted at the last KNS meeting by revising the methodology and adding the results of a case study. In previous research, an Artificial Neural Network (ANN) inferential technique was used as the estimation model, but that model produced different estimates on each run because of its random bias. An Auto-Associative Kernel Regression (AAKR) model, which has the same number of inputs and outputs, is therefore used for estimation instead. In addition, the importance measures of the previous method depended on the estimation model, whereas the importance measure of the improved method is independent of it. In this study, we proposed a methodology to identify the internal state of a power plant when a severe accident occurs, and it has been validated through a case study. An SBLOCA, which contributes strongly to severe accidents, is considered as the initiating event, and the relationships among parameters have been identified. The VPN can identify which parameters must be observed and which parameters can substitute for a missing parameter when instruments fail during a severe accident. The results show that parameters 2, 3, and 4 commonly have high connectivity.

  3. Methodology for Evaluating Safety System Operability using Virtual Parameter Network

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sukyoung; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Kim, Jung Taek [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Tae Wan [Kepco International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-05-15

KAERI (Korea Atomic Energy Research Institute) and UTK (University of Tennessee Knoxville) are working on an I-NERI project to address this problem. Together with KAERI, this research proposes a methodology that provides an alternative signal when the reliability of some instrumentation cannot be guaranteed. The proposed methodology assumes that several instruments operate normally as long as power is supplied, because instrumentation survivability itself is not considered. The concept of a Virtual Parameter Network (VPN) is therefore used to identify the associations between plant parameters. This paper extends a paper submitted at the last KNS meeting by revising the methodology and adding the results of a case study. In previous research, an Artificial Neural Network (ANN) inferential technique was used as the estimation model, but that model produced different estimates on each run because of its random bias. An Auto-Associative Kernel Regression (AAKR) model, which has the same number of inputs and outputs, is therefore used for estimation instead. In addition, the importance measures of the previous method depended on the estimation model, whereas the importance measure of the improved method is independent of it. In this study, we proposed a methodology to identify the internal state of a power plant when a severe accident occurs, and it has been validated through a case study. An SBLOCA, which contributes strongly to severe accidents, is considered as the initiating event, and the relationships among parameters have been identified. The VPN can identify which parameters must be observed and which parameters can substitute for a missing parameter when instruments fail during a severe accident. The results show that parameters 2, 3, and 4 commonly have high connectivity.
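The AAKR estimator this record contrasts with the ANN approach can be sketched as follows (a toy example with made-up data, not the authors' implementation; the memory matrix, query, and bandwidth are assumptions):

```python
import numpy as np

def aakr_estimate(memory, query, bandwidth=1.0):
    """Auto-Associative Kernel Regression: reconstruct every channel of
    `query` as a kernel-weighted average of historical observations
    (rows of `memory`). The estimate is deterministic for a given
    memory set, unlike a randomly initialised ANN."""
    d = np.linalg.norm(memory - query, axis=1)    # distance to each memory vector
    w = np.exp(-d ** 2 / (2.0 * bandwidth ** 2))  # Gaussian kernel weights
    return w @ memory / w.sum()                   # same number of inputs and outputs

# Three correlated "plant parameters"; the third channel of the query is
# faulty and is pulled back toward historically consistent values.
memory = np.array([[1.0, 2.0, 3.0],
                   [1.1, 2.1, 3.1],
                   [0.9, 1.9, 2.9]])
estimate = aakr_estimate(memory, np.array([1.0, 2.0, 9.0]), bandwidth=5.0)
```

Because the reconstruction is a weighted average of stored plant states, a failed channel is replaced by a value consistent with the associated parameters, which is the substitution role the VPN assigns to it.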

  4. Methodological issues related to studies of lead mobilization during menopause

    Directory of Open Access Journals (Sweden)

    Berkowitz Gertrud S.

    1999-01-01

    Full Text Available While there has been a substantial decline in lead exposure in the United States during the past two decades, mobilization of existing lead stored in bone potentially represents an important endogenous source of exposure for menopausal women. It has been hypothesized that lead may be mobilized from skeletal stores during conditions of high bone turnover, such as during menopause. However, such mobilization has not been documented in prospective studies. This discussion is focussed on some of the methodological difficulties to be anticipated in longitudinal studies of lead mobilization specific to menopause and the issues that need to be taken into account when evaluating the results of such studies. To evaluate whether lead mobilization occurs during menopause, a prospective repeated measures design is needed using X-ray fluorescence analysis of lead in bone and serial measurements of blood lead. Potential confounders and effect modifiers also need to be taken into account in the statistical analysis.

  5. PIV measurements in a compact return diffuser under multi-conditions

    Science.gov (United States)

    Zhou, L.; Lu, W. G.; Shi, W. D.

    2013-12-01

Due to the complex three-dimensional geometries of impellers and diffusers, their design is a delicate and difficult task: slight changes can lead to significant changes in hydraulic performance and internal flow structure. Conversely, a grasp of the pump's internal flow pattern can guide design improvement. The internal flow fields in a compact return diffuser have been investigated experimentally under multiple operating conditions. A special Particle Image Velocimetry (PIV) test rig was designed, and two-dimensional PIV measurements were successfully conducted in the diffuser mid-plane to capture the complex flow patterns. The analysis of the results focuses on the flow structure in the diffuser, especially under part-load conditions, where the vortex and recirculation patterns are captured and analysed. Under the design and over-load conditions the flow fields in the diffuser are uniform, whereas strong flow separation and back flow appear at part-load flow rates, with pronounced back flow captured in one diffuser passage at 0.2Qdes.

  6. The Ocean Colour Climate Change Initiative: I. A Methodology for Assessing Atmospheric Correction Processors Based on In-Situ Measurements

    Science.gov (United States)

    Muller, Dagmar; Krasemann, Hajo; Brewin, Robert J. W.; Deschamps, Pierre-Yves; Doerffer, Roland; Fomferra, Norman; Franz, Bryan A.; Grant, Mike G.; Groom, Steve B.; Melin, Frederic; hide

    2015-01-01

The Ocean Colour Climate Change Initiative intends to provide a long-term time series of ocean colour data and investigate the detectable climate impact. A reliable and stable atmospheric correction procedure is the basis for ocean colour products of the necessary high quality. In order to guarantee an objective selection from a set of four atmospheric correction processors, the common validation strategy of comparing in-situ and satellite-derived water-leaving reflectance spectra is extended by a ranking system. In principle, statistical parameters such as root mean square error and bias, together with measures of goodness of fit, are transformed into relative scores that evaluate the relative quality of the algorithms under study. The sensitivity of these scores to the selected database has been assessed by a bootstrapping exercise, which allows identification of the uncertainty in the scoring results. Although the presented methodology is intended to be used in an algorithm selection process, this paper focusses on the scope of the methodology rather than the properties of the individual processors.
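The score-plus-bootstrap idea described here can be sketched as follows (a simplified stand-in, not the OC-CCI scoring scheme: the relative-score formula, processor names, and synthetic match-ups are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def rmse(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

def bootstrap_scores(insitu, retrievals, n_boot=200):
    """Score each processor by relative RMSE against in-situ match-ups
    (the best processor scores 1.0), bootstrapping the match-up database
    to estimate the uncertainty of the scores."""
    names = list(retrievals)
    scores = {n: [] for n in names}
    idx = np.arange(len(insitu))
    for _ in range(n_boot):
        s = rng.choice(idx, size=len(idx), replace=True)  # resample match-ups
        errs = np.array([rmse(insitu[s], retrievals[n][s]) for n in names])
        rel = errs.min() / errs                           # relative score in (0, 1]
        for n, r in zip(names, rel):
            scores[n].append(r)
    return {n: (np.mean(v), np.std(v)) for n, v in scores.items()}

# Synthetic match-ups: processor "A" tracks the in-situ data more closely.
truth = rng.normal(size=200)
retrievals = {"A": truth + rng.normal(0.0, 0.1, size=200),
              "B": truth + rng.normal(0.0, 0.5, size=200)}
scores = bootstrap_scores(truth, retrievals)
```

The bootstrap spread shows how stable the ranking is against the choice of match-up database, which is the uncertainty the paper quantifies.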

  7. Qualification of a full plant nodalization for the prediction of the core exit temperature through a scaling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, J., E-mail: jordi.freixa-terradas@upc.edu; Martínez-Quiroga, V., E-mail: victor.martinez.quiroga@upc.edu; Reventós, F., E-mail: francesc.reventos@upc.edu

    2016-11-15

Highlights: • Core exit temperature is used in PWRs as an indication of core heat-up. • Qualification of full-scale nuclear reactor nodalizations by means of a scaling methodology. • Scaling of RELAP5 calculations to full-scale power plants. - Abstract: System codes and the plant nodalizations they require are an essential step in thermal-hydraulic safety analysis. In order to assess the safety of a particular power plant, the nodalization of the system needs to be qualified in addition to the validation and verification of the code. Since most existing experimental data come from scaled-down facilities, any qualification process must address scale considerations. The Group of Thermal Hydraulic Studies at the Technical University of Catalonia has developed a scaling-up methodology (SCUP) for the qualification of full-scale nodalizations through a systematic procedure based on the extrapolation of post-test simulations of Integral Test Facility experiments. In the present work, the SCUP methodology is employed to qualify the nodalization of the Ascó NPP, a Pressurized Water Reactor (PWR), for reproducing an important safety feature: the effectiveness of the Core Exit Temperature (CET) as an Accident Management (AM) indicator. Given the difficulty of placing measurements in the core region, CET measurements are used as a criterion for initiating safety operational procedures during accident conditions in PWRs. However, the CET response has limitations in detecting inadequate core cooling, simply because the measurement is not taken at the position where the cladding exposure occurs. To apply the SCUP methodology, the OECD/NEA ROSA-2 Test 3, an SBLOCA in the hot leg, was selected as the starting point. This experiment was conducted at the Large Scale Test Facility (LSTF), a facility operated by the Japan Atomic Energy Agency (JAEA), and was focused on the assessment of the effectiveness of AM actions triggered by

  8. Development of Audit Calculation Methodology for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joosuk; Kim, Gwanyoung; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

The interim criteria contain more stringent limits than the previous ones. For example, pellet-to-cladding mechanical interaction (PCMI) was introduced as a new failure criterion, and both short-term (e.g., fuel-coolant interaction, rod burst) and long-term (e.g., fuel rod ballooning, flow blockage) phenomena must be addressed to assure core coolability. For dose calculations, transient-induced fission gas release must additionally be accounted for. Traditionally, the RIA analysis methodologies approved for licensing applications were developed with a conservative approach, but the newly introduced safety criteria tend to reduce the margins to the criteria, so licensees are trying to recover margin by using less conservative approaches. To cope with this trend, a new audit calculation methodology needs to be developed. This paper introduces the new methodology currently under development at KINS. For the development of an audit calculation methodology for RIA safety analysis based on a realistic evaluation approach, a preliminary calculation using a best-estimate code has been performed on the initial core of the APR1400. The main conclusions follow. - With the assumption of a single full-strength control rod ejection at HZP conditions, rod failure due to PCMI is not predicted. - Coolability can be assured in terms of enthalpy and fuel melting. - However, rod failure due to DNBR is expected, and fuel failure is also possible at rated power conditions.

  9. A combination of body condition measurements is more informative than conventional condition indices: temporal variation in body condition and corticosterone in brown tree snakes (Boiga irregularis).

    Science.gov (United States)

    Waye, Heather L; Mason, Robert T

    2008-02-01

    The body condition index is a common method for quantifying the energy reserves of individual animals. Because good body condition is necessary for reproduction in many species, body condition indices can indicate the potential reproductive output of a population. Body condition is related to glucocorticoid production, in that low body condition is correlated to high concentrations of corticosterone in reptiles. We compared the body condition index and plasma corticosterone levels of brown tree snakes on Guam in 2003 to those collected in 1992/1993 to determine whether that population still showed the chronic stress and poor condition apparent in the earlier study. We also examined the relationship between fat mass, body condition and plasma corticosterone concentrations as indicators of physiological condition of individuals in the population. Body condition was significantly higher in 2003 than in the earlier sample for mature male and female snakes, but not for juveniles. The significantly lower levels of corticosterone in all three groups in 2003 suggests that although juveniles did not have significantly improved energy stores they, along with the mature males and females, were no longer under chronic levels of stress. Although the wet season of 2002 was unusually rainy, low baseline levels of corticosterone measured in 2000 indicate that the improved body condition of snakes in 2003 is likely the result of long-term changes in prey populations rather than annual variation in response to environmental conditions.
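A residual-based body condition index of the kind this record combines with hormone data can be sketched as follows (an illustrative sketch with invented measurements, not the authors' index or data):

```python
import numpy as np

def condition_index(mass_g, svl_cm):
    """Body condition as residuals from an ordinary least-squares
    regression of log body mass on log snout-vent length: positive
    values mark individuals heavier than expected for their length."""
    x, y = np.log(svl_cm), np.log(mass_g)
    slope, intercept = np.polyfit(x, y, 1)
    return y - (intercept + slope * x)

mass = np.array([80.0, 95.0, 110.0, 150.0, 60.0])  # illustrative snakes
svl = np.array([90.0, 100.0, 105.0, 120.0, 85.0])
bci = condition_index(mass, svl)                   # residual index, mean ~ 0
```

Because the residuals are centred on zero by construction, such an index only ranks individuals within a sample; the paper's point is that pairing it with fat mass and corticosterone gives a more complete physiological picture.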

  10. Water loss in table grapes: model development and validation under dynamic storage conditions

    Directory of Open Access Journals (Sweden)

    Ericsem PEREIRA

    2017-09-01

Full Text Available Abstract Water loss is a critical problem affecting the quality of table grapes, and temperature and relative humidity (RH) are essential in this process. Although mathematical modelling can quantify the impact of constant temperature and RH, variations in storage conditions are normally encountered in the cold chain. This study proposed a methodology to develop a weight loss model for table grapes and validate its predictions under the non-constant conditions of a domestic refrigerator. Grapes were maintained under controlled conditions and the weight loss was measured to calibrate the model. The model described the water loss process adequately and the validation tests confirmed its predictive ability. Delayed cooling tests showed that estimated transpiration rates in the subsequent continuous temperature treatment were not significantly influenced by prior exposure conditions, suggesting that the model may be useful to estimate the weight loss consequences of interruptions in the cold chain.
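A transpiration-driven weight loss model of the generic kind described here can be integrated over a variable storage profile as follows (a sketch under stated assumptions: the Magnus saturation-pressure formula is standard, but the proportionality to vapour pressure deficit and the coefficient `k` are illustrative, not the paper's calibrated model):

```python
import math

def p_sat_kpa(t_c):
    """Saturation vapour pressure (kPa), Magnus approximation."""
    return 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))

def final_mass(mass_g, profile, k=0.05):
    """Integrate transpiration-driven mass loss over a storage profile
    given as (hours, temp_C, RH) segments; the loss in each segment is
    taken proportional to the water vapour pressure deficit."""
    for hours, t_c, rh in profile:
        vpd = p_sat_kpa(t_c) * (1.0 - rh)  # vapour pressure deficit, kPa
        mass_g -= k * vpd * hours          # one Euler step per segment
    return mass_g

# Delayed cooling: 6 h at 25 C / 60% RH before 48 h at 2 C / 90% RH.
final = final_mass(500.0, [(6, 25.0, 0.60), (48, 2.0, 0.90)])
```

Stepping segment by segment is what lets such a model predict the consequences of cold-chain interruptions: the warm, dry segment dominates the loss despite being much shorter.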

  11. Vibration condition measure instrument of motor using MEMS accelerometer

    Science.gov (United States)

    Chen, Jun

    2018-04-01

In this work, a novel vibration-condition measuring instrument for motors using a digital micro accelerometer is proposed. In order to reduce the random noise found in the data, a sensor model is established and a Kalman filter (KMF) is developed. From the filtered data, the maximum vibration displacement is calculated by an integration algorithm with the DC bias removed. A high-performance microcontroller unit (MCU) is used to implement the controller, and the data are transmitted from sensor to controller over the IIC (I2C) digital interface. The hardware circuits of the sensor and microcontroller are designed and tested. With the computational formula for maximum displacement and the FFT, high-precision displacement and frequency results are obtained. Finally, the paper presents various experimental results to prove that this instrument is suitable for electrical motor vibration measurement.
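The processing chain described (Kalman smoothing, DC-bias removal, double integration to displacement, FFT for frequency) can be sketched on synthetic data as follows (an illustrative sketch, not the instrument's firmware; the scalar random-walk filter, sample rate, and noise levels are assumptions):

```python
import numpy as np

def smooth_kalman(z, q=1e-3, r=1e-1):
    """Scalar Kalman filter with a random-walk state model, used here
    to suppress the accelerometer's random noise before integration."""
    x, p, out = 0.0, 1.0, []
    for meas in z:
        p += q               # predict: state variance grows by q
        k = p / (p + r)      # Kalman gain
        x += k * (meas - x)  # update with the new measurement
        p *= (1.0 - k)
        out.append(x)
    return np.array(out)

fs = 1000.0                                        # assumed sample rate, Hz
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(1)
accel = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.normal(size=t.size)

a = smooth_kalman(accel)
a -= a.mean()                                      # remove DC bias before integrating
vel = np.cumsum(a) / fs                            # first integration: velocity
disp = np.cumsum(vel - vel.mean()) / fs            # second integration: displacement

freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.abs(np.fft.rfft(a)).argmax()]  # vibration frequency via FFT
```

Removing the mean before each integration step is essential: any residual DC offset in acceleration grows quadratically in the displacement estimate.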

  12. Methodology for determining influence of organizational culture to business performance

    Directory of Open Access Journals (Sweden)

    Eva Skoumalová

    2007-01-01

Full Text Available The aim of this article is to propose a possible methodology for quantitatively measuring organizational culture using a set of statistical methods. The procedure consists of two major sections. The first covers the classification of organizational culture and the role of quantitative measurement: it includes definitions and several methods used to classify organizational culture (Hofstede; Peters and Waterman; Deal and Kennedy; Edgar Schein; Kotter and Heskett; Lukášová) and the reasons why a measurement perspective is worthwhile. The second section contains a methodology for measuring organizational culture and its impact on organizational performance; we suggest using structural equation modeling for the quantitative assessment of organizational culture.

  13. The Impact of Indoor and Outdoor Radiometer Calibration on Solar Measurements: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron; Sengupta, Manajit; Andreas, Afshin; Reda, Ibrahim; Robinson, Justin

    2016-07-01

    Accurate solar radiation data sets are critical to reducing the expenses associated with mitigating performance risk for solar energy conversion systems, and they help utility planners and grid system operators understand the impacts of solar resource variability. The accuracy of solar radiation measured by radiometers depends on the instrument performance specification, installation method, calibration procedure, measurement conditions, maintenance practices, location, and environmental conditions. This study addresses the effect of calibration methodologies and the resulting calibration responsivities provided by radiometric calibration service providers such as the National Renewable Energy Laboratory (NREL) and manufacturers of radiometers. Some of these radiometers are calibrated indoors, and some are calibrated outdoors. To establish or understand the differences in calibration methodology, we processed and analyzed field-measured data from these radiometers. This study investigates calibration responsivities provided by NREL's broadband outdoor radiometer calibration (BORCAL) and a few prominent manufacturers. The reference radiometer calibrations are traceable to the World Radiometric Reference. These different methods of calibration demonstrated 1% to 2% differences in solar irradiance measurement. Analyzing these values will ultimately assist in determining the uncertainties of the radiometer data and will assist in developing consensus on a standard for calibration.

  14. Influence of wind conditions on wind turbine loads and measurement of turbulence using lidars

    NARCIS (Netherlands)

    Sathe, A.R.

    2012-01-01

    Variations in wind conditions influence the loads on wind turbines significantly. In order to determine these loads it is important that the external conditions are well understood. Wind lidars are well developed nowadays to measure wind profiles upwards from the surface. But how turbulence can be

  15. METHODOLOGIES FOR ASSESSING THE EFFECTIVENESS OF MEDICAL ORGANIZATIONS THAT PROVIDE OUTPATIENT CARE

    Directory of Open Access Journals (Sweden)

    Mikhail Georgievich Karailanov

    2016-08-01

Full Text Available The aim of this study is to analyze the literature in order to define the basic methodological approaches to assessing the effectiveness of health care organizations, as well as the important current problems in studying the effectiveness of primary health care. Primary health care is an integral part of the national health system and the basis of the health care delivery system; it includes measures for the prevention, diagnosis and treatment of diseases and conditions, medical rehabilitation, monitoring of pregnancy, and promotion of healthy lifestyles, including the reduction of risk factors for disease. Assessing the effectiveness of the medical organization remains a priority problem in the modern development of health care. Health management is impossible without identifying priority targets, indicators and parameters and achieving them through the efficient use of financial, material and human resources, which leads to the need for a methodology for assessing the effectiveness of health interventions that links management and planning processes and solves practical problems of the industry.

  16. Evaluation methodology based on physical security assessment results: a utility theory approach

    International Nuclear Information System (INIS)

    Bennett, H.A.; Olascoaga, M.T.

    1978-03-01

    This report describes an evaluation methodology which aggregates physical security assessment results for nuclear facilities into an overall measure of adequacy. This methodology utilizes utility theory and conforms to a hierarchical structure developed by the NRC. Implementation of the methodology is illustrated by several examples. Recommendations for improvements in the evaluation process are given
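The additive aggregation at the heart of such a utility-theory evaluation can be sketched as follows (a hedged illustration: the attribute names, scores, and scaling constants are invented, not the NRC hierarchy from the report):

```python
def overall_adequacy(scores, weights):
    """Additive multi-attribute utility: aggregate per-attribute scores
    (each scaled to [0, 1]) with scaling constants that sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[a] * scores[a] for a in weights)

# Hypothetical attributes and scaling constants for one facility.
facility = {"detection": 0.9, "delay": 0.6, "response": 0.8}
weights = {"detection": 0.5, "delay": 0.2, "response": 0.3}
adequacy = overall_adequacy(facility, weights)  # weighted sum, about 0.81
```

In a hierarchical structure the same aggregation is applied recursively: each node's utility is the weighted sum of its children, yielding a single overall measure of adequacy at the root.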

  17. Pan-oral dose assessment: a comparative report of methodologies

    International Nuclear Information System (INIS)

    Shafford, J.; Pryor, M.; Hollaway, P.; Peet, D.; Oduko, J.

    2015-01-01

    National guidance from the Institute of Physics and Engineering in Medicine (IPEM Report 91) currently recommends that the patient dose for a pan-oral X-ray unit is measured as dose area product (DAP) replacing dose width product described in earlier guidance. An investigation identifying different methods available to carry out this measurement has been undertaken and errors in the methodologies analysed. It has been shown that there may be up to a 30 % variation in DAP measurement between methods. This paper recommends that where possible a DAP meter is used to measure the dose-area product from a pan-oral X-ray unit to give a direct DAP measurement. However, by using a solid-state dose measurement and film/ruler to calculate DAP the authors have established a conversion factor of 1.4. It is strongly recommended that wherever a DAP value is quoted the methodology used to obtain that value is also reported. (authors)
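The indirect DAP calculation the authors compare against a direct DAP-meter reading might look as follows (an illustrative sketch only: the field dimensions and dose are invented, and the direction in which the reported factor of 1.4 is applied is an assumption, not stated in the record):

```python
def dap_from_film(dose_mgy, width_cm, height_cm, factor=1.4):
    """Approximate a direct DAP-meter reading from a solid-state point
    dose and a film/ruler field-size measurement, applying the
    empirical conversion factor of 1.4 quoted in the report."""
    return dose_mgy * width_cm * height_cm * factor

# Illustrative numbers only: 0.05 mGy over a 15 cm x 2 cm measured field.
dap = dap_from_film(0.05, 15.0, 2.0)  # mGy*cm^2
```

As the paper recommends, any quoted DAP value should state which of these methods produced it, since the record reports up to 30% variation between them.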

  18. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  19. Characterizing wood-plastic composites via data-driven methodologies

    Science.gov (United States)

    John G. Michopoulos; John C. Hermanson; Robert Badaliance

    2007-01-01

    The recent increase of wood-plastic composite materials in various application areas has underlined the need for an efficient and robust methodology to characterize their nonlinear anisotropic constitutive behavior. In addition, the multiplicity of various loading conditions in structures utilizing these materials further increases the need for a characterization...

  20. Optimization of synthesis conditions of PbS thin films grown by chemical bath deposition using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Yücel, Ersin, E-mail: dr.ersinyucel@gmail.com [Department of Physics, Faculty of Arts and Sciences, Mustafa Kemal University, 31034 Hatay (Turkey); Yücel, Yasin; Beleli, Buse [Department of Chemistry, Faculty of Arts and Sciences, Mustafa Kemal University, 31034 Hatay (Turkey)

    2015-09-05

Highlights: • For the first time, RSM and CCD were used for the optimization of PbS thin films. • Tri-sodium citrate, deposition time and temperature were the independent variables. • The PbS thin film band gap was 2.20 eV under the optimum conditions. • The quality of the film was improved after chemometric optimization. - Abstract: In this study, PbS thin films were synthesized by chemical bath deposition (CBD) under different deposition parameters. Response surface methodology (RSM) was used to optimize the synthesis parameters, including the amount of tri-sodium citrate (0.2–0.8 mL), deposition time (14–34 h) and deposition temperature (26.6–43.4 °C). A 5-level, 3-factor central composite design (CCD) was employed to evaluate the effects of the deposition parameters on the response (the optical band gap of the films). The significance of both the main effects and the interactions was investigated by analysis of variance (ANOVA). The film structures were characterized by X-ray diffraction (XRD), the morphological properties of the films were studied with a scanning electron microscope (SEM), and the optical properties were investigated using a UV–visible spectrophotometer. The optimum amount of tri-sodium citrate, deposition time and deposition temperature were found to be 0.7 mL, 18.07 h and 30 °C, respectively. Under these conditions, the experimental band gap of PbS was 2.20 eV, in quite good agreement with the value (1.98 eV) predicted by the model.
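The second-order polynomial conventionally fitted to central-composite-design data in RSM can be fitted by least squares as follows (a generic sketch with synthetic factors in coded units, not the paper's design matrix or band-gap data):

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares fit of the full second-order response surface
    y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj), the
    polynomial conventionally fitted to central-composite-design data."""
    n, k = X.shape
    cols = [np.ones(n)]                       # intercept
    cols += [X[:, i] for i in range(k)]       # linear terms
    cols += [X[:, i] ** 2 for i in range(k)]  # quadratic terms
    cols += [X[:, i] * X[:, j]                # interaction terms
             for i in range(k) for j in range(i + 1, k)]
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

# Synthetic 2-factor example with a known surface (coded units).
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(30, 2))
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] ** 2
beta = fit_quadratic_surface(X, y)
```

The fitted surface is then optimized (analytically or numerically) to locate the factor settings that extremize the response, which is how the paper arrives at its optimum citrate amount, time, and temperature.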

  1. Optimization of synthesis conditions of PbS thin films grown by chemical bath deposition using response surface methodology

    International Nuclear Information System (INIS)

    Yücel, Ersin; Yücel, Yasin; Beleli, Buse

    2015-01-01

Highlights: • For the first time, RSM and CCD were used for the optimization of PbS thin films. • Tri-sodium citrate, deposition time and temperature were the independent variables. • The PbS thin film band gap was 2.20 eV under the optimum conditions. • The quality of the film was improved after chemometric optimization. - Abstract: In this study, PbS thin films were synthesized by chemical bath deposition (CBD) under different deposition parameters. Response surface methodology (RSM) was used to optimize the synthesis parameters, including the amount of tri-sodium citrate (0.2–0.8 mL), deposition time (14–34 h) and deposition temperature (26.6–43.4 °C). A 5-level, 3-factor central composite design (CCD) was employed to evaluate the effects of the deposition parameters on the response (the optical band gap of the films). The significance of both the main effects and the interactions was investigated by analysis of variance (ANOVA). The film structures were characterized by X-ray diffraction (XRD), the morphological properties of the films were studied with a scanning electron microscope (SEM), and the optical properties were investigated using a UV–visible spectrophotometer. The optimum amount of tri-sodium citrate, deposition time and deposition temperature were found to be 0.7 mL, 18.07 h and 30 °C, respectively. Under these conditions, the experimental band gap of PbS was 2.20 eV, in quite good agreement with the value (1.98 eV) predicted by the model.

  2. Measurement of Two-Phase Flow Characteristics Under Microgravity Conditions

    Science.gov (United States)

    Keshock, E. G.; Lin, C. S.; Edwards, L. G.; Knapp, J.; Harrison, M. E.; Xhang, X.

    1999-01-01

    This paper describes the technical approach and initial results of a test program for studying two-phase annular flow under the simulated microgravity conditions of KC-135 aircraft flights. A helical coil flow channel orientation was utilized in order to circumvent the restrictions normally associated with drop tower or aircraft flight tests with respect to two-phase flow, namely spatial restrictions preventing channel lengths of sufficient size to accurately measure pressure drops. Additionally, the helical coil geometry is of interest in itself, considering that operating in a microgravity environment vastly simplifies the two-phase flows occurring in coiled flow channels under 1-g conditions for virtually any orientation. Pressure drop measurements were made across four stainless steel coil test sections, having a range of inside tube diameters (0.95 to 1.9 cm), coil diameters (25 - 50 cm), and length-to-diameter ratios (380 - 720). High-speed video photographic flow observations were made in the transparent straight sections immediately preceding and following the coil test sections. A transparent coil of tygon tubing of 1.9 cm inside diameter was also used to obtain flow visualization information within the coil itself. Initial test data has been obtained from one set of KC-135 flight tests, along with benchmark ground tests. Preliminary results appear to indicate that accurate pressure drop data is obtainable using a helical coil geometry that may be related to straight channel flow behavior. Also, video photographic results appear to indicate that the observed slug-annular flow regime transitions agree quite reasonably with the Dukler microgravity map.

  3. Development of a methodology for conducting an integrated HRA/PRA --

    International Nuclear Information System (INIS)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S.; Wreathall, J.; Cooper, S.E.

    1993-01-01

During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S-related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S-related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR)

  4. A review of the current state-of-the-art methodology for handling bias and uncertainty in performing criticality safety evaluations. Final report

    International Nuclear Information System (INIS)

    Disney, R.K.

    1994-10-01

    The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a "site" perception to a more uniform or "national" perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.

  5. Navigating the Process of Ethical Approval: A methodological note

    Directory of Open Access Journals (Sweden)

    Eileen Carey, RNID, BSc (Hons), MSc

    2010-12-01

    Classic grounded theory (CGT) methodology is a general methodology whereby the researcher aims to develop an emergent conceptual theory from empirical data collected by the researcher during the research study. Gaining ethical approval from relevant ethics committees to access such data is the starting point for a CGT study. The adoption of the Universal Declaration on Bioethics and Human Rights (UNESCO, 2005) is an indication of global consensus on the importance of research ethics. There is, however, a wide variation of health research systems across countries and disciplines (Hearnshaw, 2004). Institutional Research Boards (IRBs) or Research Ethics Committees (RECs) have been established in many countries to regulate ethical research, ensuring that researchers agree to, and adhere to, specific ethical and methodological conditions before ethical approval is granted. Interestingly, both the processes and outcomes through which the methodological aspects pertinent to CGT studies are agreed between the researcher and the ethics committee remain largely ambiguous and vague. Therefore, meeting the requirements for ethical approval from ethics committees while enlisting CGT methodology as the chosen research approach can be daunting for novice researchers embarking upon their first CGT study.

  6. Methodology for completing Hanford 200 Area tank waste physical/chemical profile estimations

    International Nuclear Information System (INIS)

    Kruger, A.A.

    1996-01-01

    The purpose of the Methodology for Completing Hanford 200 Area Tank Waste Physical/Chemical Profile Estimations is to capture the logic inherent to completing 200 Area waste tank physical and chemical profile estimates. Since there has been good correlation between the estimate profiles and actual conditions during sampling and sub-segment analysis, it is worthwhile to document the current estimate methodology

  7. CUEX methodology for assessing radiological impacts in the context of ICRP Recommendations

    International Nuclear Information System (INIS)

    Rohwer, P.S.; Kaye, S.V.; Struxness, E.G.

    1975-01-01

    The Cumulative Exposure Index (CUEX) methodology was developed to estimate and assess, in the context of International Commission on Radiological Protection (ICRP) Recommendations, the total radiation dose to man due to environmental releases of radioactivity from nuclear applications. Each CUEX, a time-integrated radionuclide concentration (e.g., μCi·h·cm⁻³), reflects the selected annual dose limit for the reference organ and the estimated total dose to that organ via all exposure modes for a specific exposure situation. To assess the radiological significance of an environmental release of radioactivity, calculated or measured radionuclide concentrations in a suitable environmental sampling medium are compared with CUEXs determined for that medium under comparable conditions. The models and computer codes used in the CUEX methodology to predict environmental transport and to estimate radiation dose have been thoroughly tested. These models and codes are identified and described briefly. Calculation of a CUEX is shown step by step. An application of the methodology to a hypothetical atmospheric release involving four radionuclides illustrates use of the CUEX computer code to assess the radiological significance of a release, and to determine the relative importance (i.e. percentage of the estimated total dose contributed) of each radionuclide and each mode of exposure. The data requirements of the system are shown to be extensive, but not excessive in view of the assessments and analyses provided by the CUEX code. (author)
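    The comparison step described above can be sketched in a few lines: each measured (or predicted) time-integrated concentration is divided by the corresponding CUEX, the summed ratio indicates whether the combined release respects the selected dose limit under a linear, additive dose model, and the individual ratios give each radionuclide's relative contribution. A minimal sketch, with all radionuclide names and values invented for illustration:

```python
# Hedged sketch of a CUEX comparison; nuclide names and all numbers are
# illustrative, not taken from the CUEX system described in the abstract.

def cuex_assessment(measured, cuex):
    """measured, cuex: {nuclide: time-integrated concentration, uCi*h*cm^-3}.
    Returns the summed ratio (<= 1 implies the selected dose limit is
    respected under an additive dose model) and each nuclide's share of it."""
    ratios = {n: measured[n] / cuex[n] for n in measured}
    total = sum(ratios.values())
    shares = {n: r / total for n, r in ratios.items()}
    return total, shares

# Example: three nuclides, invented values.
measured = {"Cs-137": 2e-3, "Sr-90": 5e-4, "I-131": 1e-4}
cuex = {"Cs-137": 1e-2, "Sr-90": 1e-2, "I-131": 1e-3}
total, shares = cuex_assessment(measured, cuex)
```

    The share dictionary directly answers the "relative importance of each radionuclide" question mentioned in the abstract.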

  8. Solubility measurement of iron-selenium compounds under reducing conditions. Research document

    International Nuclear Information System (INIS)

    Kitamura, Akira; Shibata, Masahiro

    2003-03-01

    Chemical behavior of selenium (Se), one of the important elements for the performance assessment of geological disposal of high-level radioactive waste, was investigated under reducing and iron-containing conditions. A washing method for an iron diselenide (FeSe₂(cr)) reagent with acidic and basic solutions (0.1 and 1 M HCl and 1 M NaOH) was carried out for the purification of the FeSe₂ reagent, which was considered to be a solubility-limiting solid for Se under geological disposal conditions. Furthermore, the solubility of FeSe₂(cr) was measured in alkaline solution (pH 11-13) under reducing conditions (E_h vs. SHE: −0.4 to 0 V), and thermodynamic data on equilibrium reactions between Se in solution and Se precipitate were obtained. The dependence of the solubility values on pH and redox potential (E_h, vs. the standard hydrogen electrode) was best interpreted as indicating that the solubility-limiting solid was not FeSe₂(cr) but Se(cr), and that the aqueous species was SeO₃²⁻ under the present experimental conditions. The equilibrium constant between Se(cr) and SeO₃²⁻ at zero ionic strength was determined and compared with literature values. The chemical behavior of Se under geological disposal conditions was discussed. (author)
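    The interpretation above, with Se(cr) as the solubility-limiting solid and SeO₃²⁻ as the dominant aqueous species, implies a coupled pH and redox dependence of the solubility. As a hedged sketch (the assumed half-reaction and sign conventions below are a standard textbook form, not necessarily those used in the report):

```latex
% Assumed half-reaction:  Se(cr) + 3 H2O  <=>  SeO3^{2-} + 6 H+ + 4 e-
% giving, at zero ionic strength,
\[
  \log \left[\mathrm{SeO_3^{2-}}\right] = \log K + 6\,\mathrm{pH} + 4\,\mathrm{pe},
  \qquad
  \mathrm{pe} = \frac{F E_h}{RT \ln 10},
\]
% which is consistent with Se solubility increasing with both pH and
% redox potential in the measured ranges.
```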

  9. EnergiTools. A methodology for performance monitoring and diagnosis

    International Nuclear Information System (INIS)

    Ancion, P.; Bastien, R.; Ringdahl, K.

    2000-01-01

    EnergiTools is a performance monitoring and diagnostic tool that combines the power of on-line process data acquisition with advanced diagnosis methodologies. Analytical models based on thermodynamic principles are combined with neural networks to validate sensor data and to estimate missing or faulty measurements. Advanced diagnostic technologies are then applied to point out potential faults and areas to be investigated further. The diagnosis methodologies are based on Bayesian belief networks. Expert knowledge is captured in the form of fault-symptom relationships and includes historical information such as the likelihood of faults and symptoms. The methodology produces the likelihood of component failure root causes using the expert knowledge base. EnergiTools is used at the Ringhals nuclear power plants. It has led to the diagnosis of various performance issues. Three case studies based on data and models from this plant are presented and illustrate the diagnosis support methodologies implemented in EnergiTools. In the first case, the analytical data qualification technique points out several faulty measurements. The application of a neural network for the estimation of the nuclear reactor power by interpreting several plant indicators is then illustrated. The use of Bayesian belief networks is finally described. (author)
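    The Bayesian update at the heart of such a fault-symptom network can be illustrated with a toy single-fault model (the faults, symptoms and probabilities below are invented for illustration and are not EnergiTools' actual knowledge base):

```python
# Toy single-fault Bayesian diagnosis: posterior over fault hypotheses given
# observed symptoms. All names and probabilities are invented.

def fault_posteriors(priors, likelihoods, observed):
    """priors: {fault: P(fault)};
    likelihoods: {fault: {symptom: P(symptom|fault)}};
    observed: {symptom: True/False}. Returns normalized posteriors."""
    joint = {}
    for fault, prior in priors.items():
        p = prior
        for symptom, present in observed.items():
            ps = likelihoods[fault][symptom]
            p *= ps if present else (1.0 - ps)
        joint[fault] = p
    z = sum(joint.values())
    return {fault: p / z for fault, p in joint.items()}

priors = {"fouled_condenser": 0.10, "sensor_drift": 0.05, "no_fault": 0.85}
likelihoods = {
    "fouled_condenser": {"low_thermal_efficiency": 0.90, "noisy_reading": 0.10},
    "sensor_drift":     {"low_thermal_efficiency": 0.40, "noisy_reading": 0.80},
    "no_fault":         {"low_thermal_efficiency": 0.05, "noisy_reading": 0.05},
}
observed = {"low_thermal_efficiency": True, "noisy_reading": False}
posteriors = fault_posteriors(priors, likelihoods, observed)
```

    With these invented numbers the fouled-condenser hypothesis dominates, illustrating how symptom evidence reshapes the prior likelihoods of faults.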

  10. Methodologic considerations in the measurement of glycemic index: glycemic response to rye bread, oatmeal porridge, and mashed potato.

    Science.gov (United States)

    Hätönen, Katja A; Similä, Minna E; Virtamo, Jarmo R; Eriksson, Johan G; Hannila, Marja-Leena; Sinkko, Harri K; Sundvall, Jouko E; Mykkänen, Hannu M; Valsta, Liisa M

    2006-11-01

    Methodologic choices affect measures of the glycemic index (GI). The effects on GI values of blood sampling site, reference food type, and the number of repeat tests have been insufficiently determined. The objective was to study the effect of methodologic choices on GI values. Comparisons were made between venous and capillary blood sampling and between glucose and white bread as the reference food. The number of tests needed for the reference food was assessed. Rye bread, oatmeal porridge, and instant mashed potato were used as the test foods. Twelve healthy volunteers were served each test food once and both reference foods 3 times at 1-wk intervals in a random order after they had fasted overnight. Capillary and venous blood samples were drawn at intervals for 3 h after each study meal. GIs and their CVs based on capillary samples were lower than those based on venous samples. Two tests of glucose solution as the reference provided stable capillary GIs for the test foods. The capillary GIs did not differ significantly when white bread was used as the reference 1, 2, or 3 times, but the variation was lower when tests were performed 2 and 3 times. Capillary GIs with white bread as the reference were 1.3 times as high as those with glucose as the reference. The capillary GIs of rye bread, oatmeal porridge, and mashed potato were 77, 74, and 80, respectively, with glucose as the reference. Capillary blood sampling should be used in the measurement of GI, and reference tests with glucose or white bread should be performed at least twice.
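    The GI arithmetic that such methodologic choices feed into is itself simple: the incremental area under each postprandial glucose curve is computed above the fasting baseline, and the test food's area is expressed as a percentage of the mean area over the repeated reference tests. A minimal sketch (sampling times and glucose values below are invented, not the study's data):

```python
# Hedged sketch of a GI calculation via incremental AUC (trapezoidal rule,
# excursions below the fasting baseline truncated to zero). Data invented.

def incremental_auc(times_min, glucose):
    baseline = glucose[0]
    heights = [max(g - baseline, 0.0) for g in glucose]
    return sum(0.5 * (h0 + h1) * (t1 - t0)
               for t0, t1, h0, h1 in zip(times_min, times_min[1:],
                                         heights, heights[1:]))

def glycemic_index(test_curve, reference_curves, times_min):
    """GI = 100 * iAUC(test) / mean iAUC of the repeated reference tests."""
    ref_mean = (sum(incremental_auc(times_min, c) for c in reference_curves)
                / len(reference_curves))
    return 100.0 * incremental_auc(times_min, test_curve) / ref_mean

times = [0, 30, 60, 90, 120]
reference = [5.0, 8.0, 7.0, 6.0, 5.0]   # invented glucose curve, mmol/L
test_food = [5.0, 6.5, 6.0, 5.5, 5.0]
gi = glycemic_index(test_food, [reference, reference], times)
```

    Averaging over repeated reference tests, as in `reference_curves`, is exactly why the abstract recommends performing the reference test at least twice.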

  11. LWR design decision methodology. Phase III. Final report

    International Nuclear Information System (INIS)

    Bertucio, R.; Held, J.; Lainoff, S.; Leahy, T.; Prather, W.; Rees, D.; Young, J.

    1982-01-01

    Traditionally, management decisions regarding design options have been made using quantitative cost information and qualitative safety information. A Design Decision Methodology, which utilizes probabilistic risk assessment techniques, including event trees and fault trees, along with systems engineering and standard cost estimation methods, has been developed so that a quantitative safety measure may be obtained as well. The report documents the development of this Design Decision Methodology, a demonstration of the methodology on a current licensing issue with the cooperation of the Washington Public Power Supply System (WPPSS), and a discussion of how the results of the demonstration may be used in addressing the various issues associated with a licensing position on the issue.

  12. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    International Nuclear Information System (INIS)

    Licu, Tony; Cioran, Florin; Hayward, Brent; Lowe, Andrew

    2007-01-01

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture in which people are encouraged to provide full and open information about how incidents occurred, and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability

  13. PIV measurements in a compact return diffuser under multi-conditions

    International Nuclear Information System (INIS)

    Zhou, L; Lu, W G; Shi, W D

    2013-01-01

    Due to the complex three-dimensional geometries of impellers and diffusers, their design is a delicate and difficult task. Slight changes can lead to significant changes in hydraulic performance and internal flow structure. Conversely, a grasp of the pump's internal flow pattern can benefit pump design improvement. The internal flow fields in a compact return diffuser have been investigated experimentally under multiple operating conditions. A special Particle Image Velocimetry (PIV) test rig was designed, and two-dimensional PIV measurements were successfully conducted in the diffuser mid-plane to capture the complex flow patterns. The analysis of the obtained results focuses on the flow structure in the diffuser, especially under part-load conditions. The vortex and recirculation flow patterns in the diffuser are captured and analysed accordingly. Under the design and over-load conditions, the flow fields in the diffuser are uniform; under part-load conditions, strong flow separation and back flow appear, with strong back flow captured in one diffuser passage at 0.2Q_des.

  14. Organic and total mercury determination in sediments by cold vapor atomic absorption spectrometry: methodology validation and uncertainty measurements

    Directory of Open Access Journals (Sweden)

    Robson L. Franklin

    2012-01-01

    The purpose of the present study was to validate a method for organic Hg determination in sediment. The procedure for organic Hg was adapted from the literature: the organomercurial compounds were extracted with dichloromethane in acid medium, with subsequent destruction of the organic compounds by bromine chloride. Total Hg determination was performed according to the USEPA 3051A methodology. Mercury quantification for both methodologies was then performed by CVAAS. Methodology validation was verified by analyzing certified reference materials for total Hg and methylmercury. The uncertainties for both methodologies were calculated. A quantification limit of 3.3 µg kg⁻¹ was found for organic Hg by CVAAS.

  15. Methodologic assessment of radiation epidemiology studies

    International Nuclear Information System (INIS)

    Beebe, G.W.

    1983-01-01

    Epidemiologic studies of the late effects of ionizing radiation have utilized the entire spectrum of situations in which man has been exposed. These studies have provided insights into the dependence of human effects upon not only dose to target tissues but also other dimensions of exposure, host characteristics, and time following exposure. Over the past three decades studies have progressed from the mere identification of effects to their measurement. Because investigators of human effects have no control over the exposure situation, validity must be sought in the consistency of findings among independent studies and with accepted biologic principles. Because exposure may be confounded with factors that are hidden from view, bias may enter into any study of human exposure. Avoidance of bias and attainment of sufficient power to detect relationships that are real are methodologic challenges. Many methodologic issues, e.g., those associated with the definition and measurement of specific end-points, or with the selection of appropriate controls, permeate epidemiologic work in all fields. Others, especially those concerned with the measurement of exposure, the patterning of events in time after exposure, and the prediction of events beyond the scope of existing observations give radiation epidemiology its distinctive character

  16. Continuous culture apparatus and methodology

    International Nuclear Information System (INIS)

    Conway, H.L.

    1975-01-01

    At present, we are investigating the sorption of potentially toxic trace elements by phytoplankton under controlled laboratory conditions. Continuous culture techniques were used to study the mechanism of the sorption of the trace elements by unialgal diatom populations and the factors influencing this sorption. Continuous culture methodology has been used extensively to study bacterial kinetics. It is an excellent technique for obtaining a known physiological state of phytoplankton populations. An automated method for the synthesis of continuous culture medium for use in these experiments is described

  17. Measurement of crankshaft angular speed of an OM403 engine

    Directory of Open Access Journals (Sweden)

    Biočanin Stojko

    2017-01-01

    In this paper, the methodology for measuring the angular speed of the crankshaft of a ten-cylinder OM403 diesel engine is presented, for both regular and irregular engine operation. The angular speed was measured under laboratory conditions, using measuring equipment already installed in the laboratory and a brake of the well-known brand Schenck, by means of an optoelectronic incremental rotary encoder, a data acquisition module and the LabVIEW software for synchronization and management of the measuring equipment. The goal of this paper is to make a practical contribution to research on measuring the crankshaft angular speed of the OM403 engine.
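    The core reduction behind such an encoder-based measurement is the conversion of pulse timestamps into an angular speed over each inter-pulse interval. A minimal sketch (the encoder resolution and timing values are illustrative, not those of the laboratory setup described above):

```python
import math

# Mean angular speed over each interval between incremental-encoder pulses.
# Resolution and timestamps below are invented for illustration.

def angular_speeds_rad_s(pulse_times_s, pulses_per_rev):
    dtheta = 2.0 * math.pi / pulses_per_rev   # angle per pulse, rad
    return [dtheta / (t1 - t0)
            for t0, t1 in zip(pulse_times_s, pulse_times_s[1:])]

# One revolution at a constant 100 rad/s with a 360-pulse/rev encoder:
ppr = 360
dt = (2.0 * math.pi / ppr) / 100.0
speeds = angular_speeds_rad_s([i * dt for i in range(ppr + 1)], ppr)
```

    On a real engine, the interval-to-interval variation of these speeds is what reveals the cyclic irregularity caused by individual cylinder firings.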

  18. Atmospheric conditions measured by a wireless sensor network on the local scale

    Science.gov (United States)

    Lengfeld, K.; Ament, F.

    2010-09-01

    Atmospheric conditions close to the surface, like temperature, wind speed and humidity, vary on small scales because of surface heterogeneities. Therefore, the traditional measuring approach of using a single, highly accurate station is of limited representativeness for a larger domain, because it is not able to capture these small-scale variabilities. However, both the variability and the domain averages are important information for the development and validation of atmospheric models and soil-vegetation-atmosphere-transfer (SVAT) schemes. Due to progress in microelectronics it is possible to construct networks of comparably cheap meteorological stations with moderate accuracy. Such a network provides data in high spatial and temporal resolution. EPFL Lausanne developed such a network, called SensorScope, consisting of low-cost autonomous stations. Each station observes air and surface temperature, humidity, wind direction and speed, incoming solar radiation, precipitation, soil moisture and soil temperature and sends the data via radio communication to a base station. This base station forwards the collected data via GSM/GPRS to a central server. The first measuring campaign took place within the FLUXPAT project in August 2009. We deployed 15 stations as a twin transect near Jülich, Germany. To test the quality of the low-cost sensors we compared two of them to more accurate reference systems. It turned out that, although the network sensors are not highly accurate, the measurements are consistent. Consequently an analysis of the pattern of atmospheric conditions is feasible. The transect is 2.3 km long and covers different types of vegetation and a small river. Therefore, we analyse the influence of different land surfaces and the distance to the river on meteorological conditions. For example, we found a difference in air temperature of 0.8°C between the station closest to and the station farthest from the river. The decreasing relative humidity with

  19. Development of risk assessment methodology against natural external hazards for sodium-cooled fast reactors: project overview and strong Wind PRA methodology - 15031

    International Nuclear Information System (INIS)

    Yamano, H.; Nishino, H.; Kurisaka, K.; Okano, Y.; Sakai, T.; Yamamoto, T.; Ishizuka, Y.; Geshi, N.; Furukawa, R.; Nanayama, F.; Takata, T.; Azuma, E.

    2015-01-01

    This paper describes mainly strong wind probabilistic risk assessment (PRA) methodology development, in addition to an overview of the project. In this project, to date, PRA methodologies for snow, tornado and strong wind hazards were developed, as well as the corresponding hazard evaluation methodologies. For the volcanic eruption hazard, an ash fallout simulation was carried out to contribute to the development of the hazard evaluation methodology. For the forest fire hazard, the concept of the hazard evaluation methodology was developed based on fire simulation. An event sequence assessment methodology was also developed based on plant dynamics analysis coupled with a continuous Markov chain Monte Carlo method, in order to apply it to the event sequence for snow. In developing the strong wind PRA methodology, hazard curves were estimated by using Weibull and Gumbel distributions based on weather data recorded in Japan. The obtained hazard curves were divided into five discrete categories for event tree quantification. Next, failure probabilities for decay heat removal related components were calculated as a product of two probabilities: i.e., the probability for missiles to enter the intake or out-take in the decay heat removal system, and the fragility caused by the missile impacts. Finally, based on the event tree, the core damage frequency was estimated at about 6×10⁻⁹ per year by multiplying the discrete hazard probabilities in the Gumbel distribution by the conditional decay heat removal failure probabilities. A dominant sequence resulted from the assumption that the operators could not extinguish a fuel tank fire caused by the missile impacts and that the fire induced loss of the decay heat removal system. (authors)
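    The quantification chain summarized above (hazard curve, discrete wind-speed categories, conditional failure probabilities, core damage frequency) can be sketched as follows; the Gumbel parameters, category bounds and failure probabilities are invented for illustration and are not the project's values:

```python
import math

# Toy strong-wind PRA quantification: a Gumbel hazard curve is discretised
# into wind-speed categories, and core damage frequency (CDF) is the sum of
# each category's annual frequency times a conditional failure probability.
# All numbers are invented.

def gumbel_exceedance(v, mu, beta):
    """Annual probability that the yearly maximum wind speed exceeds v."""
    return 1.0 - math.exp(-math.exp(-(v - mu) / beta))

def core_damage_frequency(categories, mu, beta, conditional_failure):
    cdf = 0.0
    for (v_lo, v_hi), p_fail in zip(categories, conditional_failure):
        annual_freq = (gumbel_exceedance(v_lo, mu, beta)
                       - gumbel_exceedance(v_hi, mu, beta))
        cdf += annual_freq * p_fail
    return cdf

mu, beta = 25.0, 5.0                       # invented Gumbel fit, m/s
categories = [(30.0, 40.0), (40.0, 50.0), (50.0, 60.0), (60.0, float("inf"))]
p_fail = [1e-6, 1e-5, 1e-4, 1e-3]          # invented conditional probabilities
cdf = core_damage_frequency(categories, mu, beta, p_fail)
```

    The rarest, most damaging categories can dominate the sum even though their annual frequency is tiny, which is why the hazard-curve tail matters.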

  20. Distant Measurement of Plethysmographic Signal in Various Lighting Conditions Using Configurable Frame-Rate Camera

    Directory of Open Access Journals (Sweden)

    Przybyło Jaromir

    2016-12-01

    Videoplethysmography is currently recognized as a promising noninvasive heart rate measurement method, advantageous for ubiquitous monitoring of humans in natural living conditions. Although the method is considered for application in several areas, including telemedicine, sports and assisted living, its dependence on lighting conditions and camera performance has still not been sufficiently investigated. In this paper we report on research into various image acquisition aspects, including the lighting spectrum, frame rate and compression. In the experimental part, we recorded five video sequences in various lighting conditions (fluorescent artificial light, dim daylight, infrared light, incandescent light bulb) using a programmable frame-rate camera, with a pulse oximeter as the reference. For video sequence-based heart rate measurement we implemented a pulse detection algorithm based on the power spectral density, estimated using Welch's technique. The results showed that lighting conditions and selected video camera settings, including compression and the sampling frequency, influence the heart rate detection accuracy. The average heart rate error varies from 0.35 beats per minute (bpm) for fluorescent light to 6.6 bpm for dim daylight.
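    The spectral pulse-detection idea can be illustrated with a plain periodogram and a physiological band limit (the paper uses Welch's averaged-periodogram estimate; the signal below is synthetic and the band edges are common choices, not the authors' exact settings):

```python
import numpy as np

# Toy PSD-based pulse-rate estimator: peak spectral frequency within a
# physiological band, converted to beats per minute. Signal is synthetic.

def estimate_heart_rate_bpm(signal, fs, band=(0.7, 4.0)):
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                      # remove DC before the spectrum
    psd = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return 60.0 * freqs[in_band][np.argmax(psd[in_band])]

fs = 30.0                                 # 30 fps camera
t = np.arange(0, 20, 1.0 / fs)            # 20 s recording
signal = 1.0 + 0.05 * np.sin(2 * np.pi * 1.2 * t)   # 1.2 Hz pulse = 72 bpm
hr = estimate_heart_rate_bpm(signal, fs)
```

    The band limit is what keeps low-frequency illumination drift and high-frequency sensor noise from being mistaken for the pulse peak.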

  1. Thermodynamic analysis of multicomponent distillation columns: identifying optimal feed conditions

    Directory of Open Access Journals (Sweden)

    M. L.O. Maia

    2000-12-01

    A new methodology for the optimisation of feed conditions, as well as the calculation of the minimum reflux ratio of distillation columns, is presented. The reversible profile approach used for saturated liquid feeds is extended to consider other feed conditions. For a flashed (partially vaporized) feed, the liquid fraction of the feed stream is used to compute the column pinch conditions and the minimum reflux ratio. The modifications required for subcooled liquid and superheated vapor feeds are discussed, and a procedure to estimate the minimum reflux for those conditions is proposed. The methodology presented allows the identification of the optimal feed condition without having to resort to a full stage-by-stage procedure.
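    For reference, the feed-condition bookkeeping underlying such a methodology is conventionally expressed through the feed quality q (this is the standard distillation definition, quoted here as background rather than from the paper):

```latex
% Feed quality q: moles of saturated liquid added to the stripping section
% per mole of feed, from an enthalpy balance over the feed stage.
\[
  q = \frac{H_V - H_F}{H_V - H_L}
\]
% q = 1: saturated liquid;  0 < q < 1: partially vaporized (flashed) feed,
% with q equal to the liquid fraction;  q = 0: saturated vapor;
% q > 1: subcooled liquid;  q < 0: superheated vapor.
```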

  2. Thermal-Diffusivity Measurements of Mexican Citrus Essential Oils Using Photoacoustic Methodology in the Transmission Configuration

    Science.gov (United States)

    Muñoz, G. A. López; González, R. F. López; López, J. A. Balderas; Martínez-Pérez, L.

    2011-05-01

    Photoacoustic methodology in the transmission configuration (PMTC) was used to study the thermophysical properties of Mexican citrus essential oils and their relation with composition, demonstrating the viability of using photothermal techniques for quality control, authentication of the oils, and detection of adulteration. Linear relations for the amplitude (on a semi-log scale) and phase, as functions of the sample's thickness, were obtained through a theoretical model fit to the experimental data for thermal-diffusivity measurements in Mexican orange, pink grapefruit, mandarin and lime type A centrifuged essential oils, and in Mexican distilled lime essential oil. Gas chromatography for distilled lime essential oil and centrifuged lime essential oil type A is reported to complement the study. Experimental results showed close thermal-diffusivity values among the Mexican citrus essential oils obtained by centrifugation, but a significant difference in this physical property between distilled lime oil and the corresponding centrifuged oil, which is due to the different chemical compositions resulting from the extraction processes.
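    The thermal-diffusivity extraction from such a linear amplitude relation can be sketched as follows: in the thermally thick regime the photoacoustic amplitude decays as ln A = C − l·√(πf/α), so a linear fit of ln A against thickness l yields α = πf/slope². The data below are synthetic, and the regime assumption is the usual textbook one rather than the paper's stated model:

```python
import math

# Hedged sketch: recover thermal diffusivity from the slope of ln(amplitude)
# versus sample thickness, assuming ln A = C - l*sqrt(pi*f/alpha).
# Thicknesses and amplitudes below are synthetic.

def fit_slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def thermal_diffusivity_m2_s(thicknesses_m, ln_amplitudes, f_hz):
    slope = fit_slope(thicknesses_m, ln_amplitudes)
    return math.pi * f_hz / slope ** 2

# Synthetic data generated with alpha = 8e-8 m^2/s at f = 1 Hz:
f_hz, alpha_true = 1.0, 8e-8
slope_true = -math.sqrt(math.pi * f_hz / alpha_true)
thicknesses = [50e-6, 100e-6, 150e-6, 200e-6]
ln_amps = [2.0 + slope_true * l for l in thicknesses]
alpha = thermal_diffusivity_m2_s(thicknesses, ln_amps, f_hz)
```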

  3. Effects of drop size and measuring condition on static contact angle measurement on a superhydrophobic surface with goniometric technique

    International Nuclear Information System (INIS)

    Seo, Kwangseok; Kim, Minyoung; Kim, Do Hyun; Ahn, Jeong Keun

    2015-01-01

    It is not a simple task to measure the contact angle of a water drop on a superhydrophobic surface with the sessile drop method, because the roll-off angle is very low. Usually, the contact angle of a water drop on a superhydrophobic surface is measured by fixing the drop with intentional defects on the surface or with a needle. We examined the effects of drop size and measuring condition, such as the use of a needle or defects, on the static contact angle measurement on a superhydrophobic surface. Results showed that the contact angles on a superhydrophobic surface remain almost constant within intrinsic measurement errors unless there is a wetting transition during the measurement. We expect that this study will provide a deeper understanding of the nature of the contact angle and more convenient measurement of the contact angle on superhydrophobic surfaces.

  4. Cryogenic Insulation Standard Data and Methodologies Project

    Science.gov (United States)

    Summerfield, Burton; Thompson, Karen; Zeitlin, Nancy; Mullenix, Pamela; Fesmire, James; Swanger, Adam

    2015-01-01

    Extending some recent developments in the area of technical consensus standards for cryogenic thermal insulation systems, a preliminary Inter-Laboratory Study of foam insulation materials was performed by NASA Kennedy Space Center and LeTourneau University. The initial focus was ambient-pressure cryogenic boiloff testing using the Cryostat-400 flat-plate instrument. Completion of a test facility at LETU has enabled direct, comparative testing, using identical cryostat instruments and methods, and the production of standard thermal data sets for a number of materials under sub-ambient conditions. The two sets of measurements were analyzed and indicate that there is reasonable agreement between the two laboratories. Based on cryogenic boiloff calorimetry, new equipment and methods for testing thermal insulation systems have been successfully developed. These boiloff instruments (or cryostats) include both flat-plate and cylindrical models and are applicable to a wide range of different materials under a wide range of test conditions. Test measurements are generally made at a large temperature difference (boundary temperatures of 293 K and 78 K are typical) and include the full vacuum pressure range. Results are generally reported as effective thermal conductivity (ke) and mean heat flux (q) through the insulation system. The new cryostat instruments provide an effective and reliable way to characterize the thermal performance of materials under sub-ambient conditions. Proven through thousands of tests of hundreds of material systems, they have supported a wide range of aerospace, industry, and research projects. Boiloff testing technology is not just for cryogenic testing; when adequately coupled with a technical standards basis, it provides a cost-effective, field-representative methodology to test any material or system for applications at sub-ambient temperatures.

  5. Classicality condition on a system observable in a quantum measurement and a relative-entropy conservation law

    Science.gov (United States)

    Kuramochi, Yui; Ueda, Masahito

    2015-03-01

    We consider the information flow on a system observable X corresponding to a positive-operator-valued measure under a quantum measurement process Y described by a completely positive instrument, from the viewpoint of the relative entropy. We establish a sufficient condition for the relative-entropy conservation law, which states that the average decrease in the relative entropy of the system observable X equals the relative entropy of the measurement outcome of Y, i.e., the information gain due to measurement. This sufficient condition is interpreted as an assumption of classicality in the sense that there exists a sufficient statistic in a joint successive measurement of Y followed by X such that the probability distribution of the statistic coincides with that of a single measurement of X for the premeasurement state. We show that in the case when X is a discrete projection-valued measure and Y is discrete, the classicality condition is equivalent to relative-entropy conservation for arbitrary states. The general theory on relative-entropy conservation is applied to typical quantum measurement models, namely, quantum nondemolition measurement, destructive sharp measurements on two-level systems, photon counting, quantum counting, and homodyne and heterodyne measurements. These examples, except for the nondemolition and photon-counting measurements, do not satisfy the known Shannon-entropy conservation law proposed by Ban [M. Ban, J. Phys. A: Math. Gen. 32, 1643 (1999), 10.1088/0305-4470/32/9/012], implying that our approach based on the relative entropy is applicable to a wider class of quantum measurements.
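    One way to write down the conservation law paraphrased above, for two premeasurement states ρ and σ, outcome distributions p^X and p^Y, and postmeasurement states ρ_y, σ_y (a hedged paraphrase of the abstract, not the paper's exact statement or notation):

```latex
% D(p || q) is the classical relative entropy between outcome distributions.
\[
  D\!\left(p^{X}_{\rho} \,\middle\|\, p^{X}_{\sigma}\right)
  - \sum_{y} p^{Y}_{\rho}(y)\,
    D\!\left(p^{X}_{\rho_{y}} \,\middle\|\, p^{X}_{\sigma_{y}}\right)
  = D\!\left(p^{Y}_{\rho} \,\middle\|\, p^{Y}_{\sigma}\right)
\]
% Left side: average decrease in the relative entropy of the system
% observable X; right side: information gain from the outcome of Y.
```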

  6. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  7. A stochastic optimal feedforward and feedback control methodology for superagility

    Science.gov (United States)

    Halyo, Nesim; Direskeneli, Haldun; Taylor, Deborah B.

    1992-01-01

    A new control design methodology is developed: Stochastic Optimal Feedforward and Feedback Technology (SOFFT). Traditional design techniques optimize a single cost function (which expresses the design objectives) to obtain both the feedforward and feedback control laws. This approach places conflicting demands on the control law, such as fast tracking versus noise attenuation/disturbance rejection. In the SOFFT approach, two cost functions are defined. The feedforward control law is designed to optimize one cost function; the feedback law optimizes the other. By separating the design objectives and decoupling the feedforward and feedback design processes, both objectives can be achieved fully. A new measure of command tracking performance, Z-plots, is also developed. By analyzing these plots at off-nominal conditions, the sensitivity or robustness of the system in tracking commands can be predicted. Z-plots provide an important tool for designing robust control systems. The Variable-Gain SOFFT methodology was used to design a flight control system for the F/A-18 aircraft. It is shown that SOFFT can be used to expand the operating regime and provide greater performance (flying/handling qualities) throughout the extended flight regime. This work was performed under the NASA SBIR program. ICS plans to market the software developed as a new module in its commercial CACSD software package: ACET.

  8. Decision for counting condition of radioactive waste activities measuring by Ludlum detector

    International Nuclear Information System (INIS)

    Bambang-Purwanto

    2000-01-01

    The activity of radioactive waste must be measured before the waste is released to the environment: waste with activity above the threshold limit value must be processed, while waste below it may be discharged. The activities of solid and liquid radioactive waste were measured with (total, β, γ) Ludlum detectors connected to a Model-1000 scaler counter. Before measuring the solid waste activities, the optimum counting conditions were determined: a sample weight of 3.5 gram, a heating temperature of 125 °C and a heating time of 60 minutes. The measured activities range from (0.68-0.71)×10⁻¹ μCi/gram with the total detector, (0.24-0.25)×10⁻¹ μCi/gram with the β detector and (0.35-0.37) μCi/gram with the γ detector

  9. Thermoregulatory responses in exercising rats: methodological aspects and relevance to human physiology.

    Science.gov (United States)

    Wanner, Samuel Penna; Prímola-Gomes, Thales Nicolau; Pires, Washington; Guimarães, Juliana Bohnen; Hudson, Alexandre Sérvulo Ribeiro; Kunstetter, Ana Cançado; Fonseca, Cletiana Gonçalves; Drummond, Lucas Rios; Damasceno, William Coutinho; Teixeira-Coelho, Francisco

    2015-01-01

    Rats are used worldwide in experiments that aim to investigate the physiological responses induced by a physical exercise session. Changes in body temperature regulation, which may affect both the performance and the health of exercising rats, are evident among these physiological responses. Despite the universal use of rats in biomedical research involving exercise, investigators often overlook important methodological issues that hamper the accurate measurement of clear thermoregulatory responses. Moreover, much debate exists regarding whether the outcome of rat experiments can be extrapolated to human physiology, including thermal physiology. Herein, we described the impact of different exercise intensities, durations and protocols and environmental conditions on running-induced thermoregulatory changes. We focused on treadmill running because this type of exercise allows for precise control of the exercise intensity and the measurement of autonomic thermoeffectors associated with heat production and loss. Some methodological issues regarding rat experiments, such as the sites for body temperature measurements and the time of day at which experiments are performed, were also discussed. In addition, we analyzed the influence of a high body surface area-to-mass ratio and limited evaporative cooling on the exercise-induced thermoregulatory responses of running rats and then compared these responses in rats to those observed in humans. Collectively, the data presented in this review represent a reference source for investigators interested in studying exercise thermoregulation in rats. In addition, the present data indicate that the thermoregulatory responses of exercising rats can be extrapolated, with some important limitations, to human thermal physiology.

  10. Atmospheric aerosol in an urban area: Comparison of measurement instruments and methodologies and pulmonary deposition assessment; Aerosol atmosferico in area urbana: confronto di strumenti e metodologie di misura e valutazione di deposizione polmonare

    Energy Technology Data Exchange (ETDEWEB)

    Berico, M; Luciani, A; Formignani, M [ENEA, Centro Ricerche Bologna (Italy). Dip. Ambiente

    1996-07-01

    In March 1995 a measurement campaign of atmospheric aerosol in the Bologna urban area (Italy) was carried out. A transportable laboratory, set up by the ENEA (Italian National Agency for New Technologies, Energy and the Environment) Environmental Department (Bologna), was utilized, with instruments for the measurement of atmospheric aerosol and meteorological parameters. The aim of this campaign was twofold: to characterize the aerosol in an urban area and to compare different measurement instruments and methodologies. Mass concentration measurements, evaluated over a 23-hour period with a total filter, a PM10 dichotomous sampler and a low-pressure impactor (LPI Berner), provided information about total suspended particles, the respirable fraction and the granulometric parameters of the aerosol, respectively. Eight meteorological parameters, the number concentration of the submicron fraction of the aerosol and the mass concentration of the micron fraction were measured continuously. In addition, during a daytime period, several number size distributions of the atmospheric aerosol were estimated by means of a diffusion battery system. Results related to the different measurement methodologies and the granulometric characteristics of the aerosol are presented here. The pulmonary deposition of atmospheric aerosol is finally calculated, using the size distributions provided by the LPI Berner and the ICRP 66 human respiratory tract model.

  11. Evaluation methodology for fixed-site physical protection systems

    International Nuclear Information System (INIS)

    Bennett, H.A.; Olascoaga, M.T.

    1980-01-01

    A system performance evaluation methodology has been developed to aid the Nuclear Regulatory Commission (NRC) in the implementation of new regulations designed to upgrade the physical protection of nuclear fuel cycle facilities. The evaluation methodology, called Safeguards Upgrade Rule Evaluation (SURE), provides a means of explicitly incorporating measures for highly important and often difficult to quantify performance factors, e.g., installation, maintenance, training and proficiency levels, compatibility of components in subsystems, etc. This is achieved by aggregating responses to component and system questionnaires through successive levels of a functional hierarchy developed for each primary performance capability specified in the regulations, 10 CFR 73.45. An overall measure of performance for each capability is the result of this aggregation process. This paper provides a description of SURE

  12. Methodology for astronaut reconditioning research.

    Science.gov (United States)

    Beard, David J; Cook, Jonathan A

    2017-01-01

    Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identification of what aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to the terrestrial equivalent on small populations, such as rare diseases and various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure feasibility of research protocols. There is room for creative and hybrid methodology but careful systematic observation is likely to be more achievable and fruitful than complex trial based comparisons. Multi-space agency collaboration will be critical to pool data from small groups of astronauts with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherent small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Using Indigenist and Indigenous methodologies to connect to deeper understandings of Aboriginal and Torres Strait Islander peoples' quality of life.

    Science.gov (United States)

    Kite, Elaine; Davy, Carol

    2015-12-01

    The lack of a common description makes measuring the concept of quality of life (QoL) a challenge. Whether QoL incorporates broader social features or is attributed to health conditions, the diverse range of descriptions applied by various disciplines has resulted in a concept that is multidimensional and vague. The variety of theoretical conceptualisations of QoL confounds and confuses even the most astute. Measuring QoL in Aboriginal and Torres Strait Islander populations is even more challenging. Instruments commonly developed and used to measure QoL are often derived from research methodologies shaped by Western cultural perspectives. Often they are simply translated for use among culturally and linguistically diverse Aboriginal and Torres Strait Islander peoples. This has implications for Aboriginal and Torres Strait Islander populations whose perceptions of health are derived from within their specific cultures, value systems and ways of knowing and being. Interconnections and relationships between themselves, their communities, their environment and the natural and spiritual worlds are complex. The way in which their QoL is currently measured indicates that very little attention is given to the diversity of Aboriginal and Torres Strait Islander peoples' beliefs or the ways in which those beliefs shape or give structure and meaning to their health and their lives. The use of Indigenist or Indigenous methodologies in defining what Aboriginal and Torres Strait Islander peoples believe gives quality to their lives is imperative. These methodologies have the potential to increase the congruency between their perceptions of QoL and instruments to measure it.

  14. Thermal decomposition of hydroxylamine: Isoperibolic calorimetric measurements at different conditions

    International Nuclear Information System (INIS)

    Adamopoulou, Theodora; Papadaki, Maria I.; Kounalakis, Manolis; Vazquez-Carreto, Victor; Pineda-Solano, Alba; Wang, Qingsheng; Mannan, M.Sam

    2013-01-01

    Highlights: • Hydroxylamine thermal decomposition enthalpy was measured using larger quantities. • The rate at which heat is evolved depends on hydroxylamine concentration. • Decomposition heat is strongly affected by the conditions and the selected baseline. • The need for enthalpy measurements using a larger reactant mass is pinpointed. • Hydroxylamine decomposition in the presence of argon is much faster than in air. -- Abstract: Thermal decomposition of hydroxylamine, NH₂OH, was responsible for two serious accidents. However, its reactive behavior and the synergy of factors affecting its decomposition are not well understood. In this work, the global enthalpy of hydroxylamine decomposition has been measured in the temperature range of 130–150 °C employing isoperibolic calorimetry. Measurements were performed in a metal reactor, employing 30–80 ml solutions containing 1.4–20 g of pure hydroxylamine (2.8–40 g of the supplied reagent). The measurements showed that increased concentration or temperature results in higher global enthalpies of reaction per unit mass of reactant. At 150 °C, specific enthalpies as high as 8 kJ per gram of hydroxylamine were measured, although in general they were in the range of 3–5 kJ g⁻¹. The accurate measurement of the generated heat was proven to be a cumbersome task as (a) it is difficult to identify the end of decomposition, which after a fast initial stage, proceeds very slowly, especially at lower temperatures and (b) the environment of gases affects the reaction rate

  15. Thermal decomposition of hydroxylamine: Isoperibolic calorimetric measurements at different conditions

    Energy Technology Data Exchange (ETDEWEB)

    Adamopoulou, Theodora [Department of Environmental and Natural Resources Management, University of Western Greece (formerly of University of Ioannina), Seferi 2, Agrinio GR30100 (Greece); Papadaki, Maria I., E-mail: mpapadak@cc.uoi.gr [Department of Environmental and Natural Resources Management, University of Western Greece (formerly of University of Ioannina), Seferi 2, Agrinio GR30100 (Greece); Kounalakis, Manolis [Department of Environmental and Natural Resources Management, University of Western Greece (formerly of University of Ioannina), Seferi 2, Agrinio GR30100 (Greece); Vazquez-Carreto, Victor; Pineda-Solano, Alba [Mary Kay O’Connor Process Safety Center, Artie McFerrin Department of Chemical Engineering, Texas A and M University, College Station, TX 77843 (United States); Wang, Qingsheng [Department of Fire Protection and Safety and Department of Chemical Engineering, Oklahoma State University, 494 Cordell South, Stillwater, OK 74078 (United States); Mannan, M.Sam [Mary Kay O’Connor Process Safety Center, Artie McFerrin Department of Chemical Engineering, Texas A and M University, College Station, TX 77843 (United States)

    2013-06-15

    Highlights: • Hydroxylamine thermal decomposition enthalpy was measured using larger quantities. • The rate at which heat is evolved depends on hydroxylamine concentration. • Decomposition heat is strongly affected by the conditions and the selected baseline. • The need for enthalpy measurements using a larger reactant mass is pinpointed. • Hydroxylamine decomposition in the presence of argon is much faster than in air. -- Abstract: Thermal decomposition of hydroxylamine, NH₂OH, was responsible for two serious accidents. However, its reactive behavior and the synergy of factors affecting its decomposition are not well understood. In this work, the global enthalpy of hydroxylamine decomposition has been measured in the temperature range of 130–150 °C employing isoperibolic calorimetry. Measurements were performed in a metal reactor, employing 30–80 ml solutions containing 1.4–20 g of pure hydroxylamine (2.8–40 g of the supplied reagent). The measurements showed that increased concentration or temperature results in higher global enthalpies of reaction per unit mass of reactant. At 150 °C, specific enthalpies as high as 8 kJ per gram of hydroxylamine were measured, although in general they were in the range of 3–5 kJ g⁻¹. The accurate measurement of the generated heat was proven to be a cumbersome task as (a) it is difficult to identify the end of decomposition, which after a fast initial stage, proceeds very slowly, especially at lower temperatures and (b) the environment of gases affects the reaction rate.

  16. Issues Related to Measuring and Interpreting Objectively Measured Sedentary Behavior Data

    Science.gov (United States)

    Janssen, Xanne; Cliff, Dylan P.

    2015-01-01

    The use of objective measures of sedentary behavior has increased over the past decade; however, as is the case for objectively measured physical activity, methodological decisions before and after data collection are likely to influence the outcomes. The aim of this article is to review the evidence on different methodological decisions made by…

  17. An experimental methodology to quantify the spray cooling event at intermittent spray impact

    International Nuclear Information System (INIS)

    Moreira, Antonio L.N.; Carvalho, Joao; Panao, Miguel R.O.

    2007-01-01

    The present paper describes an experimental methodology devised to study spray cooling with multiple-intermittent sprays as those found in fuel injection systems of spark-ignition and diesel engines, or in dermatologic surgery applications. The spray characteristics and the surface thermal behaviour are measured by combining a two-component phase-Doppler anemometer with fast response surface thermocouples. The hardware allows simultaneous acquisition of Doppler and thermocouple signals which are processed in Matlab to estimate the time-varying heat flux and fluid-dynamic characteristics of the spray during impact. The time resolution of the acquisition system is limited by the data rate of validation of the phase-Doppler anemometer, but it has been shown to be accurate for the characterization of spray-cooling processes with short spurt durations for which the transient period of spray injection plays an important role. The measurements are processed in terms of the instantaneous heat fluxes, from which phase-average values of the boiling curves are obtained. Two of the characteristic parameters used in the thermal analysis of stationary spray cooling events, the critical heat flux (CHF) and the Leidenfrost phenomenon, are then inferred in terms of the operating conditions of the multiple-intermittent injections, such as the frequency, duration and pressure of injection. An integral method is suggested to describe the overall process of heat transfer, which accounts for the fluid-dynamic heterogeneities induced by multiple and successive droplet interactions within the area of spray impact. The method considers overall boiling curves dependent on the injection conditions and provides an empirical tool to characterize the heat transfer processes on the impact of multiple-intermittent sprays. The methodology is tested in a preliminary study of the effect of injection conditions on the heat removed by a fuel spray striking the back surface of the intake valve, as in spark-ignition engines.

  18. Adsorptive removal of residual catalyst from palm biodiesel: Application of response surface methodology

    Directory of Open Access Journals (Sweden)

    Mjalli Sabri Farouq

    2012-01-01

    In this work, the residual potassium hydroxide catalyst was removed from palm oil-based methyl esters using an adsorption technique. The produced biodiesel was initially purified through a water washing process. To produce a biodiesel of better quality that also meets the standard specifications (EN 14214 and ASTM D6751), batch adsorption on palm shell activated carbon was used for further catalyst removal. The Central Composite Design (CCD) of the Response Surface Methodology (RSM) was used to study the influence of adsorbent amount, time and temperature on the adsorption of potassium species. The maximum catalyst removal was achieved at 40°C using 0.9 g activated carbon and a 20 h adsorption time. The results from the Response Surface Methodology are in good agreement with the measured values. The absolute error in prediction at the optimum condition was 3.7%, which is reasonably accurate. This study proves that adsorption post-treatment techniques can be successfully employed to improve the quality of biodiesel fuel for its effective use in diesel engines and to minimize the usage of water.
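    The CCD/RSM step described above amounts to fitting a second-order polynomial to the measured response and locating its stationary point. A minimal sketch of that fitting step follows; the synthetic data, factor names and numbers are invented for illustration, not taken from the paper:

```python
import numpy as np

# Synthetic "catalyst removal" response over two coded factors
# (e.g. adsorbent amount x1, time x2); the true surface is known here,
# so the fitted optimum can be checked against it.
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)
x2 = rng.uniform(-1, 1, 30)
y = 90 - 5 * (x1 - 0.4) ** 2 - 3 * (x2 - 0.2) ** 2 + rng.normal(0, 0.05, 30)

# Second-order RSM model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted quadratic (candidate optimum):
# solve the linear system grad(y) = 0.
A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
opt = np.linalg.solve(A, -b[1:3])
print(np.round(opt, 2))  # near the true optimum (0.4, 0.2)
```

In a real CCD the design points are fixed (factorial, axial and center runs) rather than random, but the least-squares fit and stationary-point analysis are the same.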

  19. METHODOLOGY OF PROFESSIONAL PEDAGOGICAL EDUCATION: THEORY AND PRACTICE (theoretical and methodological foundations of vocational teacher education

    Directory of Open Access Journals (Sweden)

    Evgeny M. Dorozhkin

    2014-01-01

    methodology taking into consideration the target orientation, principles and approaches to the organization and its’ methods of scientific and educational activities implementation. The qualification structure formation of the teachers’ vocational training and providing advance principles of education are considered to be the most important conditions for the development of vocational teacher education. Scientific novelty. The research demonstrates creating the project of further vocational teacher education development in the post-industrial society. The pedagogical innovations transforming research findings into educational practice are considered to be the main tool of integration methodology means. Practical significance. The research findings highlight the proposed reforms for further teachers training system development of vocational institutes, which are in need of drastic restructuring. In the final part of the article the authors recommend some specific issues that can be discussed at the methodological workshop. 

  20. Energy efficiency index to artificially conditioned buildings; Indice de eficiencia energetica para edificios climatizados artificialmente

    Energy Technology Data Exchange (ETDEWEB)

    Jota, Patricia Romeiro da Silva; Santos, Carla da Silva; Costa, Kelly Luciene C. [Centro Federal de Educacao Tecnologica de Minas Gerais (CEMIG/CEFET), Belo Horizonte, MG (Brazil). Centro de Pesquisa em Energia Inteligente

    2010-07-01

    Air-conditioned buildings have been growing in number and are responsible for a significant portion of the energy used worldwide. Building energy use can be assessed through energy performance indices such as the specific energy consumption (EC), in which consumption is normalized by the factors that affect energy use in order to explain variations in consumption. In this paper, we present a methodology for obtaining a specific consumption index that takes into account one of the factors that most affects energy use in these buildings: the external temperature. The study is based on an analysis of air-conditioning consumption as a function of temperature. Through this analysis we obtain a function that normalizes energy use with respect to the outdoor temperature. This methodology was tested in previous work on real buildings without stratified energy metering; in this work, a case study of a building with stratified energy measurement is presented. The proposed index is the ratio between the air-conditioning energy consumption and its temperature correction through the function K(T). The index was shown to eliminate the effect of temperature and thus to allow the evolution of the specific consumption to be evaluated over the months analyzed. (author)
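    The temperature-corrected index described above can be sketched in a few lines. The linear form of K(T) below is a placeholder assumption (the paper derives K(T) from measured consumption-versus-temperature data), and all numbers are invented:

```python
# Temperature-corrected specific consumption, in the spirit of the
# abstract: index = E_hvac / K(T_ext). The linear K(T) is a placeholder
# assumption, normalised to 1.0 at a base outdoor temperature.
def k_of_t(t_ext, t_base=18.0, slope=0.12):
    # Relative consumption expected at outdoor temperature t_ext.
    return max(1.0 + slope * (t_ext - t_base), 0.1)

def corrected_index(e_hvac_kwh, t_ext):
    return e_hvac_kwh / k_of_t(t_ext)

# Two months with different weather but the same underlying efficiency
# yield the same corrected index:
print(round(corrected_index(1000.0, 18.0), 3))  # 1000.0
print(round(corrected_index(1360.0, 21.0), 3))  # 1000.0
```

A month that consumes more only because it was hotter thus keeps the same index, while a genuine efficiency change shows up as a drift in the index over the months.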

  1. A Statistical Methodology for Determination of Safety Systems Actuation Setpoints Based on Extreme Value Statistics

    Directory of Open Access Journals (Sweden)

    D. R. Novog

    2008-01-01

    This paper provides a novel and robust methodology for determination of nuclear reactor trip setpoints which accounts for uncertainties in input parameters and models, as well as accounting for the variations in operating states that periodically occur. Further, it demonstrates that in performing best estimate and uncertainty calculations, it is critical to consider the impact of all fuel channels and instrumentation in the integration of these uncertainties in setpoint determination. This methodology is based on the concept of a true trip setpoint, which is the reactor setpoint that would be required in an ideal situation where all key inputs and plant responses were known, such that during the accident sequence a reactor shutdown will occur which just prevents the acceptance criteria from being exceeded. Since this true value cannot be established, the uncertainties in plant simulations and plant measurements as well as operational variations which lead to time changes in the true value of initial conditions must be considered. This paper presents the general concept used to determine the actuation setpoints considering the uncertainties and changes in initial conditions, and allowing for safety systems instrumentation redundancy. The results demonstrate unique statistical behavior with respect to both fuel and instrumentation uncertainties, which has not previously been investigated.
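    The statistical idea behind such a setpoint determination — propagate the uncertainties around the unknowable "true trip setpoint" and pick a conservative quantile — can be sketched with a toy Monte Carlo. All distributions, numbers and the quantile criterion below are illustrative assumptions, not the paper's actual models:

```python
import numpy as np

# Monte Carlo sketch of uncertainty-based setpoint selection: for each
# trial, sample uncertain contributions, form the resulting "true trip
# setpoint", then take a conservative quantile over all trials.
rng = np.random.default_rng(42)
n = 100_000

measurement_bias = rng.normal(0.0, 0.5, n)   # instrumentation uncertainty
model_error = rng.normal(0.0, 1.0, n)        # simulation/model uncertainty
nominal_true_setpoint = 110.0                # ideal trip level, % full power

true_setpoint = nominal_true_setpoint - model_error - measurement_bias

# Choose the actuation setpoint so that ~95% of the sampled "true"
# setpoints lie above it, i.e. tripping at this level protects the
# acceptance criterion in ~95% of sampled uncertainty combinations.
setpoint = np.quantile(true_setpoint, 0.05)
print(round(setpoint, 1))
```

A real analysis would integrate over all fuel channels and redundant instrument channels, as the abstract stresses, rather than a single lumped uncertainty per trial.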

  2. Chronopolitics: methodological aspects of public policy research

    Directory of Open Access Journals (Sweden)

    O. A. Zubchyk

    2016-08-01

    Chronopolitics as a methodology examines the role of the state and the place of the political entity within the temporal conditions of political and administrative decision-making. These issues are discussed in the context of the chronopolitical study of historical forms of political organization. The study shows that chronopolitics functionally and structurally extends the conceptual and categorical apparatus of political science and the science of public administration.

  3. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Infrared spectroscopy of pollen identifies plant species and genus as well as environmental conditions.

    Directory of Open Access Journals (Sweden)

    Boris Zimmermann

    BACKGROUND: It is imperative to have reliable and timely methodologies for analysis and monitoring of seed plants in order to determine climate-related plant processes. Moreover, the impact of environment on plant fitness is predominantly based on studies of female functions, while the contribution of male gametophytes is mostly ignored due to missing data on pollen quality. We explored the use of infrared spectroscopy of pollen for an inexpensive and rapid characterization of plants. METHODOLOGY: The study was based on measurement of pollen samples by two Fourier transform infrared techniques: single reflectance attenuated total reflectance and transmission measurement of sample pellets. The experimental set, with a total of 813 samples, included five pollination seasons and 300 different plant species belonging to all principal spermatophyte clades (conifers, monocotyledons, eudicots, and magnoliids). RESULTS: The spectroscopy-based methodology enables detection of phylogenetic variations, including the separation of confamiliar and congeneric species. Furthermore, the methodology enables measurement of phenotypic plasticity by the detection of inter-annual variations within the populations. The spectral differences related to environment and taxonomy are interpreted biochemically, specifically variations of pollen lipids, proteins, carbohydrates, and sporopollenins. The study shows large variations in the absolute content of nutrients for congeneric species pollinating under the same environmental conditions. Moreover, a clear correlation between the carbohydrate-to-protein ratio and the pollination strategy has been detected. An infrared spectral database covering the biochemical variation among a range of species, climates and biogeographies will significantly improve comprehension of plant-environment interactions, including the impact of global climate change on plant communities.

  5. A methodology for the optimization of the estimation of tritium in urine by liquid scintillation counting

    International Nuclear Information System (INIS)

    Joseph, S.; Kramer, G.H.

    1982-10-01

    A method has been designed to optimize liquid scintillation (LS) urinalysis with respect to sensitivity and cost. Three related factors, quench, sample composition and counting efficiency, were measured simultaneously and the results plotted in three dimensions to determine the optimum conditions for urinalysis. Picric acid was used to simulate quenching. Subsequent urinalysis experiments showed that quenching by picric acid was analogous to urine quenching. The optimization methodology was applied to ten commercial LS cocktails and a wide divergence in results was obtained. This method can also be used to optimize minimum detectable activities (MDA) but the results show that there is no fixed sample composition that can be used for all the various types of urine samples; however, it is possible to achieve general improvements of at least a factor of 2 in the MDA for Scintiverse (the only one tested for this particular application of the methodology)
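    Optimizations of this kind are commonly driven by a figure of merit such as E²/B (counting efficiency squared over background), since the minimum detectable activity concentration scales roughly as √B/(E·V). A toy grid search over sample volume follows; the efficiency and background models and all numbers are invented placeholders, not data from the paper:

```python
# Toy optimisation of a liquid-scintillation counting setup via a
# figure of merit: MDA ~ sqrt(B) / (E * V), so we maximise (E*V)^2 / B.
# The quench/efficiency and background models are invented.
def efficiency(sample_ml, cocktail_ml=15.0):
    # More sample -> more quench -> lower counting efficiency.
    return 0.45 * cocktail_ml / (cocktail_ml + 2.0 * sample_ml)

def background_cpm(sample_ml):
    # Background grows slowly with sample volume.
    return 20.0 + 0.5 * sample_ml

# Scan sample volumes from 0.1 to 40.0 ml in 0.1 ml steps.
best = max(
    (s / 10.0 for s in range(1, 401)),
    key=lambda v: (efficiency(v) * v) ** 2 / background_cpm(v),
)
print(round(best, 1))
```

The optimum balances the extra signal from a larger sample against the efficiency lost to quenching, which mirrors the paper's finding that no single composition is optimal for all urine samples (each sample quenches differently).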

  6. Measurements of void fraction in a heated tube in the rewetting conditions

    International Nuclear Information System (INIS)

    Freitas, R.L.

    1983-01-01

    The methods of void fraction measurement by transmission and scattering of cold, thermal and epithermal neutrons were studied with cylindrical aluminium pieces simulating the vapour phase. A wide set of void fractions found in the wet zone was examined, and particular attention was given to the sensitivity of the method, mainly at high void fractions. Several aspects of the measurement techniques were analyzed, such as the effect of the radial phase distribution, the neutron energy, the water temperature and the effect of the axial void gradient. The thermal neutron scattering measurement technique was used to measure the axial profile of void fraction in a steady two-phase flow in which the pressure, mass velocity and heat flux are representative of rewetting conditions. Experimental results are presented and compared with different void fraction models. (E.G.) [pt
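    Transmission-based void measurement rests on exponential beam attenuation by the liquid phase. In a simple one-beam textbook model (not necessarily the report's exact formulation), the void fraction follows directly from the count ratio:

```python
import math

# One-beam attenuation model for transmission void-fraction measurement
# (illustrative textbook model): N = N0 * exp(-sigma_w * (1 - alpha) * t),
# where sigma_w is the macroscopic cross-section of water, t the channel
# thickness and alpha the void fraction. Inverting for alpha:
def void_fraction(n_counts, n0_counts, sigma_w_per_cm, t_cm):
    return 1.0 - math.log(n0_counts / n_counts) / (sigma_w_per_cm * t_cm)

# Round trip: a channel with alpha = 0.7 reproduces its own counts.
sigma_w, t = 3.45, 2.0  # ~cm^-1 for thermal neutrons in water; cm
n0 = 1e6                # empty-channel (reference) counts
n = n0 * math.exp(-sigma_w * (1 - 0.7) * t)
print(round(void_fraction(n, n0, sigma_w, t), 3))  # 0.7
```

The sensitivity issue the abstract mentions is visible here: at high void fraction the exponent is small, so small count-rate errors translate into large errors in the recovered alpha.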

  7. Environmental sample banking-research and methodology

    International Nuclear Information System (INIS)

    Becker, D.A.

    1976-01-01

    The National Bureau of Standards (NBS), in cooperation with the Environmental Protection Agency and the National Science Foundation, is engaged in a research program establishing methodology for environmental sample banking. This program is aimed at evaluating the feasibility of a National Environmental Specimen Bank (NESB). The capability for retrospective chemical analyses to evaluate changes in our environment would provide useful information, much of which could not be obtained using data from previously analyzed samples. However, to assure validity for these stored samples, they must be sampled, processed and stored under rigorously evaluated, controlled and documented conditions. The program currently under way in the NBS Analytical Chemistry Division has three main components. The first is an extensive survey of the available literature concerning problems of contamination, losses and storage. The components of interest include trace elements, pesticides, other trace organics (PCBs, plasticizers, etc.), radionuclides and microbiological species. The second component is an experimental evaluation of contamination and losses during sampling and sample handling. Of particular interest here is research into container cleaning methodology for trace elements, with respect to adsorption, desorption, leaching and partial dissolution by various sample matrices. The third component of this program is an evaluation of existing methodology for long-term sample storage

  8. Effect of boundary conditions on measured water retention behavior within soils

    Science.gov (United States)

    Galindo-torres, S.; Scheuermann, A.; Pedroso, D.; Li, L.

    2013-12-01

    The Soil Water Characteristic Curve (SWCC) is a practical representation of the behavior of soil water, relating the suction (the difference between the air and water pressures) to the moisture content (water saturation). The SWCC is characterized by a hysteresis loop, which is thought to be unique in that any drainage-imbibition cycle lies within a main hysteresis loop limited by two different curves for drainage and imbibition. This 'uniqueness' is the main argument for considering the SWCC a material-intrinsic feature that characterizes the pore structure and its interaction with fluids. Models have been developed with the SWCC as input data to describe the evolution of the water saturation and the suction within soils; one example is the widely used Richards equation [1]. In this work we present a series of numerical simulations to evaluate the 'unique' nature of the SWCC. The simulations involve the use of the Lattice Boltzmann Method (LBM) [2] within a regular soil, modelling the flow behavior of two immiscible fluids: wetting and non-wetting. The soil is packed within a cubic domain to resemble the experimental setups commonly used for measuring the SWCC [3]. The boundary conditions ensure that the non-wetting phase enters through one cubic face and the wetting phase enters through the opposite face, with no-flow boundary conditions on the remaining 4 cubic faces. The known features of the SWCC are inspected, including the presence of the common limit curves for different cycles with varying suction limits. For this stage of simulations, the SWCC is indeed unique. Later, different boundary conditions are applied, with the two fluids each injected from 3 opposing faces into the porous medium. The effect of this boundary condition change is a net flow direction different from that in the previous case. A striking result is observed when both SWCCs are compared and found to be noticeably different. Further analysis is
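The retention behaviour discussed above is often summarized analytically. As a minimal illustration (the widely used van Genuchten form, not the LBM approach of the study, with all parameter values below chosen purely for illustration), hysteresis is sometimes approximated by giving the drainage and imbibition branches different `alpha` parameters:

```python
def van_genuchten_saturation(suction, alpha, n, s_res=0.05, s_sat=1.0):
    """Water saturation at a given suction (van Genuchten retention model).
    alpha [1/kPa] and n [-] are curve-fitting parameters."""
    if suction <= 0.0:
        return s_sat
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * suction) ** n) ** (-m)  # effective saturation
    return s_res + (s_sat - s_res) * se

# A common rough approximation: imbibition alpha ~ twice the drainage alpha,
# so at equal suction the drainage branch retains more water.
for suction in (1.0, 10.0, 100.0):
    s_drain = van_genuchten_saturation(suction, alpha=0.05, n=2.0)
    s_imb = van_genuchten_saturation(suction, alpha=0.10, n=2.0)
    print(f"suction {suction:6.1f} kPa: drainage S={s_drain:.3f}, imbibition S={s_imb:.3f}")
```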

  9. Comparison of ventilation measurement techniques in real conditions

    International Nuclear Information System (INIS)

    Jilek, K.; Tomasek, L.

    2001-01-01

    Ventilation and radon entry rate are the only two quantities that influence indoor radon behaviour. In order to investigate the effects of ventilation and radon entry rate on indoor radon behaviour separately, the Institute was equipped with a continuous carbon monoxide (CO) monitor; carbon monoxide serves as a tracer gas for the determination of the air exchange rate. Using a continuous radon monitor and the continuous CO monitor at the same time makes it possible to measure the radon entry rate and the air exchange rate separately. The lecture summarizes the results of a comparison of the following three basic methods, performed in real living conditions, for determining the air exchange rate with 222Rn and CO as tracer gases: the constant decay method; the constant tracer method; and the steady rate of tracer injection. (authors)
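The air-exchange methods named above reduce to simple mass-balance formulas. A minimal sketch (illustrative numbers only, not the Institute's data) of the tracer-decay and constant-injection estimates:

```python
import math

def ach_decay(c0, ct, hours):
    """Air changes per hour from tracer-gas decay: C(t) = C0 * exp(-N t)."""
    return math.log(c0 / ct) / hours

def ach_constant_injection(q_m3_per_h, volume_m3, c_ss):
    """Air changes per hour from a steady tracer injection rate q and the
    steady-state concentration c_ss (volume fraction) it produces."""
    return q_m3_per_h / (volume_m3 * c_ss)

# Example: CO concentration decays from 8 ppm to 2 ppm in 4 h
n_decay = ach_decay(8.0, 2.0, 4.0)                 # ~0.347 1/h
# Example: 0.01 m3/h of tracer into a 50 m3 room, steady state 40 ppm
n_inj = ach_constant_injection(0.01, 50.0, 40e-6)  # ~5.0 1/h
print(f"decay method: {n_decay:.3f} 1/h, constant injection: {n_inj:.2f} 1/h")
```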

  10. Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology

    International Nuclear Information System (INIS)

    Knezevic, J.; Odoom, E.R.

    2001-01-01

    A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices using the fuzzy Lambda-Tau method. Fuzzy set theory is used to represent the failure rate and repair time instead of the classical (crisp) set theory because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because, unlike the fault tree methodology, they allow efficient simultaneous generation of minimal cut and path sets
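As a toy illustration of the fuzzy Lambda-Tau idea (not the authors' full Petri-net machinery), failure rates can be carried as triangular fuzzy numbers through a logic gate; the component values below are hypothetical:

```python
# A triangular fuzzy number is a tuple (low, mid, high).
def tfn_add(x, y):
    return tuple(a + b for a, b in zip(x, y))

def or_gate_lambda(lams):
    """OR gate in Lambda-Tau: the system failure rate is the sum of the
    component failure rates, computed here with fuzzy (triangular) numbers."""
    out = (0.0, 0.0, 0.0)
    for lam in lams:
        out = tfn_add(out, lam)
    return out

lam1 = (0.8e-3, 1.0e-3, 1.2e-3)  # component failure rates per hour,
lam2 = (1.5e-3, 2.0e-3, 2.5e-3)  # expressed as triangular fuzzy numbers
print(or_gate_lambda([lam1, lam2]))  # ~ (0.0023, 0.0030, 0.0037)
```

The spread of the resulting triangle carries the experts' imprecision through to the system-level index, which is the point of using fuzzy rather than crisp failure rates.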

  11. The Efficiency of Repressive Anti-Corruption Measures in Conditions of High-Level Corruption

    Directory of Open Access Journals (Sweden)

    Abramov Fedir V.

    2017-12-01

    Full Text Available The article is aimed at determining the efficiency of repressive anti-corruption measures in conditions of high-level corruption. It is shown that the formal rules regulating the use of repressive methods of countering corruption are characterized by a significant level of target inefficiency. Because they ignore the causes of both the occurrence and the spread of corruption – the inefficiency of the current formal rules – repressive anti-corruption measures are fundamentally incapable of achieving a significant reduction in the level of corruption. It is demonstrated that, in addition to their significant target inefficiency, repressive anti-corruption methods can potentially increase the level of corruption through supervisory officials abusing their official duties and through the spread of internal corruption within anti-corruption structures. The potential threats that uncontrolled anti-corruption structures pose to other controlling organizations are also considered. It is shown that in conditions of high-level corruption repressive anti-corruption measures can lead to an expanding imitation of anti-corruption activity.

  12. Evolving inspection technologies for reliable condition assessment of components and plants

    International Nuclear Information System (INIS)

    Baldev Raj

    1994-01-01

    Condition assessment of components and plants is carried out regularly in many industries, and the methodologies adopted are continuously being refined. However, each of these methodologies is applied in isolation, without realizing the synergistic advantage to be derived from a global approach to condition assessment. Developments in a variety of fields that have a definite bearing on the reliability of condition assessment are not applied together (or even recognized as applicable together). The possible impact of evolving technologies in enhancing the efficiency of condition assessment of components and plants is discussed. (author). 11 refs

  13. The Metal-Halide Lamp Under Varying Gravity Conditions Measured by Emission and Laser Absorption Spectroscopy

    Science.gov (United States)

    Flikweert, A. J.; Nimalasuriya, T.; Kroesen, G. M. W.; Haverlag, M.; Stoffels, W. W.

    2009-11-01

    Diffusive and convective processes in the metal-halide lamp cause an unwanted axial colour segregation. Convection is induced by gravity. To understand the flow phenomena in the arc discharge, the lamp has been investigated under normal laboratory conditions, micro-gravity (ISS and parabolic flights) and hyper-gravity (parabolic flights, 2 g; centrifuge, 1 g-10 g). The measurement techniques are webcam imaging, emission spectroscopy and laser absorption spectroscopy. This paper aims to give an overview of the effect of different artificial gravity conditions on the lamp and compares the results from the three measurement techniques.

  14. Assessment of patient empowerment--a systematic review of measures.

    Directory of Open Access Journals (Sweden)

    Paul J Barr

    Full Text Available Patient empowerment has gained considerable importance, but uncertainty remains about the best way to define and measure it. The validity of empirical findings depends on the quality of the measures used. This systematic review aims to provide an overview of studies assessing the psychometric properties of questionnaires purporting to capture patient empowerment, to evaluate the methodological quality of these studies and to assess the psychometric properties of the measures identified. Electronic searches in five databases were combined with reference tracking of included articles. Peer-reviewed articles reporting psychometric testing of empowerment measures for adult patients in French, German, English, Portuguese and Spanish were included. Study characteristics, constructs operationalised and psychometric properties were extracted. The quality of study design, methods and reporting was assessed using the COSMIN checklist. The quality of psychometric properties was assessed using Terwee's 2007 criteria. 30 studies on 19 measures were included. Six measures are generic, while 13 were developed for a specific condition (N=4) or specialty (N=9). Most studies tested measures in English (N=17) or Swedish (N=6). Sample sizes of included studies varied from N=35 to N=8261. A range of patient empowerment constructs was operationalised in the included measures. These were classified into four domains: patient states, experiences and capacities; patient actions and behaviours; patient self-determination within the healthcare relationship; and patient skills development. Quality assessment revealed several flaws in methodological study quality, with COSMIN scores mainly fair or poor. The overall quality of psychometric properties of included measures was intermediate to positive. Certain psychometric properties were not tested for most measures. Findings provide a basis from which to develop consensus on a core set of patient empowerment constructs and for further work to develop a

  15. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application to lending operations in the Republic of Serbia. With a developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, whose purpose is to minimize and manage credit risk. Next, the processing of the credit application is described, including the procedure for analyzing financial statements in order to gain insight into the borrower's creditworthiness. The second part of the paper presents the theoretical and methodological framework as applied in a concrete company. The third part presents models that banks should use to protect themselves against risk exposure, i.e. to reduce losses on lending operations in our country and to adjust to market conditions in an optimal way.

  16. Opening up openness: a theoretical sort following critical incidents methodology and a meta-analytic investigation of the trait family measures.

    Science.gov (United States)

    Connelly, Brian S; Ones, Deniz S; Davies, Stacy E; Birkland, Adib

    2014-01-01

    Existing taxonomies of Openness's facet structure have produced widely divergent results, and there is limited comprehensive empirical evidence about how Openness-related scales on existing personality inventories align within the 5-factor framework. In Study 1, we used a critical incidents sorting methodology to identify 11 categories of Openness measures; in Study 2, we meta-analyzed the relationships of these categories with global markers of the Big Five traits (utilizing data from 106 samples with a total sample size of N = 35,886). Our results identified 4 true facets of Openness: aestheticism, openness to sensations, nontraditionalism, and introspection. Measures of these facets were unadulterated by variance from other Big Five traits. Many traits frequently conceptualized as facets of Openness (e.g., innovation/creativity, variety-seeking, and tolerance) emerged as trait compounds that, although related to Openness, are also dependent on other Big Five traits. We discuss how Openness should be conceptualized, measured, and studied in light of the empirically based, refined taxonomy emerging from this research.

  17. Defining and Measuring Chronic Conditions

    Centers for Disease Control (CDC) Podcasts

    This podcast is an interview with Dr. Anand Parekh, U.S. Department of Health and Human Services Deputy Assistant Secretary for Health, and Dr. Samuel Posner, Preventing Chronic Disease Editor in Chief, about the definition and burden of multiple chronic conditions in the United States.

  18. Model-based failure detection for cylindrical shells from noisy vibration measurements.

    Science.gov (United States)

    Candy, J V; Fisher, K A; Guidry, B L; Chambers, D H

    2014-12-01

    Model-based processing is a theoretically sound methodology for addressing difficult objectives in complex physical problems involving multi-channel sensor measurement systems. It incorporates analytical models of both the physical phenomenology (complex vibrating structures, noisy operating environment, etc.) and the measurement processes (sensor networks, including noise) into the processor to extract the desired information. In this paper, a model-based methodology is developed to accomplish the task of online failure monitoring of a vibrating cylindrical shell externally excited by controlled excitations. A model-based processor is formulated to monitor system performance and detect potential failure conditions. The objective of this paper is to develop a real-time, model-based monitoring scheme for online diagnostics in a representative structural vibration system, based on controlled experimental data.
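The monitoring idea, stripped to its core, is that residuals between measured data and the model prediction should stay within noise bounds, and a failure shows up as a statistically significant deviation. A minimal sketch (a simple mean-residual test on synthetic data, not the paper's full model-based processor):

```python
import math
import random

def detect_anomaly(measured, predicted, noise_std, z_threshold=3.0):
    """Flag a failure when the mean model residual exceeds a confidence bound:
    with n samples, the sample-mean residual should stay within
    z_threshold * noise_std / sqrt(n) if the model still matches the structure."""
    n = len(measured)
    mean_residual = sum(m - p for m, p in zip(measured, predicted)) / n
    return abs(mean_residual) > z_threshold * noise_std / math.sqrt(n)

random.seed(1)
predicted = [math.sin(0.1 * k) for k in range(200)]              # model response
failed = [p + 0.2 + random.gauss(0.0, 0.05) for p in predicted]  # bias = failure signature
print(detect_anomaly(predicted, predicted, 0.05))  # False (zero residuals)
print(detect_anomaly(failed, predicted, 0.05))     # True (bias outside the bound)
```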

  19. Event-related potential components as measures of aversive conditioning in humans.

    Science.gov (United States)

    Bacigalupo, Felix; Luck, Steven J

    2018-04-01

    For more than 60 years, the gold standard for assessing aversive conditioning in humans has been the skin conductance response (SCR), which arises from the activation of the peripheral nervous system. Although the SCR has been proven useful, it has some properties that impact the kinds of questions it can be used to answer. In particular, the SCR is slow, reaching a peak 4-5 s after stimulus onset, and it decreases in amplitude after a few trials (habituation). The present study asked whether the late positive potential (LPP) of the ERP waveform could be a useful complementary method for assessing aversive conditioning in humans. The SCR and LPP were measured in an aversive conditioning paradigm consisting of three blocks in which one color was paired with a loud noise (CS+) and other colors were not paired with the noise (CS-). Participants also reported the perceived likelihood of being exposed to the noise for each color. Both SCR and LPP were significantly larger on CS+ trials than on CS- trials. However, SCR decreased steeply after the first conditioning block, whereas LPP and self-reports were stable over blocks. These results indicate that the LPP can be used to assess aversive conditioning and has several useful properties: (a) it is a direct response of the central nervous system, (b) it is fast, with an onset latency of 300 ms, (c) it does not habituate over time. © 2017 Society for Psychophysiological Research.

  20. Methodology for evaluation of alternative technologies applied to nuclear fuel reprocessing

    International Nuclear Information System (INIS)

    Selvaduray, G.S.; Goldstein, M.K.; Anderson, R.N.

    1977-07-01

    An analytic methodology has been developed to compare the performance of various nuclear fuel reprocessing techniques for advanced fuel cycle applications, including low-proliferation-risk systems. The need to identify and compare processes that have the versatility to handle the variety of fuel types expected to be in use in the next century is becoming increasingly imperative. This methodology allows processes at any stage of development to be compared and the effect of changing external conditions on a process to be assessed

  1. Discriminant content validity: a quantitative methodology for assessing content of theory-based measures, with illustrative applications.

    Science.gov (United States)

    Johnston, Marie; Dixon, Diane; Hart, Jo; Glidewell, Liz; Schröder, Carin; Pollard, Beth

    2014-05-01

    In studies involving theoretical constructs, it is important that measures have good content validity and that measures are not contaminated by content from other constructs. While reliability and construct validity are routinely reported, to date there has not been a satisfactory, transparent and systematic method of assessing and reporting content validity. In this paper, we describe a methodology of discriminant content validity (DCV) and illustrate its application in three studies. Discriminant content validity involves six steps: construct definition, item selection, judge identification, judgement format, single-sample test of content validity and assessment of discriminant items. In three studies, these steps were applied to a measure of illness perceptions (IPQ-R) and to control cognitions. The IPQ-R performed well, with most items being purely related to their target construct, although the timeline and consequences subscales had minor problems. By contrast, the study of control cognitions identified problems in measuring the constructs independently. In the final study, direct-estimation response formats for theory of planned behaviour constructs were found to have DCV as good as the Likert format. The DCV method allowed quantitative assessment of each item and can therefore inform the content validity of the measures assessed. The method can be applied to assess content validity before or after collecting data in order to select appropriate items for measuring theoretical constructs. Further, the data reported for each item in Appendix S1 can be used in item or measure selection. Statement of contribution: What is already known on this subject? There are agreed methods of assessing and reporting the construct validity of measures of theoretical constructs, but not their content validity. Content validity is rarely reported in a systematic and transparent manner. What does this study add?
The paper proposes discriminant content validity (DCV), a systematic and transparent method

  2. Integrated cost-effectiveness analysis of agri-environmental measures for water quality.

    Science.gov (United States)

    Balana, Bedru B; Jackson-Blake, Leah; Martin-Ortega, Julia; Dunn, Sarah

    2015-09-15

    This paper presents an application of an integrated methodological approach for identifying cost-effective combinations of agri-environmental measures to achieve water quality targets. The approach links hydro-chemical modelling with the economic costs of mitigation measures. Its utility was explored for the River Dee catchment in North East Scotland, examining the cost-effectiveness of mitigation measures for nitrogen (N) and phosphorus (P) pollutants. In-stream nitrate concentration was modelled using the STREAM-N model and phosphorus using the INCA-P model. Both models were first run for baseline conditions and then the effectiveness of changes in land management was simulated. Costs were based on farm income foregone and on capital and operational expenditures. The cost and effect data were integrated using 'Risk Solver Platform' optimization in Excel to produce the most cost-effective combination of measures by which the target nutrient reductions could be attained at minimum economic cost. The analysis identified different combinations of measures as most cost-effective for the two pollutants. An important aspect of this paper is the integration of model-based effectiveness estimates with the economic costs of measures for cost-effectiveness analysis of land and water management options. The methodological approach developed is not limited to the two pollutants and the selected agri-environmental measures considered here; it can be adapted to the cost-effectiveness analysis of any catchment-scale environmental management options. Copyright © 2015 Elsevier Ltd. All rights reserved.
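The optimization step can be illustrated with a toy version of the cost-effects integration: given hypothetical per-measure costs and nutrient reductions (the figures below are invented, not the Dee catchment values), find the least-cost combination meeting a reduction target:

```python
from itertools import combinations

# Hypothetical mitigation measures: (name, annual cost, N reduction kg/yr)
measures = [
    ("buffer strips", 1200.0, 90.0),
    ("reduced fertiliser", 800.0, 70.0),
    ("cover crops", 1500.0, 120.0),
    ("wetland restoration", 3000.0, 200.0),
]

def cheapest_combination(measures, target_reduction):
    """Brute-force search for the least-cost set meeting the reduction target
    (fine for small measure sets; larger problems call for LP/MIP solvers)."""
    best = None
    for r in range(1, len(measures) + 1):
        for combo in combinations(measures, r):
            cost = sum(m[1] for m in combo)
            reduction = sum(m[2] for m in combo)
            if reduction >= target_reduction and (best is None or cost < best[0]):
                best = (cost, [m[0] for m in combo])
    return best

print(cheapest_combination(measures, 180.0))
# (2300.0, ['reduced fertiliser', 'cover crops'])
```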

  3. A Generalizable Methodology for Quantifying User Satisfaction

    Science.gov (United States)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-development process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction, and discuss how they can be exploited to improve user experience and optimize resource allocation.
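Survival analysis of session times can be sketched with a hand-rolled Kaplan-Meier estimator (illustrative data; the paper's models are more elaborate), where sessions still running at the end of measurement are treated as censored:

```python
def kaplan_meier(durations, observed):
    """Kaplan-Meier survival estimate over session times.
    observed[i] is False when the session was still ongoing (censored)."""
    events = sorted(set(d for d, o in zip(durations, observed) if o))
    survival, s = [], 1.0
    for t in events:
        at_risk = sum(1 for d in durations if d >= t)
        ended = sum(1 for d, o in zip(durations, observed) if d == t and o)
        s *= 1.0 - ended / at_risk
        survival.append((t, s))
    return survival

# Session lengths in minutes; False marks sessions cut off by measurement end.
durations = [5, 8, 8, 12, 20, 30]
observed = [True, True, False, True, True, False]
for t, s in kaplan_meier(durations, observed):
    print(f"S({t}) = {s:.3f}")
```

The resulting curve S(t) is the estimated probability that a session lasts longer than t, which is the quantity the satisfaction models are fitted to.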

  4. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    Fujii-e, Y.; Kozawa, Y.; Namba, C.

    1987-03-01

    Fusion systems, which are under development as future energy systems, have reached a stage at which break-even is expected to be realized in the near future. It is desirable to demonstrate that fusion systems are acceptable to society. There are three crucial viewpoints for measuring acceptability: technological feasibility, economy and safety. These three points are closely interrelated. The safety problem has become more important since the three large-scale tokamaks, JET, TFTR and JT-60, started experiments, and tritium will be introduced into some of them as fusion fuel. It is desirable to establish a methodology that resolves the safety-related issues in harmony with technological evolution. The fusion system that will lead to reactors is not yet settled. This study has the objective of developing an adequate methodology that promotes the safety design of general fusion systems and of presenting a basis for proposing R and D themes and establishing the database. A framework for the methodology, the understanding and modeling of fusion systems, the principle of ensuring safety, function-based safety analysis and the application of the methodology are discussed. As a result of this study, a methodology for the safety analysis and evaluation of fusion systems was developed. New ideas and approaches were presented in the course of the methodology development. (Kako, I.)

  5. Accurate measurement of junctional conductance between electrically coupled cells with dual whole-cell voltage-clamp under conditions of high series resistance.

    Science.gov (United States)

    Hartveit, Espen; Veruki, Margaret Lin

    2010-03-15

    Accurate measurement of the junctional conductance (G(j)) between electrically coupled cells can provide important information about the functional properties of coupling. With the development of tight-seal, whole-cell recording, it became possible to use dual, single-electrode voltage-clamp recording from pairs of small cells to measure G(j). Experiments that require reduced perturbation of the intracellular environment can be performed with high-resistance pipettes or the perforated-patch technique, but an accompanying increase in series resistance (R(s)) compromises voltage-clamp control and reduces the accuracy of G(j) measurements. Here, we present a detailed analysis of methodologies available for accurate determination of steady-state G(j) and related parameters under conditions of high R(s), using continuous or discontinuous single-electrode voltage-clamp (CSEVC or DSEVC) amplifiers to quantify the parameters of different equivalent electrical circuit model cells. Both types of amplifiers can provide accurate measurements of G(j), with errors less than 5% for a wide range of R(s) and G(j) values. However, CSEVC amplifiers need to be combined with R(s)-compensation or mathematical correction for the effects of nonzero R(s) and finite membrane resistance (R(m)). R(s)-compensation is difficult for higher values of R(s) and leads to instability that can damage the recorded cells. Mathematical correction for R(s) and R(m) yields highly accurate results, but depends on accurate estimates of R(s) throughout an experiment. DSEVC amplifiers display very accurate measurements over a larger range of R(s) values than CSEVC amplifiers and have the advantage that knowledge of R(s) is unnecessary, suggesting that they are preferable for long-duration experiments and/or recordings with high R(s). Copyright (c) 2009 Elsevier B.V. All rights reserved.
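The core difficulty can be shown with the two-cell equivalent circuit: solving the node equations for assumed (hypothetical) values of G(j), R(s) and R(m) demonstrates how nonzero series resistance biases the apparent junctional conductance downward:

```python
def dual_clamp_currents(gj, rs1, rs2, rm1, rm2, vc1, vc2):
    """Solve the two-cell equivalent circuit (conductances in nS, resistances
    in GOhm, voltages in mV, currents in pA). Returns pipette currents (i1, i2)."""
    g1, g2 = 1.0 / rs1, 1.0 / rs2    # series conductances
    gm1, gm2 = 1.0 / rm1, 1.0 / rm2  # membrane conductances
    # Node equations: g1*(vc1 - v1) = gm1*v1 + gj*(v1 - v2), similarly cell 2.
    a11, a12 = g1 + gm1 + gj, -gj
    a21, a22 = -gj, g2 + gm2 + gj
    b1, b2 = g1 * vc1, g2 * vc2
    det = a11 * a22 - a12 * a21
    v1 = (b1 * a22 - b2 * a12) / det
    v2 = (a11 * b2 - a21 * b1) / det
    return g1 * (vc1 - v1), g2 * (vc2 - v2)

# Apparent Gj from a 10 mV step in cell 1 while cell 2 is held constant:
gj_true, rs, rm = 2.0, 0.02, 1.0  # 2 nS junction, 20 MOhm Rs, 1 GOhm Rm
i1a, i2a = dual_clamp_currents(gj_true, rs, rs, rm, rm, 0.0, 0.0)
i1b, i2b = dual_clamp_currents(gj_true, rs, rs, rm, rm, 10.0, 0.0)
gj_apparent = -(i2b - i2a) / 10.0
print(f"true Gj = {gj_true} nS, apparent Gj = {gj_apparent:.3f} nS")  # underestimate
```

The apparent value (about 1.78 nS here) falls below the true 2 nS because the series resistances attenuate the command voltages at the cells, which is exactly the error the Rs-compensation and mathematical-correction schemes in the paper address.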

  6. Measuring sporadic gastrointestinal illness associated with drinking water - an overview of methodologies.

    Science.gov (United States)

    Bylund, John; Toljander, Jonas; Lysén, Maria; Rasti, Niloofar; Engqvist, Jannes; Simonsson, Magnus

    2017-06-01

    There is an increasing awareness that drinking water contributes to sporadic gastrointestinal illness (GI) in high income countries of the northern hemisphere. A literature search was conducted in order to review: (1) methods used for investigating the effects of public drinking water on GI; (2) evidence of possible dose-response relationship between sporadic GI and drinking water consumption; and (3) association between sporadic GI and factors affecting drinking water quality. Seventy-four articles were selected, key findings and information gaps were identified. In-home intervention studies have only been conducted in areas using surface water sources and intervention studies in communities supplied by ground water are therefore needed. Community-wide intervention studies may constitute a cost-effective alternative to in-home intervention studies. Proxy data that correlate with GI in the community can be used for detecting changes in the incidence of GI. Proxy data can, however, not be used for measuring the prevalence of illness. Local conditions affecting water safety may vary greatly, making direct comparisons between studies difficult unless sufficient knowledge about these conditions is acquired. Drinking water in high-income countries contributes to endemic levels of GI and there are public health benefits for further improvements of drinking water safety.

  7. Measurability of quantum fields and the energy-time uncertainty relation

    International Nuclear Information System (INIS)

    Mensky, Mikhail B

    2011-01-01

    Quantum restrictions on the measurability of an electromagnetic field strength and their relevance to the energy-time uncertainty relation are considered. The minimum errors in measuring electromagnetic field strengths, as they were estimated by the author (1988) in the framework of the phenomenological method of restricted path integral (RPI), are compared with the analogous estimates found by Landau and Peierls (1931) and by Bohr and Rosenfeld (1933) with the help of certain measurement setups. RPI-based restrictions, including those of Landau and Peierls as a special case, hold for any measuring schemes meeting the strict definition of measurement. Their fundamental nature is confirmed by the fact that their associated field detectability condition has the form of the energy-time uncertainty relation. The weaker restrictions suggested by Bohr and Rosenfeld rely on an extended definition of measurement. The energy-time uncertainty relation, which is the condition for the electromagnetic field to be detectable, is applied to the analysis of how the near-field scanning microscope works. (methodological notes)

  8. [Contemporary possibilities of intraocular pressure measurement].

    Science.gov (United States)

    Hornová, J; Baxant, A

    2013-10-01

    The authors review current possibilities for measuring intraocular pressure (IOP). Available methods of monitoring IOP are surveyed: contact measurement directly on the cornea as well as through the upper lid, minimal-contact methods and non-contact methods. The contact methods described include the earlier impression tonometry (Schiotz) and the current applanation techniques. Goldmann applanation tonometry (GAT) is considered the gold standard, and other applanation methods are compared against it: the Pascal dynamic contour tonometer (DCT), the BioResonator applanation resonance tonometer (ART), the Tonopen digital applanation tonometer and, most recently, continuous IOP measurement with the Sensimed Triggerfish. Palpation through the lid and measurement with the Diaton tonometer allow quick, orientational assessment. The iCare rebound tonometer (RBT) is a minimal-contact method requiring neither anesthetic drops nor fluorescein; a self-measurement home version (Icare ONE) has therefore been developed. Non-contact IOP measurement with various pneumotonometers is popular for screening. The Reichert Ocular Response Analyzer (ORA) is a non-contact applanation method that additionally reveals properties of the cornea. The discussion evaluates the range of methods and compares the experience of other authors with the authors' own. For monitoring patients it is necessary to select the most suitable method and to measure repeatedly and accurately, so as to allow long-term monitoring of intraocular pressure.

  9. Evaluation of constraint methodologies applied to a shallow-flaw cruciform bend specimen tested under biaxial loading conditions

    International Nuclear Information System (INIS)

    Bass, B.R.; McAfee, W.J.; Williams, P.T.; Pennell, W.E.

    1998-01-01

    A technology to determine the shallow-flaw fracture toughness of reactor pressure vessel (RPV) steels is being developed for application to the safety assessment of RPVs containing postulated shallow surface flaws. Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading and specimen size on the fracture initiation toughness of two-dimensional (constant-depth) shallow surface flaws. The cruciform beam specimens were developed at Oak Ridge National Laboratory (ORNL) to introduce a prototypic, far-field, out-of-plane biaxial stress component in the test section that approximates the nonlinear stresses resulting from pressurized-thermal-shock or pressure-temperature loading of an RPV. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for RPV materials. The cruciform fracture toughness data were used to evaluate fracture methodologies for predicting the observed effects of biaxial loading on shallow-flaw fracture toughness. Initial emphasis was placed on the assessment of stress-based methodologies, namely, the J-Q formulation, the Dodds-Anderson toughness scaling model and the Weibull approach. Applications of these methodologies based on the hydrostatic stress fracture criterion indicated an effect of loading biaxiality on fracture toughness; the conventional maximum principal stress criterion indicated no effect

  10. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    Science.gov (United States)

    Scott, Elaine P.

    1996-01-01

    A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. A thermal stress analysis therefore requires knowledge of the temperature distributions within the structures, which in turn necessitates accurate knowledge of the thermal properties, boundary conditions and thermal interface conditions associated with the structural materials. The goal of this multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify them to analyze complex structures; this can be thought of as a building-block approach. The strategy was intended to promote maximum usability of the resulting estimation procedures by NASA-LaRC researchers through the design of in-house experimentation procedures and through the use of an existing general-purpose finite element software package.
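Thermal-property estimation of the kind described often reduces to fitting a model to measured temperature histories. A minimal sketch (a lumped-capacitance cooling fit on synthetic data, far simpler than the finite-element procedures mentioned above): the time constant tau is recovered by a log-linear least-squares fit of ln(T - T_inf) against time.

```python
import math

def estimate_time_constant(times, temps, t_inf):
    """Estimate the lumped-capacitance time constant tau from cooling data,
    T(t) = T_inf + (T0 - T_inf) * exp(-t / tau), via log-linear regression."""
    ys = [math.log(T - t_inf) for T in temps]
    n = len(times)
    sx, sy = sum(times), sum(ys)
    sxx = sum(x * x for x in times)
    sxy = sum(x * y for x, y in zip(times, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return -1.0 / slope

# Synthetic cooling curve: T_inf = 20 C, T0 = 100 C, tau = 50 s
tau_true = 50.0
times = [0.0, 10.0, 20.0, 40.0, 80.0]
temps = [20.0 + 80.0 * math.exp(-t / tau_true) for t in times]
print(f"estimated tau = {estimate_time_constant(times, temps, 20.0):.1f} s")  # 50.0 s
```

With noisy measurements the same fit returns a least-squares estimate, and the choice of sampling times is exactly the kind of optimal-experimental-design question the research effort addresses.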

  11. Improving process methodology for measuring plutonium burden in human urine using fission track analysis

    International Nuclear Information System (INIS)

    Krahenbuhl, M.P.; Slaughter, D.M.

    1998-01-01

    The aim of this paper is to clearly define the chemical and nuclear principles governing Fission Track Analysis (FTA) used to determine environmental levels of 239Pu in urine. The paper also addresses deficiencies in FTA methodology and introduces improvements to make FTA a more reliable research tool. Our refined methodology, described herein, includes a chemically induced precipitation phase followed by anion exchange chromatography, and employs a chemical tracer, 236Pu. We have been able to establish an inverse correlation between Pu recovery and sample volume, and our data confirm that increases in sample volume do not result in higher accuracy or lower detection limits. We conclude that in subsequent studies samples should be limited to approximately two liters. The Pu detection limit for a sample of this volume is 2.8 μBq/l. (author)
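
    The reported inverse correlation explains why larger samples do not help: if recovery falls roughly as 1/V, the recovered activity is independent of volume. A minimal sketch of that reasoning (the 1/V recovery law and the constant k are illustrative assumptions, not values from the paper):

```python
def recovered_pu(volume_l, conc_uBq_per_l, k=1.0):
    """Recovered 239Pu activity (uBq) under the assumed empirical trend
    that chemical recovery is inversely proportional to sample volume,
    R(V) = k / V, capped at 100 %. k is a hypothetical fit constant."""
    recovery = min(1.0, k / volume_l)
    return conc_uBq_per_l * volume_l * recovery

# Doubling the sample volume does not increase the recovered activity:
print(recovered_pu(2.0, 2.8))  # two-litre sample
print(recovered_pu(4.0, 2.8))  # four litres, same recovered activity
```

    Under this assumption the counted tracks, and hence the detection limit, stop improving with volume, which motivates the two-liter recommendation above.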

  12. Methodological spot of establishing silt deposit concentration in Serbian rivers

    Directory of Open Access Journals (Sweden)

    Dragićević Slavoljub

    2007-01-01

    Full Text Available The current methodology for sampling and establishing silt deposit concentration in Serbian rivers suffers from numerous deficiencies. Daily concentrations of this type of river deposit at most hydrological gauges were obtained from only one measurement, which raises the question of the representativeness of those samples. Taking deposit samples at a single point on the profile is problematic because of the dispersion of the obtained results. The choice of sampling location is also an important question. These shortcomings may lead to serious errors in calculating the total transported deposit. For the above reasons, we carried out precise measurements of silt deposit concentration and identified the methodological errors in the measurements. The results of these measurements are analyzed and presented in this paper.

  13. Measurement and modeling the coefficient of restitution of char particles under simulated entrained flow gasifier conditions

    Science.gov (United States)

    Gibson, LaTosha M.

    predict the coefficient of restitution (COR), which is the ratio of the rebound velocity to the impacting velocity (a necessary boundary condition for Discrete Phase Models). However, particle-wall impact models do not account for the actual geometries of char particles or for the motion of char particles under gasifier operating conditions. This work attempts to include the surface geometry and rotation of the particles. To meet the objectives of this work, the general methodology involved (1) determining the likelihood of a particle becoming entrapped, (2) assessing the limitations of particle-wall impact models for the COR through cold-flow experiments in order to adapt them to the non-ideal conditions (surface and particle geometry) within a gasifier, (3) determining how to account for the influence of the carbon and ash composition on the sticking probability of size fractions and specific gravities within a PSD and within the scope of particle-wall impact models, and (4) using a methodology that quantifies the sticking probability (albeit a criterion or parameter) to predict the partitioning of a PSD into slag and flyash based on the proximate analysis. In this study, sensitivity analysis ruled out the scenario of a particle becoming entrapped within the slag layer. Cold-flow eductor experiments were performed to measure the COR. Results showed a variation in the coefficient of restitution as a function of rebound angle due to rotation of the particles induced by the eductor prior to impact. The particles were then simply dropped in "drop" experiments (without the eductor) to determine the influence of sphericity on particle rotation and therefore on the coefficient of restitution. The results showed that, in addition to surface irregularities, the particle shape and the orientation of the particle prior to impacting the target surface contributed to this variation of the coefficient of restitution as a function of rebound angle. Oblique…
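
    The COR definition used above, and the normal-component variant common in particle-wall impact models for oblique impacts, can be sketched as follows (the velocities are illustrative numbers, not data from this work):

```python
import math

def coefficient_of_restitution(v_impact, v_rebound):
    """Overall COR: rebound speed divided by impact speed, for 2-D
    velocity vectors (tangential, normal)."""
    return math.hypot(*v_rebound) / math.hypot(*v_impact)

def normal_cor(v_impact_n, v_rebound_n):
    """Normal-component COR, as commonly used for oblique impacts."""
    return abs(v_rebound_n) / abs(v_impact_n)

# A particle impacting at 3 m/s tangential / 4 m/s normal and rebounding
# at 3 m/s tangential / 2 m/s normal (hypothetical values):
print(round(coefficient_of_restitution((3.0, -4.0), (3.0, 2.0)), 3))  # → 0.721
print(normal_cor(-4.0, 2.0))  # → 0.5
```

    Particle rotation and surface irregularity make the measured normal COR scatter with rebound angle, which is the effect the experiments above quantify.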

  14. Implied Volatility Surface: Construction Methodologies and Characteristics

    OpenAIRE

    Cristian Homescu

    2011-01-01

    The implied volatility surface (IVS) is a fundamental building block in computational finance. We provide a survey of methodologies for constructing such surfaces. We also discuss various topics which can influence the successful construction of IVS in practice: arbitrage-free conditions in both strike and time, how to perform extrapolation outside the core region, choice of calibrating functional and selection of numerical optimization algorithms, volatility surface dynamics and asymptotics.
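
    One of the arbitrage-free conditions in time mentioned above is the calendar-spread condition: at fixed strike (with zero rates and dividends assumed, so fixed strike coincides with fixed moneyness), total implied variance w = σ²T must be non-decreasing in maturity. A minimal illustrative check, with a hypothetical mini-surface:

```python
def total_variance(vol, t):
    """Total implied variance w = sigma^2 * T."""
    return vol * vol * t

def calendar_arbitrage_free(surface):
    """Check the no-calendar-arbitrage condition: at each strike, total
    implied variance must be non-decreasing in maturity.
    `surface` maps strike -> list of (maturity, implied_vol) pairs."""
    for slice_ in surface.values():
        slice_ = sorted(slice_)  # order the slice by maturity
        ws = [total_variance(vol, t) for t, vol in slice_]
        if any(later < earlier for earlier, later in zip(ws, ws[1:])):
            return False
    return True

# Implied vols fall with maturity, yet total variance still rises:
surface = {100.0: [(0.25, 0.22), (0.50, 0.20), (1.00, 0.19)]}
print(calendar_arbitrage_free(surface))  # → True

# Here total variance decreases between maturities: calendar arbitrage.
bad = {100.0: [(0.25, 0.30), (1.00, 0.10)]}
print(calendar_arbitrage_free(bad))  # → False
```

    Static arbitrage-freeness also requires convexity of call prices in strike (no butterfly arbitrage); a full surface construction enforces both.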

  15. Methodology for plastic fracture - a progress report

    International Nuclear Information System (INIS)

    Wilkinson, J.P.D.; Smith, R.E.E.

    1977-01-01

    This paper describes the progress of a study to develop a methodology for plastic fracture. Such a fracture mechanics methodology, having application in the plastic region, is required to assess the margin of safety inherent in nuclear reactor pressure vessels. The initiation and growth of flaws in pressure vessels under overload conditions is distinguished by a number of unique features, such as large scale yielding, three-dimensional structural and flaw configurations, and failure instabilities that may be controlled by either toughness or plastic flow. In order to develop a broadly applicable methodology of plastic fracture, these features require the following analytical and experimental studies: development of criteria for crack initiation and growth under large scale yielding; the use of the finite element method to describe elastic-plastic behaviour of both the structure and the crack tip region; and extensive experimental studies on laboratory scale and large scale specimens, which attempt to reproduce the pertinent plastic flow and crack growth phenomena. This discussion centers on progress to date on the selection, through analysis and laboratory experiments, of viable criteria for crack initiation and growth during plastic fracture. (Auth.)

  16. Advantages of Westinghouse BWR control rod drop accidents methodology utilizing integrated POLCA-T code

    International Nuclear Information System (INIS)

    Panayotov, Dobromir

    2008-01-01

    The paper focuses on the activities pursued by Westinghouse in the development and licensing of the POLCA-T code Control Rod Drop Accident (CRDA) methodology. The comprehensive CRDA methodology, which utilizes the PHOENIX4/POLCA7/POLCA-T calculation chain, provides for a complete cycle-specific analysis. The methodology consists of identifying the candidate control rods (CRs) that could cause a significant reactivity excursion if dropped at any point in the fuel cycle, selecting the limiting initial conditions for the CRDA transient simulation, and the transient simulation itself. The Westinghouse methodology utilizes state-of-the-art methods. Unnecessary conservatisms in the methodology have been avoided to allow accurate prediction of the margin to the design bases. This is mainly achieved by using the POLCA-T code for dynamic CRDA evaluations. The code belongs to the same calculation chain that is used for core design, so the very same reactor, core, cycle, and fuel database is used. This also reduces the uncertainties of the input data and parameters that determine the energy deposition in the fuel. Uncertainty treatment, very selective use of conservatisms, selection of the initial conditions for the limiting-case analyses, and incorporation of the licensed fuel performance code models into the POLCA-T code are also among the means of performing realistic CRDA transient analyses. (author)

  17. Optimisation of the measurement protocols of 129I and 129I/127I. Methodology establishment for the measurement in environmental matrices

    International Nuclear Information System (INIS)

    Frechou, C.

    2000-01-01

    129I is a natural long-lived isotope with a half-life of 15.7 million years, also produced artificially in nuclear power plants. It is released in the liquid and gaseous effluents of nuclear fuel reprocessing plants. 129I is integrated in all biological compartments at different activity levels, depending on their distance from the emission source and their ability to metabolize iodine. The performance of the different 129I and 129I/127I measurement techniques available (Radiochemical Neutron Activation Analysis, Accelerator Mass Spectrometry, direct γ-X spectrometry and liquid scintillation) was evaluated. The associated radiochemical preparation steps of the first two techniques were optimized and adapted to the characteristics of the major environmental matrices. In a first step, the radiochemical protocols were developed and validated. In a second step, intercomparison exercises were carried out on various environmental samples presenting different 129I activity levels. They showed good agreement between the results given by the three techniques on different environmental matrices with activities between 0.2 and 200 Bq·kg⁻¹ dry weight. As a conclusion, a methodology for the measurement of 129I and the 129I/127I ratio in environmental samples is proposed. It includes a decision diagram taking into account the characteristics of the matrices, the detection limits and the response time. A study of the losses of 129I during the calcination of an alga was carried out by direct γ-X spectrometry, and application studies were made to measure 129I levels in different biological compartments from various locations: interspecific variation of 129I activity in different species of seaweed from the French Channel coast under the relative influence of La Hague, 129I levels in bovine thyroids from the Cotentin area, and 129I in vegetal samples collected around the nuclear reprocessing plant of Marcoule. (author)

  18. Just Research in Contentious Times: Widening the Methodological Imagination

    Science.gov (United States)

    Fine, Michelle

    2017-01-01

    In this intensely powerful and personal new text, Michelle Fine widens the methodological imagination for students, educators, scholars, and researchers interested in crafting research with communities. Fine shares her struggles over the course of 30 years to translate research into policy and practice that can enhance the human condition and…

  19. A Clustering Methodology of Web Log Data for Learning Management Systems

    Science.gov (United States)

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…
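
    A first-order Markov model of page-to-page transitions, of the kind the abstract alludes to, can be estimated directly from logged click sessions. A minimal sketch (the session data and page names are hypothetical, not from the paper):

```python
from collections import defaultdict

def transition_matrix(sessions):
    """Estimate first-order Markov transition probabilities between
    course pages from click sessions (each session is a list of page ids)."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for a, b in zip(session, session[1:]):  # consecutive page pairs
            counts[a][b] += 1
    probs = {}
    for page, row in counts.items():
        total = sum(row.values())
        probs[page] = {nxt: n / total for nxt, n in row.items()}
    return probs

# Hypothetical sessions parsed from an LMS web log:
sessions = [
    ["home", "course", "quiz"],
    ["home", "course", "forum"],
    ["home", "forum"],
]
probs = transition_matrix(sessions)
print(probs["home"])    # course ≈ 0.67, forum ≈ 0.33
print(probs["course"])  # quiz 0.5, forum 0.5
```

    Rows of such a matrix (or other per-session features) can then be fed to a clustering algorithm to group courses or students by activity pattern.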

  20. Laboratory Test Methods to Determine the Degradation of Plastics in Marine Environmental Conditions

    OpenAIRE

    Tosin, Maurizio; Weber, Miriam; Siotto, Michela; Lott, Christian; Degli Innocenti, Francesco

    2012-01-01

    In this technology report, three test methods were developed to characterize the degradation of plastic in the marine environment. The aim was to outline a test methodology to measure the physical and biological degradation in the different habitats where plastic waste can accumulate when littered in the sea. Previous research has focused mainly on the conditions encountered by plastic items floating in sea water (the pelagic domain). However, this is just one of the possible habitats that plast...